2 MLE of a Gaussian AR(p) Process

This section discusses a Gaussian AR(p) process,

$$Y_t = c + \phi_1 Y_{t-1} + \phi_2 Y_{t-2} + \dots + \phi_p Y_{t-p} + \varepsilon_t,$$

with $\varepsilon_t \sim \text{i.i.d. } N(0, \sigma^2)$. In this case, the vector of population parameters to be estimated is $\theta = (c, \phi_1, \phi_2, \dots, \phi_p, \sigma^2)'$.

2.1 Evaluating the Likelihood Function

We first collect the first $p$ observations in the sample $(Y_1, Y_2, \dots, Y_p)$ in a $(p \times 1)$ vector $\mathbf{y}_p$, which has mean vector $\boldsymbol{\mu}_p$ with each element

$$\mu = \frac{c}{1 - \phi_1 - \phi_2 - \dots - \phi_p},$$

and variance-covariance matrix given by

$$\sigma^2 \mathbf{V}_p = \begin{bmatrix}
\gamma_0 & \gamma_1 & \gamma_2 & \dots & \gamma_{p-1} \\
\gamma_1 & \gamma_0 & \gamma_1 & \dots & \gamma_{p-2} \\
\gamma_2 & \gamma_1 & \gamma_0 & \dots & \gamma_{p-3} \\
\vdots & \vdots & \vdots & \ddots & \vdots \\
\gamma_{p-1} & \gamma_{p-2} & \gamma_{p-3} & \dots & \gamma_0
\end{bmatrix}.$$

The density of the first $p$ observations is then

$$f_{Y_p, Y_{p-1}, \dots, Y_1}(y_p, y_{p-1}, \dots, y_1; \theta)
= (2\pi)^{-p/2} \, |\sigma^{-2}\mathbf{V}_p^{-1}|^{1/2}
\exp\!\left[-\frac{1}{2\sigma^2}(\mathbf{y}_p - \boldsymbol{\mu}_p)'\mathbf{V}_p^{-1}(\mathbf{y}_p - \boldsymbol{\mu}_p)\right]$$
$$= (2\pi)^{-p/2} \, (\sigma^{-2})^{p/2} \, |\mathbf{V}_p^{-1}|^{1/2}
\exp\!\left[-\frac{1}{2\sigma^2}(\mathbf{y}_p - \boldsymbol{\mu}_p)'\mathbf{V}_p^{-1}(\mathbf{y}_p - \boldsymbol{\mu}_p)\right].$$

For the remaining observations in the sample $(Y_{p+1}, Y_{p+2}, \dots, Y_T)$, conditional on the first $t-1$ observations, the $t$th observation is Gaussian with mean

$$c + \phi_1 y_{t-1} + \phi_2 y_{t-2} + \dots + \phi_p y_{t-p}.$$
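The unconditional density above can be evaluated numerically. Below is a minimal sketch in Python/NumPy; the helper name `ar_p_first_block_loglik` is hypothetical, and the covariance $\sigma^2\mathbf{V}_p$ is obtained by writing the AR(p) in companion form $\boldsymbol{\xi}_t = \mathbf{F}\boldsymbol{\xi}_{t-1} + \mathbf{v}_t$ and solving the discrete Lyapunov equation $\mathbf{S} = \mathbf{F}\mathbf{S}\mathbf{F}' + \mathbf{Q}$ via vectorization, rather than by stacking the autocovariances $\gamma_j$ directly.

```python
import numpy as np

def ar_p_first_block_loglik(y, c, phi, sigma2):
    """Exact log-density of the first p observations of a Gaussian AR(p).

    Hypothetical helper: evaluates log f(y_p, ..., y_1; theta) from the text.
    y : sequence with at least p observations; phi : (phi_1, ..., phi_p).
    """
    phi = np.asarray(phi, dtype=float)
    p = phi.size
    yp = np.asarray(y[:p], dtype=float)

    # Each element of the mean vector mu_p is c / (1 - phi_1 - ... - phi_p).
    mu = c / (1.0 - phi.sum())

    # Companion matrix F of the AR(p) process.
    F = np.zeros((p, p))
    F[0, :] = phi
    if p > 1:
        F[1:, :-1] = np.eye(p - 1)

    # sigma^2 * V_p solves S = F S F' + Q with Q = sigma2 * e1 e1'.
    # Vectorizing: vec(S) = (I - F kron F)^{-1} vec(Q).
    Q = np.zeros((p, p))
    Q[0, 0] = sigma2
    S = np.linalg.solve(np.eye(p * p) - np.kron(F, F), Q.ravel()).reshape(p, p)

    # Gaussian log-density: -(p/2) log 2pi - (1/2) log|S| - (1/2) dev' S^{-1} dev.
    dev = yp - mu
    _, logdet = np.linalg.slogdet(S)
    quad = dev @ np.linalg.solve(S, dev)
    return -0.5 * (p * np.log(2.0 * np.pi) + logdet + quad)
```

For p = 1 this reduces to a univariate normal with mean $c/(1-\phi_1)$ and variance $\sigma^2/(1-\phi_1^2)$, which provides a quick sanity check of the Lyapunov-equation route.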