
The University of Hong Kong: Econometrics (English edition), Chapter 12: Time Series Analysis



Chapter 12 Time Series Analysis

12.1 Stochastic processes

A stochastic process is a family of random variables {X_t, t ∈ T}.

Example 1 {S_t, t = 0, 1, 2, ...} where S_t = Σ_{i=0}^t X_i and X_i ∼ iid(0, σ²). S_t has a different distribution at each point t.

12.2 Stationarity and strict stationarity

If {X_t, t ∈ T} is a stochastic process such that Var(X_t) < ∞ for each t ∈ T, the autocovariance function γ_x of X_t is defined by

    γ_x(r, s) = Cov(X_r, X_s) = E[(X_r − EX_r)(X_s − EX_s)].

Because Var(X_t) < ∞ for each t ∈ T,

    γ_x²(r, s) ≤ E[(X_r − EX_r)²] · E[(X_s − EX_s)²]

by the Cauchy-Schwarz inequality. The autocorrelation function ρ_x(r, s) is defined by

    ρ_x(r, s) = γ_x(r, s) / [γ_x(r, r) γ_x(s, s)]^(1/2).

Example 2 Let X_t = e_t + θe_{t−1}, e_t ∼ iid(0, σ²). Then

    γ_x(t + h, t) = Cov(X_{t+h}, X_t) = σ²(1 + θ²) if h = 0,  σ²θ if h = ±1,  0 if |h| > 1,

    ρ_x(t + h, t) = 1 if h = 0,  θ/(1 + θ²) if h = ±1,  0 if |h| > 1.

The time series {X_t, t ∈ Z} with index set Z = {0, ±1, ±2, ...} is said to be (weakly) stationary if

1. E|X_t|² < ∞ for all t ∈ Z
2. EX_t = m for all t ∈ Z
3. γ_x(r, s) = γ_x(r + t, s + t) for all r, s, t ∈ Z.
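As a quick numerical check (not part of the original notes), the lag-1 autocorrelation θ/(1 + θ²) of the MA(1) process X_t = e_t + θe_{t−1} can be verified by simulation; θ = 0.5 and the sample size are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)
theta, sigma, T = 0.5, 1.0, 200_000

# simulate X_t = e_t + theta * e_{t-1} with e_t ~ iid(0, sigma^2)
e = rng.normal(0.0, sigma, T + 1)
x = e[1:] + theta * e[:-1]

# the sample lag-1 autocorrelation should be near theta / (1 + theta^2) = 0.4
rho1_hat = np.corrcoef(x[1:], x[:-1])[0, 1]
print(rho1_hat)
```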


Remark 1 If {X_t, t ∈ Z} is stationary, then γ_x(r, s) = γ_x(r − s, s − s) = γ_x(r − s, 0). Hence, we may define the autocovariance function of a stationary process as a function of just one variable, the difference of the two time indices. That is, instead of γ_x(r, s), we may write γ_x(r − s) = γ_x(h). To be more precise, γ_x(h) = Cov(X_{t+h}, X_t). In the same way, ρ_x(h) = γ_x(h)/γ_x(0).

Example 3 X_t = e_t + θe_{t−1}, e_t ∼ iid(0, σ²). X_t is stationary.

Example 4 X_t = X_{t−1} + e_t, e_t ∼ iid(0, σ²). Then X_t = Σ_{i=1}^t e_i + X_0. X_t is not stationary, since Var(X_t) = tσ² (assume X_0 = 0).

Example 5 X_t ∼ N(0, σ_t²). X_t is not stationary.

The time series {X_t, t ∈ Z} is said to be strictly stationary if the joint distributions of (X_{t1}, ..., X_{tk})′ and (X_{t1+h}, ..., X_{tk+h})′ are the same for all positive integers k and for all t1, ..., tk, h ∈ Z.

12.3 Autoregressive processes

    y_t = α1 y_{t−1} + ... + αp y_{t−p} + e_t :  AR(p) process,

where E e_t = 0 and E e_t e_s = σ² if t = s, 0 if t ≠ s.
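The non-stationarity in Example 4 can be illustrated by simulating many random-walk paths and checking that the cross-sectional variance grows like tσ². This sketch (path count and horizon chosen arbitrarily) is an illustration, not part of the notes:

```python
import numpy as np

rng = np.random.default_rng(1)
sigma, T, n_paths = 1.0, 400, 5_000

# Example 4 with X_0 = 0: X_t = sum_{i=1}^t e_i, so Var(X_t) = t * sigma^2
e = rng.normal(0.0, sigma, (n_paths, T))
x = np.cumsum(e, axis=1)            # one random-walk path per row

var_hat = x.var(axis=0)             # cross-sectional variance at each t
print(var_hat[99], var_hat[399])    # roughly 100 and 400
```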


Or, using the lag operator (L^p y_t = y_{t−p}), we may write

    (1 − α1 L − ... − αp L^p) y_t = e_t.

Asymptotic theory of the AR(1) model y_t = α y_{t−1} + e_t, |α| < 1:

    α̂_OLS = Σ_{t=2}^T y_{t−1} y_t / Σ_{t=2}^T y_{t−1}².

We have

1. α̂_OLS →p α
2. √T (α̂ − α) →d N(0, 1 − α²).

Proof.

1. α̂ − α = Σ y_{t−1} e_t / Σ y_{t−1}². We may express y_{t−1} = Σ_{i=0}^∞ α^i e_{t−1−i}. Then, using Chebyshev's inequality, we may obtain

    Σ y_{t−1} e_t / T →p 0  and  Σ y_{t−1}² / T →p σ²/(1 − α²).

2. By the central limit theorem,

    (1/√T) Σ y_{t−1} e_t →d N(0, σ² · plim Σ y_{t−1}²/T) = N(0, σ⁴/(1 − α²)).

Hence √T (α̂ − α) →d N(0, 1 − α²).
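The consistency of α̂_OLS in the stationary case can be checked with a short simulation; this is a sketch outside the original notes, with α = 0.6 as an arbitrary illustrative value:

```python
import numpy as np

rng = np.random.default_rng(2)
alpha, T = 0.6, 100_000             # illustrative value with |alpha| < 1

# simulate y_t = alpha * y_{t-1} + e_t
e = rng.normal(size=T)
y = np.zeros(T)
for t in range(1, T):
    y[t] = alpha * y[t - 1] + e[t]

# OLS: alpha_hat = sum y_{t-1} y_t / sum y_{t-1}^2
alpha_hat = (y[:-1] @ y[1:]) / (y[:-1] @ y[:-1])
print(alpha_hat)                    # close to 0.6
```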


When α = 1, we have

1. α̂ →p 1. But
2. √T (α̂ − 1) does not converge to N(0, 1 − α²). In fact, T(α̂ − 1) →d a non-normal random variable. (See Fuller (1976), "Introduction to Statistical Time Series".) As a result, the t-statistic for H0 : α = 1,

    t = (α̂ − 1) / [σ̂² (Σ y_{t−1}²)^{−1}]^(1/2),

does not converge to N(0, 1).

Least squares estimation of AR(p) processes: for y_t = α1 y_{t−1} + ... + αp y_{t−p} + e_t, t = p + 1, ..., T, the observations satisfy

    y_{p+1} = α1 y_p + ... + αp y_1 + e_{p+1}
    ...
    y_T = α1 y_{T−1} + ... + αp y_{T−p} + e_T,

or y = Xα + e, where

    y = (y_{p+1}, ..., y_T)′,  α = (α1, ..., αp)′,  e = (e_{p+1}, ..., e_T)′,

and X is the (T − p) × p matrix

    X = [ y_p     ...  y_1
          ⋮             ⋮
          y_{T−1} ...  y_{T−p} ].

Then,

    α̂_OLS = (X′X)^{−1} X′y,  σ̂² = (y − Xα̂)′(y − Xα̂) / (T − p).

We can show that

1. α̂ →p α
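The y = Xα + e construction can be sketched in code: stack the p lagged values into the rows of X, then apply the least squares formulas. The AR(2) coefficients below are illustrative values, not from the notes:

```python
import numpy as np

rng = np.random.default_rng(3)
T, p = 50_000, 2
a1, a2 = 0.5, -0.3      # illustrative stationary AR(2) coefficients (assumed)

# simulate y_t = a1*y_{t-1} + a2*y_{t-2} + e_t
e = rng.normal(size=T)
y = np.zeros(T)
for t in range(2, T):
    y[t] = a1 * y[t - 1] + a2 * y[t - 2] + e[t]

# stack lagged regressors: row for time t is (y_{t-1}, ..., y_{t-p})
Y = y[p:]                                          # (y_{p+1}, ..., y_T)'
X = np.column_stack([y[p - 1 - j : T - 1 - j] for j in range(p)])

alpha_hat = np.linalg.solve(X.T @ X, X.T @ Y)      # (X'X)^{-1} X'y
resid = Y - X @ alpha_hat
sigma2_hat = resid @ resid / (T - p)
print(alpha_hat, sigma2_hat)                       # near [0.5, -0.3] and 1.0
```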


2. √T (α̂ − α) →d N(0, Σ), where

    Σ = σ² [ γ_0      γ_1   ...  γ_{p−1}
             γ_1      γ_0   ...  γ_{p−2}
             ⋮                   ⋮
             γ_{p−1}  ...        γ_0     ]^{−1},  γ_h = E y_{t+h} y_t,

if the process y_t is stationary. An equivalent way to state that y_t is stationary is that all roots of the characteristic equation

    1 − α1 Z − α2 Z² − ... − αp Z^p = 0

lie outside the unit circle.

Example 6 Consider the AR(2) process y_t − y_{t−1} + 0.16 y_{t−2} = e_t. The characteristic equation for this is 1 − Z + 0.16Z² = (1 − 0.8Z)(1 − 0.2Z) = 0, which gives Z = 1/0.8, 1/0.2. Hence, y_t is stationary. We may also express (1 − 0.8L)(1 − 0.2L) y_t = e_t, which gives

    (1 − 0.8L) y_t = Σ_{i=0}^∞ (0.2)^i e_{t−i} = u_t  and  y_t = Σ_{i=0}^∞ (0.8)^i u_{t−i}.

(The impact of events that happened long ago is negligible.)
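The roots in Example 6 can be checked numerically (a sketch, not part of the notes; note that np.roots expects coefficients ordered from the highest power down):

```python
import numpy as np

# Example 6: characteristic equation 1 - Z + 0.16 Z^2 = 0,
# i.e. polynomial 0.16 Z^2 - Z + 1 in highest-power-first order
roots = np.roots([0.16, -1.0, 1.0])
print(sorted(abs(roots)))  # [1.25, 5.0], both outside the unit circle
```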


Example 7 Consider the AR(2) process y_t − 1.2 y_{t−1} + 0.2 y_{t−2} = e_t. Here 1 − 1.2Z + 0.2Z² = (1 − Z)(1 − 0.2Z) = 0 gives Z = 1, 1/0.2. Hence, y_t is not stationary. As a matter of fact,

    y_t − y_{t−1} = (1 − 0.2L)^{−1} e_t = (1 + 0.2L + (0.2)²L² + ...) e_t = Σ_{i=0}^∞ (0.2)^i e_{t−i} = u_t

and y_t = Σ_{i=1}^t u_i + y_0.

12.4 Moving average processes

    y_t = Σ_{j=−M}^{M} θ_j e_{t−j},  where e_t ∼ iid(0, σ²)  (finite-order MA processes).

Example 8 y_t = e_t + θ1 e_{t−1} + θ2 e_{t−2} ∼ MA(2).

Consider an MA(q) process

    y_t = e_t + θ1 e_{t−1} + ... + θq e_{t−q} = (1 + θ1 L + ... + θq L^q) e_t.
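A simulation sketch of an MA(2) process as in Example 8 (coefficient values are arbitrary, not from the notes) illustrates the defining feature of an MA(q) process: autocorrelations vanish beyond lag q:

```python
import numpy as np

rng = np.random.default_rng(6)
t1, t2, T = 0.4, 0.3, 200_000       # illustrative MA(2) coefficients

# y_t = e_t + t1*e_{t-1} + t2*e_{t-2}
e = rng.normal(size=T + 2)
y = e[2:] + t1 * e[1:-1] + t2 * e[:-2]

# sample autocorrelations: nonzero at lags 1 and 2, near zero beyond lag q = 2
rho = [np.corrcoef(y[h:], y[:T - h])[0, 1] for h in (1, 2, 3)]
print(rho)
```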


If all roots of the equation 1 + θ1 Z + ... + θq Z^q = 0 lie outside the unit circle, the MA process can be written as an infinite-order AR process, such that

    e_t = Σ_{j=0}^∞ ψ_j y_{t−j}  with  Σ_{j=0}^∞ |ψ_j| < ∞.

When e_t can be expressed in this way, we say the MA process is invertible.

Example 9 y_t = e_t + e_{t−1}. The root of the equation 1 + Z = 0 is Z = −1, which does not lie outside the unit circle. Hence, this MA process is not invertible. In fact, ψ_j = (−1)^j and Σ|ψ_j| = ∞.

Example 10 y_t = e_t − 0.9 e_{t−1}. Then

    e_t = (1 − 0.9L)^{−1} y_t = (1 + 0.9L + 0.9²L² + ...) y_t = Σ_{j=0}^∞ ψ_j y_{t−j},

where ψ_j = 0.9^j and Σ_{j=0}^∞ |ψ_j| = 1/(1 − 0.9) < ∞. Hence, this MA process is invertible.
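As a numerical check (not in the notes), the invertible MA(1) y_t = e_t − 0.9 e_{t−1} can be inverted in practice by truncating the AR(∞) expansion e_t = Σ 0.9^j y_{t−j} at J lags, leaving a truncation error of order 0.9^(J+1):

```python
import numpy as np

rng = np.random.default_rng(4)
T, J = 10_000, 200

# invertible MA(1): y_t = e_t - 0.9 e_{t-1}
e = rng.normal(size=T + 1)
y = e[1:] - 0.9 * e[:-1]            # y[k] pairs with shock e[k + 1]

# truncated AR(infinity) inversion: e_t ~ sum_{j=0}^{J} 0.9^j y_{t-j}
t = T - 1
e_hat = sum(0.9 ** j * y[t - j] for j in range(J + 1))
print(abs(e_hat - e[t + 1]))        # tiny truncation error, order 0.9^(J+1)
```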


Asymptotic theory for the MA(1) model y_t = e_t + θ e_{t−1}, |θ| < 1 (invertible). The nonlinear least squares estimator of θ has the following properties:

1. θ̂ →p θ
2. √T (θ̂ − θ) →d N(0, 1 − θ²). (See Fuller (1976).)

Remark 2 Estimating the coefficients of MA processes is rather complicated, since the problem is nonlinear.

12.5 Autoregressive-moving average processes

    y_t = α1 y_{t−1} + ... + αp y_{t−p} + e_t + θ1 e_{t−1} + ... + θq e_{t−q} ∼ ARMA(p, q) model,  e_t ∼ iid(0, σ²).

If all roots of the equations

    a(Z) = 1 − α1 Z − α2 Z² − ... − αp Z^p = 0  and  b(Z) = 1 + θ1 Z + ... + θq Z^q = 0

lie outside the unit circle, {y_t} is stationary and invertible. We have

    √T (η̂_NLLS − η) →d N(0, V),

where η = (α1, ..., αp, θ1, ..., θq)′,

    V = σ² [ EU_1 U_1′  EU_1 V_1′
             EV_1 U_1′  EV_1 V_1′ ]^{−1},  a(L) U_t = e_t,  b(L) V_t = e_t.


Example 11 y_t = α1 y_{t−1} + e_t + θ1 e_{t−1}, e_t ∼ iid(0, σ²). Then

    √T (α̂1 − α1, θ̂1 − θ1)′ →d N(0, V),

where

    V = [ (1 − α1²)^{−1}    (1 + α1θ1)^{−1}
          (1 + α1θ1)^{−1}   (1 − θ1²)^{−1}  ]^{−1}.

Here a(L)U_t = e_t and b(L)V_t = e_t give U_t − α1 U_{t−1} = e_t and V_t + θ1 V_{t−1} = e_t, so

    EU_t² = σ²/(1 − α1²),  EU_t V_t = σ²/(1 + α1θ1),  EV_t² = σ²/(1 − θ1²),

and hence

    V = σ² [ σ²/(1 − α1²)   σ²/(1 + α1θ1)
             σ²/(1 + α1θ1)  σ²/(1 − θ1²)  ]^{−1} = [ (1 − α1²)^{−1}    (1 + α1θ1)^{−1}
                                                     (1 + α1θ1)^{−1}   (1 − θ1²)^{−1}  ]^{−1}.

The process {y_t} is said to be an ARIMA(p, d, q) process if (1 − L)^d y_t is a stationary ARMA(p, q) process.

Example 12 (1 − αL)(1 − L) y_t = e_t (|α| < 1). y_t is an ARIMA(1, 1, 0) process, i.e., differencing y_t once yields an ARMA(1, 0) process.

Remark 3 A slowly decaying positive sample autocorrelation suggests the appropriateness of an ARIMA model.

Remark 4 We need to difference an ARIMA(p, d, q) process d times in order to obtain a stationary process.

Remark 5 To see whether there is a unit root or not, we perform unit root tests. That is, we test H0 : α = 1 in the model y_t = α y_{t−1} + u_t, where {u_t} is an ARMA process. Under the null, differencing once yields an ARMA process.
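Example 12 can be illustrated by simulation (a sketch outside the notes; α = 0.5 is an arbitrary choice): generate a stationary AR(1) series, integrate it once to obtain an ARIMA(1, 1, 0) level series, then verify that differencing once recovers a series on which AR(1) OLS estimates α:

```python
import numpy as np

rng = np.random.default_rng(5)
alpha, T = 0.5, 100_000             # illustrative AR coefficient

# (1 - alpha L)(1 - L) y_t = e_t: build w_t = (1 - L) y_t as a stationary
# AR(1), then integrate once to get the ARIMA(1,1,0) level series y_t
e = rng.normal(size=T)
w = np.zeros(T)
for t in range(1, T):
    w[t] = alpha * w[t - 1] + e[t]
y = np.cumsum(w)

# differencing once recovers the stationary AR(1); OLS then estimates alpha
d = np.diff(y)                      # equals w_t for t >= 1
alpha_hat = (d[:-1] @ d[1:]) / (d[:-1] @ d[:-1])
print(alpha_hat)                    # close to 0.5
```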
