
National Sun Yat-sen University: Econometrics (English edition), Chapter 14 Stationary ARMA Process


Ch. 14 Stationary ARMA Process

This chapter describes a general linear stochastic model that supposes a time series to be generated by a linear aggregation of random shocks. For practical representation it is desirable to employ models that use parameters parsimoniously. Parsimony may often be achieved by representing the linear process in terms of a small number of autoregressive and moving average terms. This chapter introduces univariate ARMA processes, which provide a very useful class of models for describing the dynamics of an individual time series. Throughout this chapter we assume the time index set $T$ to be $T = \{..., -2, -1, 0, 1, 2, ...\}$.

1 Moving Average Process

1.1 The First-Order Moving Average Process

A stochastic process $\{Y_t, t \in T\}$ is said to be a first-order moving average process (MA(1)) if it can be expressed in the form

$$Y_t = \mu + \varepsilon_t + \theta \varepsilon_{t-1},$$

where $\mu$ and $\theta$ are constants and $\varepsilon_t$ is a white-noise process. Remember that a white-noise process $\{\varepsilon_t, t \in T\}$ satisfies $E(\varepsilon_t) = 0$ and

$$E(\varepsilon_t \varepsilon_s) = \begin{cases} \sigma^2 & \text{when } t = s \\ 0 & \text{when } t \neq s \end{cases}.$$

1.1.1 Check Stationarity

The expectation of $Y_t$ is given by

$$E(Y_t) = E(\mu + \varepsilon_t + \theta \varepsilon_{t-1}) = \mu + E(\varepsilon_t) + \theta E(\varepsilon_{t-1}) = \mu, \quad \text{for all } t \in T.$$

The variance of $Y_t$ is

$$E(Y_t - \mu)^2 = E(\varepsilon_t + \theta \varepsilon_{t-1})^2 = E(\varepsilon_t^2 + 2\theta \varepsilon_t \varepsilon_{t-1} + \theta^2 \varepsilon_{t-1}^2) = \sigma^2 + 0 + \theta^2 \sigma^2 = (1 + \theta^2)\sigma^2.$$
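As a quick numerical check (not part of the original notes), the following Python sketch simulates an MA(1) process and compares the sample mean and variance with the theoretical values $\mu$ and $(1+\theta^2)\sigma^2$. The parameter values are arbitrary illustrations, and Gaussian white noise is used for concreteness.

```python
import numpy as np

rng = np.random.default_rng(0)
mu, theta, sigma = 2.0, 0.5, 1.0            # illustrative values, not from the notes
T = 200_000

eps = rng.normal(0.0, sigma, size=T + 1)    # white noise
y = mu + eps[1:] + theta * eps[:-1]         # Y_t = mu + e_t + theta * e_{t-1}

print("sample mean:", y.mean(), "theory:", mu)
print("sample var :", y.var(), "theory:", (1 + theta**2) * sigma**2)
```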


The first autocovariance is

$$E(Y_t - \mu)(Y_{t-1} - \mu) = E(\varepsilon_t + \theta \varepsilon_{t-1})(\varepsilon_{t-1} + \theta \varepsilon_{t-2}) = E(\varepsilon_t \varepsilon_{t-1} + \theta \varepsilon_{t-1}^2 + \theta \varepsilon_t \varepsilon_{t-2} + \theta^2 \varepsilon_{t-1} \varepsilon_{t-2}) = 0 + \theta \sigma^2 + 0 + 0 = \theta \sigma^2.$$

Higher autocovariances are all zero:

$$\gamma_j = E(Y_t - \mu)(Y_{t-j} - \mu) = E(\varepsilon_t + \theta \varepsilon_{t-1})(\varepsilon_{t-j} + \theta \varepsilon_{t-j-1}) = 0 \quad \text{for } j > 1.$$

Since the mean and the autocovariances are not functions of time, an MA(1) process is weakly stationary regardless of the value of $\theta$.

1.1.2 Check Ergodicity

It is clear that the condition

$$\sum_{j=0}^{\infty} |\gamma_j| = (1 + \theta^2)\sigma^2 + |\theta \sigma^2| < \infty$$

is satisfied. Thus the MA(1) process is ergodic.

1.1.3 The Dependence Structure

The $j$th autocorrelation of a weakly stationary process is defined as its $j$th autocovariance divided by the variance,

$$r_j = \frac{\gamma_j}{\gamma_0}.$$

By the Cauchy-Schwarz inequality, we have $|r_j| \leq 1$ for all $j$. From the above results, the autocorrelation of an MA(1) process is

$$r_j = \begin{cases} 1 & \text{when } j = 0 \\ \theta/(1 + \theta^2) & \text{when } j = 1 \\ 0 & \text{when } j > 1 \end{cases}.$$

The autocorrelation $r_j$ can be plotted as a function of $j$. This plot is usually called the correlogram. See the plots of p.50.
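The cutoff at lag 1 can be verified by simulation. A minimal sketch (my own addition, with arbitrary parameters); the helper name `sample_acf` is mine, not from the notes:

```python
import numpy as np

rng = np.random.default_rng(1)
theta, sigma, T = 0.5, 1.0, 200_000          # illustrative values
eps = rng.normal(0.0, sigma, T + 1)
y = eps[1:] + theta * eps[:-1]               # MA(1) with mu = 0

def sample_acf(x, j):
    """Sample autocorrelation r_j = estimated gamma_j / gamma_0."""
    x = x - x.mean()
    return (x[j:] * x[:len(x) - j]).mean() / (x * x).mean()

for j in range(4):
    theory = 1.0 if j == 0 else (theta / (1 + theta**2) if j == 1 else 0.0)
    print(j, round(sample_acf(y, j), 3), "theory:", round(theory, 3))
```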


1.2 The q-th Order Moving Average Process

A stochastic process $\{Y_t, t \in T\}$ is said to be a moving average process of order $q$ (MA(q)) if it can be expressed in the form

$$Y_t = \mu + \varepsilon_t + \theta_1 \varepsilon_{t-1} + \theta_2 \varepsilon_{t-2} + ... + \theta_q \varepsilon_{t-q},$$

where $\mu, \theta_1, \theta_2, ..., \theta_q$ are constants and $\varepsilon_t$ is a white-noise process.

1.2.1 Check Stationarity

The expectation of $Y_t$ is given by

$$E(Y_t) = E(\mu + \varepsilon_t + \theta_1 \varepsilon_{t-1} + ... + \theta_q \varepsilon_{t-q}) = \mu + E(\varepsilon_t) + \theta_1 E(\varepsilon_{t-1}) + ... + \theta_q E(\varepsilon_{t-q}) = \mu, \quad \text{for all } t \in T.$$

The variance of $Y_t$ is

$$\gamma_0 = E(Y_t - \mu)^2 = E(\varepsilon_t + \theta_1 \varepsilon_{t-1} + \theta_2 \varepsilon_{t-2} + ... + \theta_q \varepsilon_{t-q})^2.$$

Since the $\varepsilon_t$'s are uncorrelated, the variance is

$$\gamma_0 = \sigma^2 + \theta_1^2 \sigma^2 + \theta_2^2 \sigma^2 + ... + \theta_q^2 \sigma^2 = (1 + \theta_1^2 + \theta_2^2 + ... + \theta_q^2)\sigma^2.$$

For $j = 1, 2, ..., q$,

$$\begin{aligned}
\gamma_j &= E[(Y_t - \mu)(Y_{t-j} - \mu)] \\
&= E[(\varepsilon_t + \theta_1 \varepsilon_{t-1} + ... + \theta_q \varepsilon_{t-q}) \times (\varepsilon_{t-j} + \theta_1 \varepsilon_{t-j-1} + ... + \theta_q \varepsilon_{t-j-q})] \\
&= E[\theta_j \varepsilon_{t-j}^2 + \theta_{j+1} \theta_1 \varepsilon_{t-j-1}^2 + \theta_{j+2} \theta_2 \varepsilon_{t-j-2}^2 + .... + \theta_q \theta_{q-j} \varepsilon_{t-q}^2].
\end{aligned}$$

Terms involving $\varepsilon$'s at different dates have been dropped because their product has expectation zero, and $\theta_0$ is defined to be unity. For $j > q$, there are no $\varepsilon$'s with common dates in the definition of $\gamma_j$, and so the expectation is zero. Thus,

$$\gamma_j = \begin{cases} [\theta_j + \theta_{j+1} \theta_1 + \theta_{j+2} \theta_2 + .... + \theta_q \theta_{q-j}] \sigma^2 & \text{for } j = 1, 2, ..., q \\ 0 & \text{for } j > q \end{cases}.$$
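The closed-form autocovariance function just derived translates directly into code. A minimal sketch assuming numpy; the helper name `ma_autocov` is my own, and the MA(2) example below can be spot-checked against it:

```python
import numpy as np

def ma_autocov(thetas, sigma2, j):
    """gamma_j (j >= 0) for an MA(q) with coefficients theta_1, ..., theta_q."""
    th = np.concatenate(([1.0], np.asarray(thetas, dtype=float)))  # prepend theta_0 = 1
    q = len(th) - 1
    if j > q:
        return 0.0                 # no epsilon's with common dates
    # gamma_j = sigma^2 * (theta_j + theta_{j+1} theta_1 + ... + theta_q theta_{q-j})
    return sigma2 * float(np.dot(th[j:], th[:q - j + 1]))
```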


For example, for an MA(2) process,

$$\begin{aligned}
\gamma_0 &= [1 + \theta_1^2 + \theta_2^2] \sigma^2 \\
\gamma_1 &= [\theta_1 + \theta_2 \theta_1] \sigma^2 \\
\gamma_2 &= [\theta_2] \sigma^2 \\
\gamma_3 &= \gamma_4 = .... = 0.
\end{aligned}$$

For any value of $(\theta_1, \theta_2, ..., \theta_q)$, the MA(q) process is thus weakly stationary.

1.2.2 Check Ergodicity

It is clear that the condition

$$\sum_{j=0}^{\infty} |\gamma_j| < \infty$$

is satisfied. Thus the MA(q) process is ergodic.

1.2.3 The Dependence Structure

The autocorrelation function is zero after $q$ lags. See the plots of p.50.

1.3 The Infinite-Order Moving Average Process

A stochastic process $\{Y_t, t \in T\}$ is said to be an infinite-order moving average process (MA($\infty$)) if it can be expressed in the form

$$Y_t = \mu + \sum_{j=0}^{\infty} \varphi_j \varepsilon_{t-j} = \mu + \varphi_0 \varepsilon_t + \varphi_1 \varepsilon_{t-1} + \varphi_2 \varepsilon_{t-2} + .....,$$

where $\mu, \varphi_0, \varphi_1, \varphi_2, ...$ are constants with $\varphi_0 = 1$ and $\varepsilon_t$ is a white-noise process.
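Before turning to the MA($\infty$) case, here is a simulation check (my own addition, with arbitrary coefficients) of the MA(2) formulas and of the cutoff after $q = 2$ lags stated in Section 1.2.3:

```python
import numpy as np

rng = np.random.default_rng(4)
theta1, theta2, sigma = 0.4, 0.3, 1.0        # arbitrary illustration
T = 500_000
eps = rng.normal(0.0, sigma, T + 2)
y = eps[2:] + theta1 * eps[1:-1] + theta2 * eps[:-2]   # MA(2) with mu = 0

gamma0 = (1 + theta1**2 + theta2**2) * sigma**2        # theoretical values
theory = [1.0, (theta1 + theta2 * theta1) * sigma**2 / gamma0,
          theta2 * sigma**2 / gamma0, 0.0, 0.0]

yc = y - y.mean()
g0_hat = (yc * yc).mean()
for j in range(5):
    r_j = (yc[j:] * yc[:T - j]).mean() / g0_hat
    print(j, round(r_j, 3), "theory:", round(theory[j], 3))   # r_3, r_4 near zero
```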


1.3.1 Is This a Well-Defined Random Sequence?

A sequence $\{\varphi_j\}_{j=0}^{\infty}$ is said to be square-summable if

$$\sum_{j=0}^{\infty} \varphi_j^2 < \infty,$$

whereas a sequence $\{\varphi_j\}_{j=0}^{\infty}$ is said to be absolutely summable if

$$\sum_{j=0}^{\infty} |\varphi_j| < \infty.$$

Absolute summability implies square-summability, but the converse does not hold. The infinite sum $\sum_{j=0}^{\infty} \varphi_j \varepsilon_{t-j}$ is understood as the mean-square limit of the partial sums $\sum_{j=0}^{N} \varphi_j \varepsilon_{t-j}$. By the Cauchy criterion, this limit exists if and only if, for any $\varsigma > 0$, there exists a suitably large $N$ such that for any integer $M > N$

$$E\left[\sum_{j=0}^{M} \varphi_j \varepsilon_{t-j} - \sum_{j=0}^{N} \varphi_j \varepsilon_{t-j}\right]^2 < \varsigma. \tag{1}$$

In words, once $N$ terms have been summed, the difference between that sum and the one obtained from summing to $M$ is a random variable whose mean and variance are both arbitrarily close to zero. Now the left-hand side of (1) is simply

$$E\left[\varphi_M \varepsilon_{t-M} + \varphi_{M-1} \varepsilon_{t-M+1} + .... + \varphi_{N+1} \varepsilon_{t-N-1}\right]^2 = (\varphi_M^2 + \varphi_{M-1}^2 + ... + \varphi_{N+1}^2)\sigma^2 = \left[\sum_{j=0}^{M} \varphi_j^2 - \sum_{j=0}^{N} \varphi_j^2\right]\sigma^2. \tag{2}$$

But if $\sum_{j=0}^{\infty} \varphi_j^2 < \infty$, then by the Cauchy criterion the right side of (2) may be made as small as desired by choosing $N$ suitably large. Thus the MA($\infty$) process is a well-defined random sequence, since the infinite series $\sum_{j=0}^{\infty} \varphi_j \varepsilon_{t-j}$ converges in mean square.
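A small numerical illustration of the Cauchy-criterion argument (my own addition): for the square-summable choice $\varphi_j = 0.9^j$, the right-hand side of (2), the variance of the omitted tail, shrinks rapidly as $N$ grows.

```python
import numpy as np

# Tail of eq. (2): E[sum_{j=N+1}^{M} phi_j e_{t-j}]^2 = sigma^2 * sum_{j=N+1}^{M} phi_j^2.
# With phi_j = 0.9**j (square-summable), the bound shrinks as N grows.
sigma2 = 1.0
varphi = 0.9 ** np.arange(10_000)       # truncation approximates the infinite sequence
for N in [10, 50, 100, 500]:
    tail = sigma2 * np.sum(varphi[N + 1:] ** 2)
    print(f"N={N:4d}  tail variance = {tail:.3e}")
```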


1.3.2 Check Stationarity

Assume the MA($\infty$) process to have absolutely summable coefficients. The expectation of $Y_t$ is given by

$$E(Y_t) = \lim_{T \to \infty} E(\mu + \varphi_0 \varepsilon_t + \varphi_1 \varepsilon_{t-1} + \varphi_2 \varepsilon_{t-2} + .... + \varphi_T \varepsilon_{t-T}) = \mu.$$

The variance of $Y_t$ is

$$\gamma_0 = E(Y_t - \mu)^2 = \lim_{T \to \infty} E(\varphi_0 \varepsilon_t + \varphi_1 \varepsilon_{t-1} + \varphi_2 \varepsilon_{t-2} + .... + \varphi_T \varepsilon_{t-T})^2 = \lim_{T \to \infty} (\varphi_0^2 + \varphi_1^2 + \varphi_2^2 + .... + \varphi_T^2)\sigma^2.$$

For $j > 0$,

$$\gamma_j = E(Y_t - \mu)(Y_{t-j} - \mu) = (\varphi_j \varphi_0 + \varphi_{j+1} \varphi_1 + \varphi_{j+2} \varphi_2 + \varphi_{j+3} \varphi_3 + ....)\sigma^2 = \sigma^2 \sum_{k=0}^{\infty} \varphi_{j+k} \varphi_k.$$

Thus, $E(Y_t)$ and $\gamma_j$ are both finite and independent of $t$. The MA($\infty$) process with absolutely summable coefficients is weakly stationary.

1.3.3 Check Ergodicity

Proposition:
The absolute summability of the moving average coefficients implies that the process is ergodic.

Proof:
Recall that the autocovariance of an MA($\infty$) process is

$$\gamma_j = \sigma^2 \sum_{k=0}^{\infty} \varphi_{j+k} \varphi_k.$$


Then

$$|\gamma_j| = \sigma^2 \left| \sum_{k=0}^{\infty} \varphi_{j+k} \varphi_k \right| \leq \sigma^2 \sum_{k=0}^{\infty} |\varphi_{j+k} \varphi_k|,$$

and

$$\sum_{j=0}^{\infty} |\gamma_j| \leq \sigma^2 \sum_{j=0}^{\infty} \sum_{k=0}^{\infty} |\varphi_{j+k} \varphi_k| = \sigma^2 \sum_{j=0}^{\infty} \sum_{k=0}^{\infty} |\varphi_{j+k}| |\varphi_k| = \sigma^2 \sum_{k=0}^{\infty} |\varphi_k| \sum_{j=0}^{\infty} |\varphi_{j+k}|.$$

But there exists an $M < \infty$ such that $\sum_{j=0}^{\infty} |\varphi_j| < M$, and therefore $\sum_{j=0}^{\infty} |\varphi_{j+k}| < M$ for $k = 0, 1, 2, ...$, meaning that

$$\sum_{j=0}^{\infty} |\gamma_j| < \sigma^2 \sum_{k=0}^{\infty} |\varphi_k| M < \sigma^2 M^2 < \infty.$$

Hence, the MA($\infty$) process with absolutely summable coefficients is ergodic.
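The bound in the proof can also be illustrated numerically (my own addition). For the arbitrary absolutely summable choice $\varphi_j = 0.8^j$ we have $M = \sum_j |\varphi_j| = 5$, and the autocovariances indeed sum to less than $\sigma^2 M^2$:

```python
import numpy as np

# Check sum_j |gamma_j| < sigma^2 * M^2 for phi_j = 0.8**j, where
# gamma_j = sigma^2 * sum_k phi_{j+k} phi_k and M = sum_j |phi_j| = 5.
sigma2 = 1.0
varphi = 0.8 ** np.arange(5_000)             # truncation approximates the infinite sums
M = np.abs(varphi).sum()
gam = [sigma2 * np.dot(varphi[j:], varphi[:len(varphi) - j]) for j in range(300)]
print("sum |gamma_j| ~=", sum(abs(g) for g in gam))   # about 13.9
print("bound sigma^2 * M^2 =", sigma2 * M**2)         # 25.0
```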


2 Autoregressive Process

2.1 The First-Order Autoregressive Process

A stochastic process $\{Y_t, t \in T\}$ is said to be a first-order autoregressive process (AR(1)) if it can be expressed in the form

$$Y_t = c + \phi Y_{t-1} + \varepsilon_t,$$

where $c$ and $\phi$ are constants and $\varepsilon_t$ is a white-noise process.

2.1.1 Check Stationarity and Ergodicity

Write the AR(1) process in lag operator form:

$$Y_t = c + \phi L Y_t + \varepsilon_t,$$

then

$$(1 - \phi L) Y_t = c + \varepsilon_t.$$

In the case $|\phi| < 1$, we know from the properties of the lag operator in the last chapter that

$$(1 - \phi L)^{-1} = 1 + \phi L + \phi^2 L^2 + ....,$$

thus

$$\begin{aligned}
Y_t &= (1 + \phi L + \phi^2 L^2 + ....)(c + \varepsilon_t) \\
&= (c + \phi L c + \phi^2 L^2 c + ...) + (\varepsilon_t + \phi L \varepsilon_t + \phi^2 L^2 \varepsilon_t + ...) \\
&= (c + \phi c + \phi^2 c + ...) + (\varepsilon_t + \phi \varepsilon_{t-1} + \phi^2 \varepsilon_{t-2} + ...) \\
&= \frac{c}{1 - \phi} + \varepsilon_t + \phi \varepsilon_{t-1} + \phi^2 \varepsilon_{t-2} + ....
\end{aligned}$$

This can be viewed as an MA($\infty$) process with $\varphi_j$ given by $\phi^j$. When $|\phi| < 1$, this AR(1) is an MA($\infty$) with absolutely summable coefficients:

$$\sum_{j=0}^{\infty} |\varphi_j| = \sum_{j=0}^{\infty} |\phi|^j = \frac{1}{1 - |\phi|} < \infty.$$

Therefore, the AR(1) process is stationary and ergodic provided that $|\phi| < 1$.
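The equivalence between the AR(1) recursion and its MA($\infty$) representation can be checked by simulation. A minimal sketch (my own addition, with arbitrary parameters) comparing the recursively generated $Y_t$ with the truncated sum $c/(1-\phi) + \sum_{j<J} \phi^j \varepsilon_{t-j}$:

```python
import numpy as np

rng = np.random.default_rng(2)
c, phi, sigma = 1.0, 0.7, 1.0      # illustrative values with |phi| < 1
T, J = 200, 200                    # J = truncation point of the MA(inf) sum

eps = rng.normal(0.0, sigma, T + J)

# AR(1) by recursion, started at the stationary mean c / (1 - phi)
y = np.empty(T + J)
y[0] = c / (1 - phi)
for t in range(1, T + J):
    y[t] = c + phi * y[t - 1] + eps[t]

# Truncated MA(inf) representation at the last date
t = T + J - 1
ma_approx = c / (1 - phi) + sum(phi**j * eps[t - j] for j in range(J))
print(y[t], ma_approx)             # nearly identical for large J
```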


2.1.2 The Dependence Structure

The expectation of $Y_t$ is given by

$$E(Y_t) = E\left(\frac{c}{1 - \phi} + \varepsilon_t + \phi \varepsilon_{t-1} + \phi^2 \varepsilon_{t-2} + ...\right) = \frac{c}{1 - \phi} = \mu.$$

The variance of $Y_t$ is

$$\gamma_0 = E(Y_t - \mu)^2 = E(\varepsilon_t + \phi \varepsilon_{t-1} + \phi^2 \varepsilon_{t-2} + ....)^2 = (1 + \phi^2 + \phi^4 + ....)\sigma^2 = \left(\frac{1}{1 - \phi^2}\right)\sigma^2.$$

For $j > 0$,

$$\begin{aligned}
\gamma_j &= E(Y_t - \mu)(Y_{t-j} - \mu) \\
&= E[(\varepsilon_t + \phi \varepsilon_{t-1} + .... + \phi^j \varepsilon_{t-j} + \phi^{j+1} \varepsilon_{t-j-1} + \phi^{j+2} \varepsilon_{t-j-2} + ....) \times (\varepsilon_{t-j} + \phi \varepsilon_{t-j-1} + \phi^2 \varepsilon_{t-j-2} + ....)] \\
&= (\phi^j + \phi^{j+2} + \phi^{j+4} + ...)\sigma^2 = \phi^j (1 + \phi^2 + \phi^4 + ....)\sigma^2 = \left(\frac{\phi^j}{1 - \phi^2}\right)\sigma^2.
\end{aligned}$$

It follows that the autocorrelation function is

$$r_j = \frac{\gamma_j}{\gamma_0} = \phi^j,$$

which follows a pattern of geometric decay, as in the plot on p.50.
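A simulation check of the geometric decay $r_j = \phi^j$ and of $\gamma_0 = \sigma^2/(1-\phi^2)$ (my own addition; parameters arbitrary):

```python
import numpy as np

rng = np.random.default_rng(3)
phi, sigma, T = 0.7, 1.0, 500_000       # illustrative values, |phi| < 1
eps = rng.normal(0.0, sigma, T)
y = np.zeros(T)
for t in range(1, T):
    y[t] = phi * y[t - 1] + eps[t]      # AR(1) with c = 0, so mu = 0

yc = y - y.mean()
gamma0 = (yc * yc).mean()
print("gamma0:", round(gamma0, 3), "theory:", round(sigma**2 / (1 - phi**2), 3))
for j in range(1, 5):
    r_j = (yc[j:] * yc[:T - j]).mean() / gamma0
    print(j, round(r_j, 3), "theory:", round(phi**j, 3))
```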


2.1.3 An Alternative Way to Calculate the Moments of a Stationary AR(1) Process

Assume that the AR(1) process under consideration is weakly stationary. Taking expectations on both sides, we have

$$E(Y_t) = c + \phi E(Y_{t-1}) + E(\varepsilon_t).$$

Since by assumption the process is stationary,

$$E(Y_t) = E(Y_{t-1}) = \mu.$$

Therefore,

$$\mu = c + \phi \mu + 0 \quad \text{or} \quad \mu = \frac{c}{1 - \phi},$$

reproducing the earlier result.

To find higher moments of $Y_t$ in an analogous manner, we rewrite this AR(1) as

$$Y_t = \mu(1 - \phi) + \phi Y_{t-1} + \varepsilon_t$$

or

$$(Y_t - \mu) = \phi(Y_{t-1} - \mu) + \varepsilon_t. \tag{3}$$

For $j \geq 0$, multiply both sides of (3) by $(Y_{t-j} - \mu)$ and take expectations:

$$\gamma_j = E[(Y_t - \mu)(Y_{t-j} - \mu)] = \phi E[(Y_{t-1} - \mu)(Y_{t-j} - \mu)] + E[(Y_{t-j} - \mu)\varepsilon_t] = \phi \gamma_{j-1} + E[(Y_{t-j} - \mu)\varepsilon_t].$$

Next we consider the term $E[(Y_{t-j} - \mu)\varepsilon_t]$. When $j = 0$, multiply both sides of (3) by $\varepsilon_t$ and take expectations:

$$E(Y_t - \mu)\varepsilon_t = E[\phi(Y_{t-1} - \mu)\varepsilon_t] + E(\varepsilon_t^2).$$

Recall that $Y_{t-1} - \mu$ is a linear function of $\varepsilon_{t-1}, \varepsilon_{t-2}, ...$:

$$Y_{t-1} - \mu = \varepsilon_{t-1} + \phi \varepsilon_{t-2} + \phi^2 \varepsilon_{t-3} + .....$$

We have

$$E[\phi(Y_{t-1} - \mu)\varepsilon_t] = 0.$$
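The recursion $\gamma_j = \phi \gamma_{j-1}$ implied above can already be cross-checked against the closed form from Section 2.1.2. A minimal sketch (my own addition), anticipating that $E[(Y_{t-j} - \mu)\varepsilon_t] = 0$ for $j \geq 1$:

```python
# Seeded with gamma_0 = sigma^2 / (1 - phi^2), the recursion
# gamma_j = phi * gamma_{j-1} (j >= 1) reproduces the closed form
# gamma_j = phi^j * sigma^2 / (1 - phi^2) from Section 2.1.2.
phi, sigma2 = 0.7, 1.0                  # illustrative values
gamma = sigma2 / (1 - phi**2)           # gamma_0
for j in range(5):
    closed_form = phi**j * sigma2 / (1 - phi**2)
    print(j, gamma, closed_form)        # the two columns coincide
    gamma *= phi                        # step the recursion: gamma_{j+1}
```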
