Ch. 14 Stationary ARMA Process

A general linear stochastic model is described that supposes a time series to be generated by a linear aggregation of random shocks. For practical representation it is desirable to employ models that use parameters parsimoniously. Parsimony may often be achieved by representing the linear process in terms of a small number of autoregressive and moving average terms. This chapter introduces univariate ARMA processes, which provide a very useful class of models for describing the dynamics of an individual time series. Throughout this chapter we assume the time index set $T$ to be $T = \{\ldots, -2, -1, 0, 1, 2, \ldots\}$.

1 Moving Average Process

1.1 The First-Order Moving Average Process

A stochastic process $\{Y_t,\ t \in T\}$ is said to be a first-order moving average process (MA(1)) if it can be expressed in the form
$$Y_t = \mu + \varepsilon_t + \theta \varepsilon_{t-1},$$
where $\mu$ and $\theta$ are constants and $\{\varepsilon_t\}$ is a white-noise process. Remember that a white-noise process $\{\varepsilon_t,\ t \in T\}$ is one for which $E(\varepsilon_t) = 0$ and
$$E(\varepsilon_t \varepsilon_s) = \begin{cases} \sigma^2 & \text{when } t = s, \\ 0 & \text{when } t \neq s. \end{cases}$$

1.1.1 Check Stationarity

The expectation of $Y_t$ is given by
$$E(Y_t) = E(\mu + \varepsilon_t + \theta \varepsilon_{t-1}) = \mu + E(\varepsilon_t) + \theta E(\varepsilon_{t-1}) = \mu, \quad \text{for all } t \in T.$$
The variance of $Y_t$ is
$$E(Y_t - \mu)^2 = E(\varepsilon_t + \theta \varepsilon_{t-1})^2 = E\left(\varepsilon_t^2 + 2\theta \varepsilon_t \varepsilon_{t-1} + \theta^2 \varepsilon_{t-1}^2\right) = \sigma^2 + 0 + \theta^2 \sigma^2 = (1 + \theta^2)\sigma^2.$$
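The two moments derived above can be checked numerically. The following is a minimal simulation sketch, not part of the original notes: it assumes Gaussian white noise and illustrative parameter values ($\mu = 2$, $\theta = 0.5$, $\sigma = 1$, chosen only for the example), simulates a long MA(1) path, and compares the sample mean and variance with $\mu$ and $(1 + \theta^2)\sigma^2$.

```python
# Sketch (assumed example, not from the notes): simulate an MA(1) process
# Y_t = mu + eps_t + theta * eps_{t-1} with Gaussian white noise and compare
# sample moments with the theoretical mean mu and variance (1 + theta^2) * sigma^2.
import numpy as np

rng = np.random.default_rng(0)

mu, theta, sigma = 2.0, 0.5, 1.0   # illustrative parameter values (assumed)
n = 100_000                        # long sample so the estimates are stable

eps = rng.normal(0.0, sigma, size=n + 1)   # white noise eps_0, ..., eps_n
y = mu + eps[1:] + theta * eps[:-1]        # Y_t = mu + eps_t + theta * eps_{t-1}

print("sample mean      :", y.mean())                  # close to mu
print("theoretical mean :", mu)
print("sample variance  :", y.var())                   # close to (1 + theta^2) * sigma^2
print("theoretical var  :", (1 + theta**2) * sigma**2)
```

For a long enough sample the estimates settle near the theoretical values for every $t$, which is consistent with the constant mean and variance computed above.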