
Stochastic Estimation and Control (English edition), Lecture 11: Last time: Ergodic processes



16.322 Stochastic Estimation and Control, Fall 2004
Prof. Vander Velde

Lecture 11

Last time: Ergodic processes

An ergodic process is necessarily stationary.

Example: Binary process

[Figure: a sample member of the ensemble, taking the values $+x_0$ or $-x_0$, with possible polarity changes at the regularly spaced instants $t$, $t+T$, $t+2T$.]

At each time step the signal may switch polarity or stay the same. Both $+x_0$ and $-x_0$ are equally likely. Is it stationary and is it ergodic? For this distribution, we expect most of the members of the ensemble to have a change point near $t=0$.


$$R_{xx}(t_1,t_2) = E[x(t_1)x(t_2)] \approx \overline{x(t_1)x(t_2)}$$

$$R_{xx}(t_3,t_4) \approx \overline{x(t_3)x(t_4)}, \qquad t_4 - t_3 = t_2 - t_1$$

The chance of spanning a change point is the same over each regular interval, so the process is stationary.

Is it ergodic? Some ensemble members possess properties which are not representative of the ensemble as a whole. As an infinite set, the probability that any such member of the ensemble occurs is zero.
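To make the stationarity and ergodicity checks concrete, here is a small Monte Carlo sketch. It is not from the notes: the switching rule (an independent, equal-probability decision to flip or hold at each instant) and all parameters are assumptions chosen for illustration.

```python
# Minimal sketch (not from the notes): a binary switching process ensemble.
# Switching rule and parameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
x0 = 1.0          # amplitude: the signal is +x0 or -x0
n_steps = 200     # number of switching intervals per member
n_members = 5000  # ensemble size

# Each member starts at +x0 or -x0 with equal probability and may flip polarity
# (assumed probability 1/2) at each regularly spaced instant.
start = rng.choice([-x0, x0], size=(n_members, 1))
flips = rng.choice([1.0, -1.0], size=(n_members, n_steps - 1))
signs = np.cumprod(np.concatenate([np.ones((n_members, 1)), flips], axis=1), axis=1)
x = start * signs            # x[i, k] = value of member i during interval k

# Ensemble-average correlation for two pairs of times with the same separation:
# consistent with stationarity, it depends only on the separation.
print(np.mean(x[:, 10] * x[:, 12]), np.mean(x[:, 50] * x[:, 52]))

# Time average over one member vs. ensemble average at a fixed time
# (they agree for an ergodic process as the record grows long).
print(np.mean(x[0, :]), np.mean(x[:, 10]))
```

A member that happens never to switch would give a time average of $\pm x_0$ instead of the ensemble mean of zero; the measure-zero argument below is about why such exceptional members do not spoil ergodicity.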


[Figure: two sample functions $x(t)$ drawn over the instants $t$, $t+T$, $t+2T$.]

All of these exceptional points are associated with rational points. They are a countable infinity, which constitutes a set of zero measure. The complementary set of processes is an uncountable infinity, associated with irrational numbers, which constitutes a set of measure one.


For ergodic processes:

$$\overline{x} = E[x(t)] = \lim_{T\to\infty}\frac{1}{2T}\int_{-T}^{T}x(t)\,dt$$

$$\overline{x^2} = E[x(t)^2] = \lim_{T\to\infty}\frac{1}{2T}\int_{-T}^{T}x(t)^2\,dt$$

$$R_{xx}(\tau) = E[x(t)x(t+\tau)] = \lim_{T\to\infty}\frac{1}{2T}\int_{-T}^{T}x(t)x(t+\tau)\,dt$$

$$R_{xy}(\tau) = E[x(t)y(t+\tau)] = \lim_{T\to\infty}\frac{1}{2T}\int_{-T}^{T}x(t)y(t+\tau)\,dt$$

A time-invariant system may be defined as one such that any translation in time of the input affects the output only by an equal translation in time.

[Figure: a block labeled "System" with input $u(t)$ and output $y(t)$.]

This system will be considered time invariant if for every $\tau$ the input $u(t+\tau)$ causes the output $y(t+\tau)$. Note that the system may be either linear or non-linear.

It is proved directly that if $u(t)$ is a stationary random process having the ergodic property and the system is time invariant, then $y(t)$ is a stationary random process having the ergodic property, in the steady state. This requires the system to be stable, so a defined steady state exists, and to have been operating in the presence of the input for all past time.
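As an illustration of these time averages, the sketch below approximates them with a finite averaging time from a single realization. It is not from the notes; the test signal (first-order filtered white noise) and its correlation time are arbitrary assumptions.

```python
# Minimal sketch (not from the notes): finite-record approximations to the time
# averages above, computed from a single realization. The test process, a
# first-order low-pass filtered white noise, is an arbitrary assumption.
import numpy as np

rng = np.random.default_rng(1)
dt = 0.01
n = 100_000                       # record length of 1000 s
tc = 0.5                          # assumed correlation time of the test signal, s
a = np.exp(-dt / tc)

w = rng.standard_normal(n)
x = np.empty(n)
x[0] = 0.0
for k in range(1, n):             # simple first-order filter driven by white noise
    x[k] = a * x[k - 1] + np.sqrt(1.0 - a**2) * w[k]

def time_autocorrelation(x, max_lag):
    """Estimate R_xx(tau) by averaging x(t) x(t+tau) over the finite record."""
    return np.array([np.mean(x[: x.size - k] * x[k:]) for k in range(max_lag)])

x_mean = np.mean(x)               # finite-T estimate of E[x(t)]
x_msq = np.mean(x**2)             # finite-T estimate of E[x(t)^2]
R = time_autocorrelation(x, 300)  # R[0] should match x_msq
print(x_mean, x_msq, R[0])
```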


Example: Calculation of an autocorrelation function

Ensemble: $x(t) = A\sin(\omega t + \theta)$, where $\theta$ and $A$ are independent random variables and $\theta$ is uniformly distributed over $[0, 2\pi]$.

This process is stationary (the uniform distribution of $\theta$ hints at this) but not ergodic. Unless we are certain of stationarity, we should calculate:

$$R_{xx}(t_1,t_2) = \overline{x(t_1)x(t_2)} = \int_0^{\infty}dA\int_0^{2\pi}d\theta\, f_A(A)\,\frac{1}{2\pi}\,A^2\sin(\omega t_1+\theta)\sin(\omega t_2+\theta)$$

Using the identity

$$\sin A\sin B = \frac{1}{2}\left[\cos(A-B) - \cos(A+B)\right]$$

$$R_{xx}(t_1,t_2) = \frac{1}{2}\overline{A^2}\,\frac{1}{2\pi}\int_0^{2\pi}\left\{\cos\left[\omega(t_2-t_1)\right] - \cos\left[\omega(t_1+t_2)+2\theta\right]\right\}d\theta = \frac{1}{2}\overline{A^2}\cos\omega\tau, \qquad \tau = t_2 - t_1$$

So the autocorrelation function is sinusoidal with the same frequency.

This periodic property is true in general. If all members of a stationary ensemble are periodic, $x(t+nT) = x(t)$, then

$$R_{xx}(\tau + nT) = \overline{x(t)x(t+\tau+nT)} = \overline{x(t)x(t+\tau)} = R_{xx}(\tau)$$
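As a sanity check on $R_{xx}(\tau) = \tfrac{1}{2}\overline{A^2}\cos\omega\tau$, the sketch below (not from the notes) averages $x(t_1)x(t_2)$ over a simulated ensemble. The frequency, the two sample times, and the Rayleigh amplitude distribution are arbitrary illustrative assumptions.

```python
# Minimal sketch (not from the notes): Monte Carlo check of the ensemble average
# x(t1) x(t2) against (1/2) E[A^2] cos(w tau). The frequency, sample times, and
# Rayleigh amplitude distribution are assumed for illustration.
import numpy as np

rng = np.random.default_rng(2)
n = 500_000
w = 2.0 * np.pi * 3.0                     # assumed frequency, rad/s
theta = rng.uniform(0.0, 2.0 * np.pi, n)  # phase uniform over [0, 2*pi)
A = rng.rayleigh(scale=1.0, size=n)       # independent random amplitude

t1, t2 = 0.13, 0.37                       # two arbitrary sample times
tau = t2 - t1

ensemble_avg = np.mean(A * np.sin(w * t1 + theta) * A * np.sin(w * t2 + theta))
theory = 0.5 * np.mean(A**2) * np.cos(w * tau)
print(ensemble_avg, theory)               # the two estimates should agree closely
```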


Identification of a periodic signal in noise

We have recorded a signal from an experimental device which looks like just hash. It is of interest to know if there are periodic components contained in it. Consider

$$x(t) = R(t) + P(t)$$

where $P(t)$ is any periodic function of period $T$ and $R(t)$ is a random process independent of $P$.

If $R$ is a stationary random process, use $\tau = t_2 - t_1$:

$$R_{xx}(\tau) = R_{RR}(\tau) + R_{RP}(\tau) + R_{PR}(\tau) + R_{PP}(\tau) = R_{RR}(\tau) + R_{PP}(\tau) + 2\overline{R}\,\overline{P}$$

This usually makes the periodic component obvious. If $P$ contains more than one frequency component, $R_{PP}(\tau)$ will contain the same components.

Note that this depends on $P(t)$ being truly periodic, not just oscillatory. The coherence time (over which phase is maintained) must exceed the correlation time of the signal.
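A small simulation of this idea (not part of the notes), with an assumed weak 5 Hz component buried in much stronger wideband noise: beyond the noise correlation time, the estimated $R_{xx}(\tau)$ keeps oscillating at the period of $P(t)$, exposing the periodic component.

```python
# Minimal sketch (not from the notes): a weak periodic component hidden in strong
# noise shows up in the autocorrelation at lags beyond the noise correlation time.
# Frequency, amplitude, and noise level are assumed for illustration.
import numpy as np

rng = np.random.default_rng(3)
dt = 0.001
t = np.arange(0.0, 200.0, dt)
P = 0.2 * np.sin(2.0 * np.pi * 5.0 * t)   # weak 5 Hz periodic component
R = rng.standard_normal(t.size)           # strong wideband noise ("hash")
x = R + P                                 # recorded signal

def time_autocorrelation(x, lags):
    return np.array([np.mean(x[: x.size - k] * x[k:]) for k in lags])

lags = np.arange(10, 2000, 10)            # lags from 0.01 s to 2 s
Rxx = time_autocorrelation(x, lags)
# Past the (essentially zero) correlation time of the white noise, Rxx oscillates
# at 5 Hz with amplitude near 0.2**2 / 2 = 0.02: the hidden periodic component.
print(Rxx[:10])
```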


Detection of a known signal in noise

Communication systems depend on this technique for the detection of very weak signals of known form in strong noise. This is how the Lincoln Laboratory radar engineers "touched" Venus by radar, and RLE people later "touched" the moon by laser.

A signal of known form is transmitted, $s(t)$. Upon receipt it is badly corrupted with noise so that no recognizable waveform appears.

Received message: $m(t) = k\,s(t - t_1) + n(t)$

The received message is cross-correlated with a signal of the known waveform. If the time of arrival is not known, the cross-correlation is carried out for various values of $\tau$.

$$R_{sm}(\tau) = \overline{s(t)m(t+\tau)} = \overline{k\,s(t)s(t+\tau-t_1) + s(t)n(t+\tau)} = k\,\overline{s(t)s(t+\tau-t_1)} + \overline{s(t)n(t+\tau)} = k\,R_{ss}(\tau - t_1)$$

The signal is designed with zero mean to eliminate the second term.

[Figure: $R_{sm}(\tau)$ versus $\tau$, compared against a detection threshold.]

The use of these correlation functions implies signals which continue for all time. Actually it is finite data in these cases, and functions similar to the $R$ functions are used which involve integration without averaging. However, the notions are analogous to these. This is the basis for correlation detection. The same result is obtained, starting from a different point of view, with the matched filter.

This also provides an estimate of $k$ and of $t_1$. GPS uses the $t_1$ estimate.
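The sketch below (not from the notes) mimics correlation detection with a pseudorandom zero-mean waveform; the gain $k$, the delay $t_1$, and the noise level are invented for illustration, and a circular shift stands in for the true delay.

```python
# Minimal sketch (not from the notes) of correlation detection: cross-correlate
# the received message with the known waveform and locate the peak. The waveform,
# gain k, delay t1, and noise level are assumed; a circular shift models the delay.
import numpy as np

rng = np.random.default_rng(4)
n = 20_000
s = rng.choice([-1.0, 1.0], size=n)           # known zero-mean pseudorandom waveform
k_true, t1_true = 0.05, 3500                  # "unknown" gain and delay (in samples)

m = k_true * np.roll(s, t1_true) + rng.standard_normal(n)   # received message

# Finite-data analogue of R_sm(tau): average s(t) m(t+tau) over the record.
lags = np.arange(0, 5000)
Rsm = np.array([np.mean(s * np.roll(m, -k)) for k in lags])

t1_hat = int(lags[np.argmax(Rsm)])            # peak of k*R_ss(tau - t1) sits at tau = t1
k_hat = Rsm[t1_hat] / np.mean(s * s)          # R_ss(0) is the mean square of s
print(t1_hat, k_hat)                          # should come out near 3500 and 0.05
```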


The two-sided Fourier transform is used as it defines the behavior of the signal for negative time. This is important so that this can be set to zero for causal systems. You know that the use of transforms is convenient in the analysis of time-invariant linear systems. The same is true of the study of stationary processes in time-invariant linear systems.

Using the two-sided Fourier transform, the transform of the autocorrelation function is

$$S_{xx}(\omega) = \int_{-\infty}^{\infty} R_{xx}(\tau)\,e^{-j\omega\tau}\,d\tau$$

This is called the power spectral density function. We reject any real part in the argument of the exponential function, as it would diverge for negative time.

Properties of $S_{xx}(\omega)$

$$S_{xx}(\omega) = \int_{-\infty}^{\infty} R_{xx}(\tau)\left[\cos\omega\tau - j\sin\omega\tau\right]d\tau = \int_{-\infty}^{\infty} R_{xx}(\tau)\cos\omega\tau\,d\tau = 2\int_{0}^{\infty} R_{xx}(\tau)\cos\omega\tau\,d\tau$$

(the $\sin\omega\tau$ term integrates to zero because $R_{xx}(\tau)$ is an even function). We see that a PSD is a real function, even in $\omega$, and non-negative:

$$S_{xx}(\omega) \in \mathbb{R} \quad \forall\,\omega$$

$$S_{xx}(\omega) \ge 0 \quad \forall\,\omega$$

$$S_{xx}(-\omega) = S_{xx}(\omega)$$

The inverse relation between $S_{xx}(\omega)$ and $R_{xx}(\tau)$ is

$$R_{xx}(\tau) = \frac{1}{2\pi}\int_{-\infty}^{\infty} S_{xx}(\omega)\,e^{j\omega\tau}\,d\omega$$

Note that

$$E\left[x(t)^2\right] = R_{xx}(0) = \frac{1}{2\pi}\int_{-\infty}^{\infty} S_{xx}(\omega)\,d\omega$$
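A numerical check of these relations (not from the notes), using the known transform pair $R_{xx}(\tau) = e^{-|\tau|}$, $S_{xx}(\omega) = 2/(1+\omega^2)$ as a test case:

```python
# Minimal sketch (not from the notes): numerical check of the transform pair for
# R_xx(tau) = exp(-|tau|), whose PSD is known to be 2 / (1 + w^2).
import numpy as np

tau = np.arange(-50.0, 50.0, 0.01)            # truncated, finely sampled lag axis
dtau = tau[1] - tau[0]
Rxx = np.exp(-np.abs(tau))

def psd(w):
    """S_xx(w) = integral of R_xx(tau) exp(-j w tau) d tau, done numerically."""
    return np.sum(Rxx * np.exp(-1j * w * tau)) * dtau

for w in (0.0, 1.0, 3.0):
    S = psd(w)                                # imaginary part is negligible: S is real
    print(w, S.real, 2.0 / (1.0 + w**2), psd(-w).real)   # and even in w

# Check E[x^2] = R_xx(0) = (1/(2*pi)) * integral of S_xx(w) dw
w_axis = np.arange(-200.0, 200.0, 0.05)
S_axis = np.array([psd(w).real for w in w_axis])
print(np.sum(S_axis) * 0.05 / (2.0 * np.pi), Rxx[tau.size // 2])
```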


Power means mean squared value. The PSD gives the spectral distribution of power density.

$$\overline{y^2(t)} = \frac{1}{\pi}\int_{\omega_1}^{\omega_2} S_{xx}(\omega)\,d\omega = \text{mean squared value of the frequency components in } x(t) \text{ in the range } \omega_1 \text{ to } \omega_2$$

Here $y(t)$ denotes the part of $x(t)$ in that frequency band; the factor is $1/\pi$ rather than $1/2\pi$ because the matching band at negative frequencies contributes equally.

If $x(t)$ has a non-zero mean, $x(t) = \overline{x} + r(t)$ with $\overline{r(t)} = 0$:

$$R_{xx}(\tau) = \overline{x(t)x(t+\tau)} = \overline{\left[\overline{x} + r(t)\right]\left[\overline{x} + r(t+\tau)\right]} = \overline{x}^2 + R_{rr}(\tau)$$

Corresponding to the $\overline{x}^2$ term, the PSD for $x(t)$ will have an additive term

$$\int_{-\infty}^{\infty} \overline{x}^2\,e^{-j\omega\tau}\,d\tau = 2\pi\,\overline{x}^2\,\delta(\omega)$$
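For illustration only (not in the notes), a finite-record spectral estimate shows this additive term as power concentrated in the zero-frequency bin; the mean value and noise level below are arbitrary assumptions.

```python
# Minimal sketch (not from the notes): a non-zero mean appears as power piled up
# at zero frequency in a spectral estimate. Mean and noise level are assumed.
import numpy as np

rng = np.random.default_rng(5)
n, dt = 65536, 0.01
xbar = 2.0
x = xbar + rng.standard_normal(n)             # x(t) = xbar + r(t), with r zero mean

X = np.fft.rfft(x) * dt                       # finite-record Fourier transform
periodogram = np.abs(X) ** 2 / (n * dt)       # crude PSD estimate (per Hz)

# The zero-frequency bin dwarfs every other bin, and it grows with record length:
# the finite-record signature of the 2*pi*xbar^2*delta(w) term in the PSD.
print(periodogram[0], np.median(periodogram[1:]))
```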
