FIGURE 73.3 A noise process. [Figure shows a realization N(t) = n(t) versus t, with the probability density function at each time t drawn coming out of the page in a third dimension.]
© 2000 by CRC Press LLC

When the first- and second-order statistics do not change over time, we call the noise a weakly (or wide-sense) stationary process. This means that: (1) E[N(t)] = μt = μ is constant for all t, and (2) RNN(t, t + τ) = E[N(t)N(t + τ)] = E[N(0)N(τ)] = RNN(τ) for all t [see Brown, 1983, p. 82; Gardner, 1990, p. 108; or Peebles, 1987, p. 153 for properties of RNN(τ)]. In this case the autocorrelation function depends only on the offset τ. We assume hereafter that μ = 0 (we can subtract μ, which does not change the autocorrelation). When τ = 0, RNN(0) = E[N(t)N(t + 0)] = E[(N(t))²] = σN², which is the fixed variance of each random variable Nt for all t. Weakly stationary (ws) processes are the most commonly encountered cases and are the ones considered here. Evolutionary processes have statistics that change over time and are difficult to analyze. Figure 73.3 shows a realization of a noise process N(t), where at any particular time t, the probability density function is shown coming out of the page in a third dimension. For a ws noise, the distributions are the same for each t. The most mathematically tractable noises are Gaussian ws processes, where at each time t the probability distribution for the random variable Nt = N(t) is Gaussian (also called normal).
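As a quick numerical illustration of these definitions (a sketch, not from the text), one can generate a single realization of zero-mean white Gaussian noise and estimate RNN(τ) from time averages: the lag-0 estimate should approach the variance σN², and, because white-noise samples are independent, the estimate at any nonzero lag should approach 0. The sample size and σ = 2 below are illustrative assumptions.

```python
import random

def autocorr(x, lag):
    """Biased sample estimate of R_NN(lag) = E[N(t) N(t+lag)]
    for a zero-mean, weakly stationary sequence x."""
    n = len(x)
    return sum(x[t] * x[t + lag] for t in range(n - lag)) / n

random.seed(0)
sigma = 2.0  # assumed standard deviation for this illustration

# One realization of zero-mean white Gaussian noise:
# R_NN(0) = sigma^2, and R_NN(tau) ~ 0 for tau != 0.
x = [random.gauss(0.0, sigma) for _ in range(200_000)]

r0 = autocorr(x, 0)  # should approach sigma^2 = 4
r5 = autocorr(x, 5)  # should approach 0
```

For a ws process the time average over one long realization stands in for the ensemble average E[·]; that substitution is exact only for ergodic processes, which white Gaussian noise is.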
The first- and second-order statistics completely determine Gaussian distributions, so weak stationarity makes Gaussian statistics of all orders stationary over time as well. It is well known [see Brown, 1983, p. 39] that linear transformations of Gaussian random variables are also Gaussian random variables. The probability density function for a Gaussian random variable Nt is fN(x) = {1/[2πσN²]^(1/2)} exp[−(x − μN)²/(2σN²)], which is the familiar bell-shaped curve centered on x = μN. The standard Gaussian probability table [Peebles, 1987, p. 314] is useful, e.g., Pr[−σN < Nt < σN] = 2 Pr[0 < Nt < σN] = 2(0.3413) = 0.6826 from the table.

Noise Power

The noise signal N(t) represents voltage, so the autocorrelation function at offset 0, RNN(0) = E[N(t)N(t)], represents expected power in volts squared, or watts per ohm. When R = 1 Ω, then N(t)N(t) = N(t)[N(t)/R] = N(t)I(t) volt-amperes = watts (where I(t) is the current in a 1-Ω resistor). The Fourier transform F[RNN(τ)] of the autocorrelation function RNN(τ) is the power spectrum, called the power spectral density function (psdf), SNN(ω) in W/(rad/s). Then

    SNN(ω) = ∫ from −∞ to ∞ of RNN(τ) e^(−jωτ) dτ = F[RNN(τ)]
                                                                    (73.20)
    RNN(τ) = (1/2π) ∫ from −∞ to ∞ of SNN(ω) e^(jωτ) dω = F⁻¹[SNN(ω)]
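The transform pair in Eq. (73.20) has a discrete analogue that is easy to check numerically: the DFT of a (circularly) even autocorrelation sequence is a real psdf, the inverse DFT recovers the autocorrelation, and the average of the psdf samples equals RNN(0), the average power. A minimal stdlib sketch, using an illustrative autocorrelation sequence that is an assumption, not from the text:

```python
import cmath

def dft(x):
    """Forward DFT: discrete analogue of S_NN(w) = F[R_NN(tau)]."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                for t in range(n)) for k in range(n)]

def idft(X):
    """Inverse DFT: discrete analogue of R_NN(tau) = F^-1[S_NN(w)],
    including the 1/n normalization (the 1/2pi role in Eq. 73.20)."""
    n = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * t / n)
                for k in range(n)) / n for t in range(n)]

# Toy autocorrelation, circularly even (r[k] = r[n-k]), peaked at lag 0,
# as one would get from low-pass-filtered noise; values are illustrative.
r = [4.0, 2.0, 0.5, 0.0, 0.0, 0.0, 0.5, 2.0]

S = dft(r)        # psdf samples: real (and nonnegative for this r)
r_back = idft(S)  # inverse transform recovers the autocorrelation
```

Since the sequence is even, every S[k] comes out real, and the mean of the psdf samples equals r[0] = 4, the power at lag 0, mirroring RNN(0) = (1/2π) ∫ SNN(ω) dω in the continuous case.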