Linearity of Expectation

3.1 Computing Expectation Using Indicators

The proofs in this chapter are based on the following lemma:

3.1.1 Lemma. The expectation is a linear operator; i.e., for any two random variables X, Y and constants α, β ∈ R:

    E[αX + βY] = αE[X] + βE[Y].

Proof.

    E[αX + βY] = ∫_Ω (αX + βY) dP = α ∫_Ω X dP + β ∫_Ω Y dP = αE[X] + βE[Y].  ✷

This implies that the expectation of a sum of random variables X = X_1 + X_2 + · · · + X_n is equal to

    E[X] = E[X_1] + E[X_2] + · · · + E[X_n].

This fact is elementary, yet powerful, since there is no restriction whatsoever on the properties of the X_i, their dependence or independence.

3.1.2 Definition (Indicator variables). For an event A, we define the indicator variable I_A:

• I_A(ω) = 1 if ω ∈ A, and
• I_A(ω) = 0 if ω ∉ A.

3.1.3 Lemma. For any event A, we have E[I_A] = P[A].

Proof.

    E[I_A] = ∫_Ω I_A(ω) dP = ∫_A dP = P[A].  ✷

In many cases, the expectation of a variable can be calculated by expressing it as a sum of indicator variables X = I_{A_1} + I_{A_2} + · · · + I_{A_n}.
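The indicator technique above can be checked numerically on a classic instance not worked out in this section: the number of fixed points of a uniformly random permutation of {1, …, n}. Writing X = I_{A_1} + · · · + I_{A_n}, where A_i is the event "position i is a fixed point", linearity gives E[X] = Σ P[A_i] = n · (1/n) = 1, regardless of the (strong) dependence between the A_i. A minimal sketch verifying this by exact enumeration (the function name is ours, chosen for illustration):

```python
from itertools import permutations
from fractions import Fraction

def expected_fixed_points(n):
    """Exact E[X] for X = number of fixed points of a uniform
    random permutation of range(n), by brute-force enumeration.

    Linearity of expectation predicts the answer is 1 for every n:
    X = I_{A_1} + ... + I_{A_n} with P[A_i] = 1/n, even though the
    indicators are dependent.
    """
    perms = list(permutations(range(n)))
    # Total fixed points summed over all n! permutations.
    total = sum(sum(1 for i, v in enumerate(p) if i == v) for p in perms)
    # Exact rational average, avoiding floating-point error.
    return Fraction(total, len(perms))

print(expected_fixed_points(4))  # 1
print(expected_fixed_points(5))  # 1
```

Note that computing E[X] directly from the distribution of X would require counting permutations with exactly k fixed points for each k; the indicator decomposition sidesteps that entirely.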