REVIEW

1 Review

Conditional pdf

Let $(Y_1, \dots, Y_N)$ have joint pdf $f(y_1, \dots, y_N)$. Let $f(y_{J+1}, \dots, y_N)$ be the marginal pdf of $(Y_{J+1}, \dots, Y_N)$. The conditional pdf of $Y_1, \dots, Y_J$ given $Y_{J+1}, \dots, Y_N$ is defined by
$$
f(y_1, \dots, y_J \mid y_{J+1}, \dots, y_N) = \frac{f(y_1, \dots, y_J, y_{J+1}, \dots, y_N)}{f(y_{J+1}, \dots, y_N)}
\quad \text{for } f(y_{J+1}, \dots, y_N) > 0.
$$

Example 1. Let $Y_1$ and $Y_2$ be discrete r.v.s with bivariate pdf
$$
f(y_1, y_2) =
\begin{cases}
(y_1 + y_2)/9, & y_1 = 0, 1, 2 \text{ and } y_2 = 0, 1, \\
0, & \text{otherwise.}
\end{cases}
$$
The marginal pdf of $Y_1$ is
$$
f(y_1) = \sum_{y_2=0}^{1} f(y_1, y_2) = \frac{y_1}{9} + \frac{y_1 + 1}{9} = \frac{2y_1 + 1}{9}, \quad y_1 = 0, 1, 2,
$$
and $0$ otherwise. The conditional pdf of $Y_2$ given $Y_1 = y_1$ is
$$
f(y_2 \mid y_1) = \frac{f(y_1, y_2)}{f(y_1)} = \frac{y_1 + y_2}{2y_1 + 1}, \quad y_2 = 0, 1,
$$
and $0$ otherwise.

Example 2. Let $Y_1$ and $Y_2$ be continuous r.v.s with bivariate pdf
$$
f(y_1, y_2) =
\begin{cases}
2, & \text{if } 0 < y_1 < y_2 < 1, \\
0, & \text{otherwise.}
\end{cases}
$$
Then
$$
f(y_1) = \int_{y_1}^{1} 2 \, dy_2 = 2(1 - y_1), \quad 0 < y_1 < 1,
$$
and $0$ otherwise, and
$$
f(y_2 \mid y_1) = \frac{1}{1 - y_1}, \quad \text{if } 0 < y_1 < y_2 < 1,
$$
and $0$ otherwise (a function of $y_1$).
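As a quick sanity check on Example 1, a short Python sketch (the variable names are illustrative, and exact rational arithmetic via `fractions` is an implementation choice, not part of the notes) can tabulate the joint pdf and verify the marginal and conditional formulas:

```python
from fractions import Fraction

# Example 1: joint pdf f(y1, y2) = (y1 + y2)/9 on y1 in {0,1,2}, y2 in {0,1}.
joint = {(y1, y2): Fraction(y1 + y2, 9) for y1 in range(3) for y2 in range(2)}

# Marginal pdf of Y1: sum the joint pdf over y2.
marginal_y1 = {y1: sum(joint[(y1, y2)] for y2 in range(2)) for y1 in range(3)}

# Conditional pdf of Y2 given Y1 = y1: joint divided by marginal.
conditional = {(y2, y1): joint[(y1, y2)] / marginal_y1[y1]
               for y1 in range(3) for y2 in range(2)}

# Check against the closed forms derived above.
assert all(marginal_y1[y1] == Fraction(2 * y1 + 1, 9) for y1 in range(3))
assert all(conditional[(y2, y1)] == Fraction(y1 + y2, 2 * y1 + 1)
           for y1 in range(3) for y2 in range(2))
```

The exact fractions confirm, for instance, that $f(y_2 = 1 \mid y_1 = 0) = 1$, as the formula $(y_1 + y_2)/(2y_1 + 1)$ predicts.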
2 Conditional expectation

Let $Y = (Y_1, \dots, Y_N)'$ be an $N$-dimensional random variable, let $g(Y)$ be a given function, and let $f(y_1, \dots, y_J \mid y_{J+1}, \dots, y_N)$ be a given conditional probability function. The conditional expectation of $g(Y)$, given $Y_{J+1} = y_{J+1}, \dots, Y_N = y_N$, is
$$
E[g(Y) \mid Y_{J+1} = y_{J+1}, \dots, Y_N = y_N]
= \int_{-\infty}^{\infty} \cdots \int_{-\infty}^{\infty} g(y_1, \dots, y_N) f(y_1, \dots, y_J \mid y_{J+1}, \dots, y_N) \, dy_1 \cdots dy_J.
$$
This is a nonstochastic function of $y_{J+1}, \dots, y_N$. In contrast,
$$
E[g(Y) \mid Y_{J+1}, \dots, Y_N] = h(Y_{J+1}, \dots, Y_N)
$$
is a r.v. that takes on realized values $E[g(Y) \mid Y_{J+1} = y_{J+1}, \dots, Y_N = y_N]$.

Example 3 (continued). With
$$
f(y_1, y_2) =
\begin{cases}
2, & \text{if } 0 < y_1 < y_2 < 1, \\
0, & \text{otherwise,}
\end{cases}
$$
we have
$$
E(Y_2 \mid Y_1 = y_1) = \int_{y_1}^{1} y_2 \, \frac{1}{1 - y_1} \, dy_2 = \frac{1}{1 - y_1} \cdot \frac{1}{2}\left(1 - y_1^2\right) = \frac{1 + y_1}{2} \quad \text{(a function of } y_1\text{)}
$$
and
$$
E(Y_2 \mid Y_1) = \frac{1 + Y_1}{2} \quad \text{(a random variable).}
$$

The expectation of $g(Y)$ is given by iterated expectations:
$$
E\,g(Y) = E_{Y_{J+1}, \dots, Y_N}\!\left[ E_{Y_1, \dots, Y_J \mid Y_{J+1}, \dots, Y_N}\left[\, g(Y) \mid Y_{J+1}, \dots, Y_N \right] \right].
$$

Example 4 (continued).
$$
E(Y_2) = E_{Y_1}\!\left[\frac{1 + Y_1}{2}\right] = \frac{1}{2}\left(1 + E_{Y_1}(Y_1)\right) = \frac{1}{2}\left(1 + \int_0^1 y_1 \cdot 2(1 - y_1) \, dy_1\right) = \frac{1}{2}\left(1 + \frac{1}{3}\right) = \frac{2}{3}.
$$
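Examples 3 and 4 can be checked numerically. The sketch below (a midpoint-rule integration; the function names, step counts, and test point $y_1 = 0.3$ are illustrative choices, not from the notes) reproduces both the conditional mean $(1 + y_1)/2$ and the iterated expectation $E(Y_2) = 2/3$:

```python
# Numerical check of Examples 3 and 4: joint density f(y1, y2) = 2 on 0 < y1 < y2 < 1.

def cond_mean_y2(y1, steps=2_000):
    """Midpoint-rule approximation of E(Y2 | Y1 = y1) = integral of y2/(1-y1) over (y1, 1)."""
    h = (1 - y1) / steps
    return sum((y1 + (k + 0.5) * h) / (1 - y1) for k in range(steps)) * h

# Matches the closed form (1 + y1)/2 derived above.
assert abs(cond_mean_y2(0.3) - (1 + 0.3) / 2) < 1e-9

# Iterated expectation: E(Y2) = E[(1 + Y1)/2] with marginal f(y1) = 2(1 - y1).
steps = 20_000
h = 1 / steps
e_y2 = sum(((1 + (k + 0.5) * h) / 2) * 2 * (1 - (k + 0.5) * h)
           for k in range(steps)) * h
assert abs(e_y2 - 2 / 3) < 1e-8
```

The midpoint rule is exact for the linear inner integrand, so the first check is tight up to floating-point error; the outer integrand is quadratic, so a fine grid keeps the error well below the tolerance.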
3 Stochastic convergences

Markov's inequality. Assume $g(y) \ge 0$ for all $y \in \mathbb{R}$. Then
$$
P[g(Y) \ge c] \le \frac{E[g(Y)]}{c}.
$$
Proof. Assume that $Y$ is a continuous r.v. with pdf $f(\cdot)$. Define
$$
A_1 = \{ y \mid g(y) \ge c \} \quad \text{and} \quad A_2 = \{ y \mid g(y) < c \}.
$$
Then
$$
E[g(Y)] = \int_{A_1} g(y) f(y)\, dy + \int_{A_2} g(y) f(y)\, dy \ge \int_{A_1} g(y) f(y)\, dy \ge \int_{A_1} c f(y)\, dy = c\, P[g(Y) \ge c].
$$
The stated result follows from this.

Chebyshev's inequality. Let $Y$ be a r.v. with $E(Y) = \mu$ and $\mathrm{Var}(Y) = \sigma^2$. Then
$$
P[|Y - \mu| \ge r] \le \frac{\sigma^2}{r^2}.
$$
Proof.
$$
P[|Y - \mu| \ge r] = P\left[(Y - \mu)^2 \ge r^2\right] \le \frac{E\left[(Y - \mu)^2\right]}{r^2} = \frac{\sigma^2}{r^2}.
$$

Example 5. Let
$$
Z_n =
\begin{cases}
c & \text{with probability } 1 - \frac{1}{n}, \\
n & \text{with probability } \frac{1}{n}.
\end{cases}
$$
For any $\varepsilon > 0$ and $n$ large enough,
$$
P[|Z_n - c| \ge \varepsilon] = P[Z_n = n] = \frac{1}{n} \to 0.
$$
Thus $Z_n \xrightarrow{P} c$.
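Both inequalities are easy to probe by simulation. The sketch below (a Monte Carlo check; the choice of a Uniform(0, 1) variable, the seed, and the sample size are illustrative assumptions, not from the notes) verifies the Chebyshev bound empirically and traces the vanishing probability in Example 5:

```python
import random

# Empirical check of Chebyshev's inequality for Y ~ Uniform(0, 1):
# mu = 1/2, sigma^2 = 1/12, so P(|Y - 1/2| >= r) <= (1/12)/r^2.
random.seed(0)
N = 100_000
sample = [random.random() for _ in range(N)]
for r in (0.2, 0.3, 0.4):
    freq = sum(abs(y - 0.5) >= r for y in sample) / N
    assert freq <= (1 / 12) / r**2  # empirical frequency respects the bound

# Example 5: P(|Z_n - c| >= eps) = P(Z_n = n) = 1/n, which vanishes as n grows.
probs = [1 / n for n in (10, 100, 1000)]
assert probs == sorted(probs, reverse=True) and probs[-1] == 0.001
```

For Uniform(0, 1) the exact tail $P(|Y - 1/2| \ge r) = 1 - 2r$ sits well below the Chebyshev bound $\sigma^2/r^2$, which illustrates that the inequality is valid but often loose.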
Example 6. Let
$$
Z_i =
\begin{cases}
1 & \text{with probability } \frac{1}{2}, \\
0 & \text{with probability } \frac{1}{2},
\end{cases}
$$
with $Z_1, \dots, Z_n$ independent, and let $\bar{Z}_n = \frac{1}{n}\sum_{i=1}^n Z_i$. Then, by Chebyshev's inequality,
$$
P\left[\left|\bar{Z}_n - \frac{1}{2}\right| > \varepsilon\right]
\le \frac{E\left[\left(\frac{1}{n}\sum_i Z_i - \frac{1}{2}\right)^2\right]}{\varepsilon^2}
= \frac{1}{n^2 \varepsilon^2} \sum_{i=1}^n E\left[\left(Z_i - \frac{1}{2}\right)^2\right]
= \frac{1}{n^2 \varepsilon^2} \times n \times \frac{1}{2}\left(1 - \frac{1}{2}\right) \to 0.
$$
Thus $\bar{Z}_n \xrightarrow{P} \frac{1}{2}$.

Example 7. Let
$$
Z_n =
\begin{cases}
0 & \text{with probability } 1 - \frac{1}{n}, \\
n^2 & \text{with probability } \frac{1}{n}.
\end{cases}
$$
Then
$$
P(|Z_n| > \varepsilon) = P\left(Z_n = n^2\right) = \frac{1}{n} \to 0,
$$
so $Z_n \xrightarrow{P} 0$. But
$$
E(Z_n) = 0 \times \left(1 - \frac{1}{n}\right) + n^2 \times \frac{1}{n} = n \to \infty.
$$
So we have an example showing that convergence in probability does not imply mean square convergence.
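A short simulation makes both examples concrete. Below is a sketch (the seed, sample size, number of trials, and helper names are arbitrary choices for illustration): the empirical tail frequency for $\bar{Z}_n$ respects the Chebyshev bound $1/(4n\varepsilon^2)$, while Example 7's mean is computed exactly with rational arithmetic:

```python
import random
from fractions import Fraction

# Example 6 as a simulation.
random.seed(1)

def sample_mean(n):
    """Mean of n iid Bernoulli(1/2) draws."""
    return sum(random.random() < 0.5 for _ in range(n)) / n

# Chebyshev gives P(|Zbar_n - 1/2| > eps) <= 1/(4 n eps^2); check empirically.
eps, n, trials = 0.05, 2_000, 500
freq = sum(abs(sample_mean(n) - 0.5) > eps for _ in range(trials)) / trials
assert freq <= 1 / (4 * n * eps**2)  # the bound equals 0.05 here

# Example 7: E(Z_n) = 0*(1 - 1/n) + n^2*(1/n) = n, computed exactly.
def mean_zn(n):
    return 0 * (1 - Fraction(1, n)) + n**2 * Fraction(1, n)

assert [mean_zn(n) for n in (10, 100, 1000)] == [10, 100, 1000]
```

At $n = 2000$ the standard deviation of $\bar{Z}_n$ is about $0.011$, so deviations beyond $\varepsilon = 0.05$ are rare and the observed frequency sits far inside the bound.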
Example 8. Let
$$
Z =
\begin{cases}
0 & \text{with probability } \frac{1}{2}, \\
1 & \text{with probability } \frac{1}{2},
\end{cases}
$$
and define $Z_n = 1 - Z$ for all $n$. Since all the $Z_n$ have the same distribution as $Z$, $Z_n \xrightarrow{d} Z$. However,
$$
|Z_n - Z| = |1 - 2Z| = 1
$$
regardless of the value of $Z$. So $Z_n$ does not converge in probability to $Z$:
$$
P(|Z_n - Z| > \varepsilon) = P(|Z_n - Z| = 1) = 1 \quad \text{for } 0 < \varepsilon < 1.
$$
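Example 8 can also be seen in simulation. The sketch below (seed and sample size are arbitrary illustrative choices) draws $Z$ and forms $Z_n = 1 - Z$: the two empirical distributions agree, yet every realized gap $|Z_n - Z|$ equals 1:

```python
import random

# Example 8: Z_n = 1 - Z has the same distribution as Z ~ Bernoulli(1/2),
# yet |Z_n - Z| = 1 on every draw.
random.seed(2)
z = [random.randint(0, 1) for _ in range(10_000)]
zn = [1 - zi for zi in z]

# Same marginal distribution (empirical frequencies of 1 agree near 1/2),
# which is all that convergence in distribution looks at.
assert abs(sum(z) / len(z) - 0.5) < 0.02
assert abs(sum(zn) / len(zn) - 0.5) < 0.02

# But the joint behavior never brings Z_n within eps < 1 of Z:
# no convergence in probability.
assert all(abs(a - b) == 1 for a, b in zip(zn, z))
```

This is the standard illustration that convergence in distribution concerns only the marginal laws, while convergence in probability constrains the joint realizations.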