Law of Iterated Expectations

Theorem 1 (Law of Iterated Expectations).
$$E[y] = E_x\big[E[y \mid x]\big]$$
The notation $E_x[\cdot]$ indicates the expectation over the values of $x$.

Example 1.
$$y = bx + \varepsilon, \qquad \varepsilon \sim N(0,1), \qquad x \sim U[0,1],$$
where $\varepsilon$ and $x$ are independent.
$$E[y \mid x] = E[bx + \varepsilon \mid x] = E[bx \mid x] + E[\varepsilon \mid x]$$
Since $\varepsilon$ and $x$ are independent, $E[\varepsilon \mid x] = E[\varepsilon] = 0$, so
$$E[y \mid x] = bx.$$
Now, using the law of iterated expectations and $E[x] = \tfrac{1}{2}$,
$$E[y] = E_x\big[E[y \mid x]\big] = E_x[bx] = b\,E_x[x] = \tfrac{1}{2}\,b.$$
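The conclusion of Example 1 is easy to check by simulation. The sketch below is an illustration added to these notes: it fixes an arbitrary value $b = 2$ and compares the sample mean of $y$ with $b\,E[x] = b/2$.

```python
import numpy as np

rng = np.random.default_rng(seed=0)
b = 2.0                 # illustrative slope; any value works
n = 1_000_000

x = rng.uniform(0.0, 1.0, size=n)    # x ~ U[0, 1]
eps = rng.standard_normal(size=n)    # eps ~ N(0, 1), independent of x
y = b * x + eps

# Law of iterated expectations: E[y] = E_x[E[y|x]] = E_x[b x] = b * E[x] = b/2
print(y.mean())          # approximately 1.0
print(b * x.mean())      # approximately 1.0
```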
Matrix Algebra (Continued)

Definition 1 (Idempotent Matrix). An idempotent matrix is one that is equal to its square, that is, $M^2 = MM = M$.

Example 2. The identity matrix $I$:
$$I^2 = I \cdot I = I$$
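As a quick numerical companion to Definition 1 (a sketch added here, not part of the notes), the snippet below checks idempotency for the identity matrix of Example 2 and for one further illustrative matrix, a diagonal matrix with 0/1 diagonal entries.

```python
import numpy as np

def is_idempotent(M: np.ndarray) -> bool:
    """True if M @ M equals M up to floating-point tolerance."""
    return np.allclose(M @ M, M)

I = np.eye(3)                     # Example 2: the identity matrix
D = np.diag([1.0, 0.0, 1.0])      # illustrative 0/1 diagonal matrix, also idempotent
G = np.array([[1.0, 2.0],
              [3.0, 4.0]])        # a generic matrix, not idempotent

print(is_idempotent(I))   # True
print(is_idempotent(D))   # True
print(is_idempotent(G))   # False
```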
Definition 2 (Linear Dependence). A set of vectors is linearly dependent if any one of the vectors in the set can be written as a linear combination of the others.

Example 3.
$$a = \begin{bmatrix} 1 \\ 2 \end{bmatrix}, \qquad b = \begin{bmatrix} 3 \\ 3 \end{bmatrix}, \qquad c = \begin{bmatrix} 10 \\ 14 \end{bmatrix}, \qquad 2a + b - \tfrac{1}{2}c = 0$$

Example 4 (Geometric Interpretation). [Figure: two panels in the $(x, y)$ plane, each showing vectors $v_1$ and $v_2$; in the first panel $v_1$ and $v_2$ are independent, in the second they are not independent.]

Definition 3 (Column Space). The column space of a matrix is the vector space that is spanned by its column vectors.

Example 5.
$$A = \begin{bmatrix} 1 & 3 \\ 2 & 4 \end{bmatrix}$$
Its columns span the $\mathbb{R}^2$ space.

Definition 4 (Column Rank). The column rank of a matrix is the dimension of the vector space that is spanned by its columns.

Example 6.
$$B = \begin{bmatrix} 1 & 2 & 3 & 2 \\ 5 & 1 & 5 & 7 \\ 6 & 4 & 5 & 7 \\ 3 & 1 & 4 & 1 \end{bmatrix}$$
The columns of $B$ span the $\mathbb{R}^4$ space.
$$C = \begin{bmatrix} 1 & 5 & 6 & 3 \\ 2 & 1 & 4 & 1 \\ 3 & 5 & 5 & 4 \end{bmatrix}$$
The columns of $C$ span the $\mathbb{R}^3$ space.
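The claims in Examples 3, 5, and 6 can be verified numerically by computing ranks; the sketch below is an illustration added to the notes, using `numpy.linalg.matrix_rank`.

```python
import numpy as np

# Example 3: 2a + b - (1/2)c = 0, so {a, b, c} is linearly dependent
a = np.array([1.0, 2.0])
b = np.array([3.0, 3.0])
c = np.array([10.0, 14.0])
print(2 * a + b - 0.5 * c)                                   # [0. 0.]
print(np.linalg.matrix_rank(np.column_stack([a, b, c])))     # 2, fewer than 3 vectors

# Example 5: A has rank 2, so its columns span R^2
A = np.array([[1.0, 3.0],
              [2.0, 4.0]])
print(np.linalg.matrix_rank(A))   # 2

# Example 6: B has rank 4 (columns span R^4); C has rank 3 (columns span R^3)
B = np.array([[1, 2, 3, 2],
              [5, 1, 5, 7],
              [6, 4, 5, 7],
              [3, 1, 4, 1]], dtype=float)
C = np.array([[1, 5, 6, 3],
              [2, 1, 4, 1],
              [3, 5, 5, 4]], dtype=float)
print(np.linalg.matrix_rank(B), np.linalg.matrix_rank(C))    # 4 3
```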
Theorem 2 (Equality of Row and Column Rank). The column rank and row rank of a matrix are equal. By the definition of row rank and its counterpart for column rank, the row space and column space of a matrix have the same dimension.

Theorem 3.
$$\operatorname{rank}(AB) \le \min\big(\operatorname{rank}(A), \operatorname{rank}(B)\big)$$

Theorem 4. For any matrix $A$ and nonsingular matrices $B$ and $C$, the rank of $BAC$ is equal to the rank of $A$. (The meaning of nonsingular matrices will be introduced later.)

Theorem 5.
$$\operatorname{rank}(A) = \operatorname{rank}(A'A)$$

Definition 5 (Determinant of a Matrix). For an $n \times n$ (square) matrix, the determinant $\det A = |A|$ is a scalar; for a $2 \times 2$ matrix it can be interpreted as the (signed) area of the parallelogram spanned by the columns.

Example 7.
$$A = \begin{bmatrix} 4 & 2 \\ 1 & 3 \end{bmatrix}, \qquad \det A = |A| = 4 \times 3 - 2 \times 1 = 10$$

Proposition 1. The determinant of a matrix is nonzero if and only if it has full rank, that is, $\operatorname{rank}(A) = \dim(A)$.

Definition 6 (Inverse of a Matrix). Suppose that we can find a square matrix $B$ such that $BA = I$. Then $B$ is the inverse of $A$, denoted $B = A^{-1}$.

Example 8.
$$A = \begin{bmatrix} 4 & 2 \\ 1 & 3 \end{bmatrix}, \qquad B = A^{-1} = \begin{bmatrix} \tfrac{3}{10} & -\tfrac{1}{5} \\ -\tfrac{1}{10} & \tfrac{2}{5} \end{bmatrix}$$

Definition 7 (Nonsingular Matrix). A matrix whose inverse exists is nonsingular.
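The determinant and inverse in Examples 7 and 8, together with Theorems 3 and 5 and Proposition 1, can be checked numerically. This is a sketch added for illustration; the rank-1 matrix used for Theorem 3 is an arbitrary choice.

```python
import numpy as np

A = np.array([[4.0, 2.0],
              [1.0, 3.0]])

# Example 7: det A = 10; Example 8: the inverse of A
print(np.linalg.det(A))      # 10.0
print(np.linalg.inv(A))      # [[ 0.3 -0.2]
                             #  [-0.1  0.4]]

# Proposition 1: nonzero determinant <=> full rank
print(np.linalg.matrix_rank(A) == A.shape[0])    # True

# Theorem 5: rank(A) = rank(A'A)
print(np.linalg.matrix_rank(A.T @ A))            # 2

# Theorem 3: rank(AB) <= min(rank A, rank B), using an illustrative rank-1 matrix
B1 = np.outer([1.0, 2.0], [3.0, 4.0])            # rank 1
print(np.linalg.matrix_rank(A @ B1))             # 1
```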
Partitioned Matrices
$$A = \begin{bmatrix} 1 & 4 & 5 \\ 2 & 9 & 3 \\ 8 & 9 & 6 \end{bmatrix} = \begin{bmatrix} A_{11} & A_{12} \\ A_{21} & A_{22} \end{bmatrix}$$

Block diagonal matrix:
$$A = \begin{bmatrix} A_{11} & 0 \\ 0 & A_{22} \end{bmatrix}$$

Addition and Multiplication of Partitioned Matrices
$$A = \begin{bmatrix} A_{11} & A_{12} \\ A_{21} & A_{22} \end{bmatrix}, \qquad B = \begin{bmatrix} B_{11} & B_{12} \\ B_{21} & B_{22} \end{bmatrix}$$
$$A + B = \begin{bmatrix} A_{11} + B_{11} & A_{12} + B_{12} \\ A_{21} + B_{21} & A_{22} + B_{22} \end{bmatrix}$$
$$AB = \begin{bmatrix} A_{11}B_{11} + A_{12}B_{21} & A_{11}B_{12} + A_{12}B_{22} \\ A_{21}B_{11} + A_{22}B_{21} & A_{21}B_{12} + A_{22}B_{22} \end{bmatrix}$$

Determinants of Partitioned Matrices
In general, for a $2 \times 2$ partitioned matrix,
$$\begin{vmatrix} A_{11} & A_{12} \\ A_{21} & A_{22} \end{vmatrix} = |A_{11}| \cdot \big|A_{22} - A_{21}A_{11}^{-1}A_{12}\big| = |A_{22}| \cdot \big|A_{11} - A_{12}A_{22}^{-1}A_{21}\big|$$
For a block diagonal matrix,
$$\begin{vmatrix} A & 0 \\ 0 & B \end{vmatrix} = |A| \cdot |B|$$

Inverses of Partitioned Matrices
In general, for a $2 \times 2$ partitioned matrix,
$$\begin{bmatrix} A_{11} & A_{12} \\ A_{21} & A_{22} \end{bmatrix}^{-1} = \begin{bmatrix} A_{11}^{-1}\big(I + A_{12}F_2A_{21}A_{11}^{-1}\big) & -A_{11}^{-1}A_{12}F_2 \\ -F_2A_{21}A_{11}^{-1} & F_2 \end{bmatrix}$$
where $F_2 = \big(A_{22} - A_{21}A_{11}^{-1}A_{12}\big)^{-1}$.
For a block diagonal matrix,
$$\begin{bmatrix} A_{11} & 0 \\ 0 & A_{22} \end{bmatrix}^{-1} = \begin{bmatrix} A_{11}^{-1} & 0 \\ 0 & A_{22}^{-1} \end{bmatrix}$$
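The partitioned-inverse and partitioned-determinant formulas can be verified on a small example. The $4 \times 4$ matrix below is an arbitrary illustrative choice (symmetric and strictly diagonally dominant, hence invertible), not one taken from the notes.

```python
import numpy as np

# Arbitrary invertible 4x4 matrix, partitioned into 2x2 blocks
M = np.array([[4.0, 1.0, 2.0, 0.0],
              [1.0, 3.0, 0.0, 1.0],
              [2.0, 0.0, 5.0, 1.0],
              [0.0, 1.0, 1.0, 3.0]])
A11, A12 = M[:2, :2], M[:2, 2:]
A21, A22 = M[2:, :2], M[2:, 2:]

A11_inv = np.linalg.inv(A11)
F2 = np.linalg.inv(A22 - A21 @ A11_inv @ A12)    # F2 = (A22 - A21 A11^{-1} A12)^{-1}

# Partitioned inverse, built block by block from the formula above
M_inv = np.block([
    [A11_inv @ (np.eye(2) + A12 @ F2 @ A21 @ A11_inv), -A11_inv @ A12 @ F2],
    [-F2 @ A21 @ A11_inv,                               F2],
])
print(np.allclose(M_inv, np.linalg.inv(M)))      # True

# Partitioned determinant: |M| = |A11| * |A22 - A21 A11^{-1} A12|
lhs = np.linalg.det(M)
rhs = np.linalg.det(A11) * np.linalg.det(A22 - A21 @ A11_inv @ A12)
print(np.isclose(lhs, rhs))                      # True
```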
Kronecker Products

Definition 8. For two matrices $A$ and $B$, their Kronecker product is
$$A \otimes B = \begin{bmatrix} a_{11}B & a_{12}B & \cdots & a_{1K}B \\ a_{21}B & a_{22}B & \cdots & a_{2K}B \\ \vdots & \vdots & & \vdots \\ a_{n1}B & a_{n2}B & \cdots & a_{nK}B \end{bmatrix}$$
Note: For any matrices $A_{K \times L}$ and $B_{m \times n}$, the Kronecker product $A \otimes B$ has dimension $(Km) \times (Ln)$. No conformability is required.

Example 9.
$$A = \begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix}, \qquad B = \begin{bmatrix} 5 & 7 \\ 6 & 8 \end{bmatrix}$$
$$A \otimes B = \begin{bmatrix} 1 \cdot B & 2 \cdot B \\ 3 \cdot B & 4 \cdot B \end{bmatrix} = \begin{bmatrix} 5 & 7 & 10 & 14 \\ 6 & 8 & 12 & 16 \\ 15 & 21 & 20 & 28 \\ 18 & 24 & 24 & 32 \end{bmatrix}$$

Trace of a Matrix

Definition 9. The trace of a square $K \times K$ matrix is the sum of its diagonal elements:
$$\operatorname{tr}(A) = \sum_{i=1}^{K} a_{ii}$$

Example 10.
$$A = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix}, \quad \operatorname{tr}(A) = 3; \qquad B = \begin{bmatrix} 1 & 2 & 3 \\ 2 & 1 & 2 \\ 3 & 2 & 1 \end{bmatrix}, \quad \operatorname{tr}(B) = 3$$

Some identities:
$$\operatorname{tr}(cA) = c \cdot \operatorname{tr}(A)$$
$$\operatorname{tr}(A') = \operatorname{tr}(A)$$
$$\operatorname{tr}(A + B) = \operatorname{tr}(A) + \operatorname{tr}(B)$$
$$\operatorname{tr}(I_K) = K$$
$$\operatorname{tr}(AB) = \operatorname{tr}(BA)$$
$$\operatorname{tr}(ABCD) = \operatorname{tr}(BCDA) = \operatorname{tr}(CDAB) = \operatorname{tr}(DABC)$$
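A brief numerical companion to Example 9 and a few of the trace identities (an illustrative sketch, not part of the notes):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[5.0, 7.0],
              [6.0, 8.0]])

# Example 9: Kronecker product; dimensions are (2*2) x (2*2) = 4 x 4
print(np.kron(A, B))

# Some of the trace identities
print(np.trace(A.T) == np.trace(A))                      # tr(A') = tr(A)
print(np.isclose(np.trace(A @ B), np.trace(B @ A)))      # tr(AB) = tr(BA)
print(np.trace(np.eye(5)))                               # tr(I_5) = 5.0
```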
The Generalized Inverse of a Matrix

Definition 10. A generalized inverse of a matrix $A$ is another matrix $A^{+}$ that satisfies
1. $AA^{+}A = A$
2. $A^{+}AA^{+} = A^{+}$
3. $A^{+}A$ is symmetric
4. $AA^{+}$ is symmetric

Quadratic Forms and Definite Matrices

Many optimization problems involve double sums of the form
$$q = \sum_{i=1}^{n}\sum_{j=1}^{n} x_i x_j a_{ij}.$$
This quadratic form can be written as
$$q = x'Ax.$$
For a given matrix $A$,
• If $x'Ax > (<)\ 0$ for all nonzero $x$, then $A$ is positive (negative) definite.
• If $A$ is $n \times K$ with full column rank and $n > K$, then $A'A$ is positive definite and $AA'$ is nonnegative definite.
• If $A$ is positive definite and $B$ is a nonsingular matrix, then $B'AB$ is positive definite.
• If $A$ is symmetric and idempotent, $n \times n$ with rank $J$, then every quadratic form in $A$ can be written as $x'Ax = \sum_{i=1}^{J} y_i^2$.
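To illustrate Definition 10 and the second quadratic-form result, the sketch below uses numpy's Moore–Penrose pseudoinverse, which satisfies the four conditions listed above, together with an arbitrary $3 \times 2$ full-column-rank matrix. It is a check added for illustration, not part of the original notes.

```python
import numpy as np

# Illustrative n x K matrix with n > K and full column rank
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 7.0]])

# Definition 10: np.linalg.pinv returns a matrix satisfying conditions 1-4
A_plus = np.linalg.pinv(A)
print(np.allclose(A @ A_plus @ A, A))              # 1. A A+ A = A
print(np.allclose(A_plus @ A @ A_plus, A_plus))    # 2. A+ A A+ = A+
print(np.allclose((A_plus @ A).T, A_plus @ A))     # 3. A+ A symmetric
print(np.allclose((A @ A_plus).T, A @ A_plus))     # 4. A A+ symmetric

# Quadratic forms: A'A is positive definite here, so x'(A'A)x > 0 for nonzero x
rng = np.random.default_rng(seed=0)
x = rng.standard_normal(2)
print(float(x @ (A.T @ A) @ x) > 0.0)              # True
```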