CHAPTER 3. LEAST SQUARES METHODS FOR ESTIMATING β

We often write $Py = \hat{y}$. This is the part of $y$ that is explained by $X$. Properties of the matrices $P$ and $M$ are:

(i) $P' = P$, $P^2 = P$ (idempotent matrix)
(ii) $M' = M$, $M^2 = M$
(iii) $PX = X$, $MX = 0$
(iv) $PM = 0$

Using (1) and (iii), we have
$$X'e = X'My = 0.$$
If the first column of $X$ is $\mathbf{1} = (1, \cdots, 1)'$, this relation implies
$$\sum_{i=1}^{n} e_i = 0.$$
In addition, (iv) gives
$$y'y = y'P'Py + y'M'My = \hat{y}'\hat{y} + e'e.$$

3.2 Partitioned regression and partial regression

Consider
$$y = X\beta + \varepsilon = X_1\beta_1 + X_2\beta_2 + \varepsilon.$$
The normal equations for $b_1$ and $b_2$ are
$$\begin{pmatrix} X_1'X_1 & X_1'X_2 \\ X_2'X_1 & X_2'X_2 \end{pmatrix}\begin{pmatrix} b_1 \\ b_2 \end{pmatrix} = \begin{pmatrix} X_1'y \\ X_2'y \end{pmatrix}.$$
The first part of these equations is
$$(X_1'X_1)\,b_1 + (X_1'X_2)\,b_2 = X_1'y,$$
which gives
$$b_1 = (X_1'X_1)^{-1}X_1'y - (X_1'X_1)^{-1}X_1'X_2\,b_2 = (X_1'X_1)^{-1}X_1'(y - X_2 b_2).$$
Plug this into the second part of the normal equations. Then, we have
\begin{align*}
X_2'X_1 b_1 + X_2'X_2 b_2 &= X_2'X_1(X_1'X_1)^{-1}X_1'y - X_2'X_1(X_1'X_1)^{-1}X_1'X_2 b_2 + X_2'X_2 b_2 \\
&= X_2'X_1(X_1'X_1)^{-1}X_1'y + X_2'(I - P_{X_1})X_2 b_2 = X_2'y.
\end{align*}
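Taking $P = X(X'X)^{-1}X'$ and $M = I - P$ as defined earlier in the chapter, properties (i)–(iv) and the sum-of-squares decomposition above can be checked numerically. The following sketch is illustrative and not part of the original notes; the data and names ($n$, $k$, rng) are arbitrary.

```python
# Numerical check of the projection-matrix properties, assuming
# P = X(X'X)^{-1}X' and M = I - P as defined earlier in the chapter.
import numpy as np

rng = np.random.default_rng(0)
n, k = 50, 3
X = np.column_stack([np.ones(n), rng.standard_normal((n, k - 1))])  # first column is 1
y = rng.standard_normal(n)

P = X @ np.linalg.inv(X.T @ X) @ X.T   # projection onto the column space of X
M = np.eye(n) - P                      # residual maker

e = M @ y                              # residuals e = My
y_hat = P @ y                          # fitted values ŷ = Py

assert np.allclose(P, P.T) and np.allclose(P @ P, P)    # (i)  P' = P, P^2 = P
assert np.allclose(M, M.T) and np.allclose(M @ M, M)    # (ii) M' = M, M^2 = M
assert np.allclose(P @ X, X) and np.allclose(M @ X, 0)  # (iii) PX = X, MX = 0
assert np.allclose(P @ M, 0)                            # (iv) PM = 0
assert np.allclose(X.T @ e, 0)                          # X'e = 0, hence Σ e_i = 0
assert np.isclose(y @ y, y_hat @ y_hat + e @ e)         # y'y = ŷ'ŷ + e'e
print("all projection-matrix properties verified")
```

Because the first column of $X$ is $\mathbf{1}$, the check $X'e = 0$ includes $\sum_{i=1}^{n} e_i = 0$ as its first component.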
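The derivation continues past this excerpt, but the last display can be solved directly: subtracting $X_2'P_{X_1}y$ from both sides gives $X_2'(I - P_{X_1})X_2\, b_2 = X_2'(I - P_{X_1})y$, so $b_2 = (X_2'M_1X_2)^{-1}X_2'M_1y$ with $M_1 = I - P_{X_1}$ (the Frisch–Waugh–Lovell result). A minimal numerical sketch of this, with an illustrative split of $X$ into $X_1$ and $X_2$:

```python
# Check that b2 from the full normal equations equals the partitioned
# solution (X2' M1 X2)^{-1} X2' M1 y, where M1 = I - P_{X1}.
# The data and the X1/X2 split are illustrative.
import numpy as np

rng = np.random.default_rng(1)
n = 50
X1 = np.column_stack([np.ones(n), rng.standard_normal(n)])  # includes the constant
X2 = rng.standard_normal((n, 2))
X = np.hstack([X1, X2])
y = rng.standard_normal(n)

b = np.linalg.solve(X.T @ X, X.T @ y)   # full normal equations: (X'X)b = X'y
b1, b2 = b[:2], b[2:]

P1 = X1 @ np.linalg.inv(X1.T @ X1) @ X1.T
M1 = np.eye(n) - P1

# b2 via the partitioned solution
b2_fwl = np.linalg.solve(X2.T @ M1 @ X2, X2.T @ M1 @ y)
# b1 recovered from b1 = (X1'X1)^{-1} X1'(y - X2 b2)
b1_back = np.linalg.solve(X1.T @ X1, X1.T @ (y - X2 @ b2))

assert np.allclose(b2, b2_fwl)
assert np.allclose(b1, b1_back)
print("partitioned regression matches the full regression")
```

Since $M_1$ is idempotent, $X_2'M_1X_2 = (M_1X_2)'(M_1X_2)$ and $X_2'M_1y = (M_1X_2)'(M_1y)$: $b_2$ is the coefficient from regressing the residualized $y$ on the residualized $X_2$, which is the "partial regression" interpretation named in the section title.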