Chapter 3 Least Squares Methods for Estimating β

Methods for estimating β:

- Least squares estimation
- Maximum likelihood estimation
- Method of moments estimation
- Least absolute deviation estimation

3.1 Least squares estimation

The criterion of least squares estimation is

    min_{b_0} ∑_{i=1}^{n} (y_i − X_i′ b_0)²,  or equivalently  min_{b_0} (y − Xb_0)′(y − Xb_0).

Let the objective function be

    S(b_0) = (y − Xb_0)′(y − Xb_0)
           = y′y − b_0′X′y − y′Xb_0 + b_0′X′Xb_0
           = y′y − 2y′Xb_0 + b_0′X′Xb_0.

The first-order condition for the minimization of this function is

    ∂S(b_0)/∂b_0 = −2X′y + 2X′Xb_0 = 0.

The solution of this equation is the least squares estimate of the coefficient vector β:

    b = (X′X)⁻¹X′y.

If rank(X) = K, then rank(X′X) = K, so the inverse of X′X exists.

Let e = y − Xb. We call this the residual vector. We have

    e = y − Xb                      (1)
      = y − X(X′X)⁻¹X′y
      = (I − X(X′X)⁻¹X′)y
      = (I − P)y,                   (2)

where P = X(X′X)⁻¹X′. The matrix P is called the projection matrix. We also let M = I − P. Then we may write (2) as

    y = Xb + e = Py + My.
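The formulas above can be checked numerically. The following is a minimal sketch in NumPy: the simulated data, the sample size, and the "true" coefficient vector are assumptions chosen purely for illustration, not part of the chapter. It computes b = (X′X)⁻¹X′y, forms P and M, and verifies the decomposition y = Py + My.

```python
import numpy as np

# Simulated data for illustration only: n = 50 observations, K = 3 regressors
# (a constant plus two draws); the true beta is an arbitrary assumed value.
rng = np.random.default_rng(0)
n, K = 50, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, K - 1))])
beta_true = np.array([1.0, 2.0, -0.5])
y = X @ beta_true + rng.normal(size=n)

# Least squares estimate b = (X'X)^{-1} X'y.
# Using solve() on the normal equations is numerically safer than
# forming the inverse explicitly.
b = np.linalg.solve(X.T @ X, X.T @ y)

# Projection matrix P = X (X'X)^{-1} X' and its complement M = I - P.
P = X @ np.linalg.solve(X.T @ X, X.T)
M = np.eye(n) - P

# Residual vector e = y - Xb = My, and the decomposition y = Py + My.
e = y - X @ b
assert np.allclose(e, M @ y)
assert np.allclose(y, P @ y + M @ y)
assert np.allclose(P @ P, P)   # P is idempotent, as a projection must be
```

Note that the first-order condition −2X′y + 2X′Xb = 0 is equivalent to X′e = 0, so the residuals are orthogonal to every column of X; this also holds in the sketch above.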