CHAPTER 3  LEAST SQUARES METHODS FOR ESTIMATING β

How well the regression line fits the data can be measured by

    R^2 = SSR/SST = b'X'M^0 X b / y'M^0 y = 1 - e'e / y'M^0 y.

We call R^2 the coefficient of determination.

Remark 1.  0 ≤ R^2 ≤ 1, where R^2 = 0 means no fit and R^2 = 1 means a perfect fit.

Remark 2.  Let R^2_{Xz} denote the R^2 for the regression of y on X and an additional variable z, and R^2_X the R^2 for the regression of y on X alone. Then

    R^2_{Xz} = R^2_X + (1 - R^2_X) r*^2_{yz},

where

    r*^2_{yz} = (z*'y*)^2 / [(z*'z*)(y*'y*)],   z* = (I - P_X) z,   y* = (I - P_X) y.

Hence R^2 increases as the number of regressors increases, whatever the quality of the additional regressors.

(ii) Theil's R̄^2 (adjusted R^2):

    R̄^2 = 1 - [e'e/(n - K)] / [y'M^0 y/(n - 1)] = 1 - [(n - 1)/(n - K)] (1 - R^2).

R̄^2 will fall (rise) when a variable x is deleted from the regression if the t-ratio associated with that variable is greater (less) than 1.

(iii) Information criteria:

    AIC(K) = ln(e'e/n) + 2K/n          (Akaike's information criterion),
    BIC(K) = ln(e'e/n) + K ln(n)/n     (Bayesian information criterion),
    PC(K)  = [e'e/(n - K)] (1 + K/n).

The smaller the criterion, the better.
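The fit statistics above can be sketched numerically. The following minimal example (the function name `fit_ols_stats` and the use of numpy are illustrative choices, not from the notes) computes R^2, Theil's adjusted R̄^2, and the AIC/BIC of an OLS regression with an intercept, following the formulas in this section:

```python
import numpy as np

def fit_ols_stats(X, y):
    """OLS of y on [1, X]; returns (R^2, adjusted R^2, AIC, BIC).

    Follows the notes: R^2 = 1 - e'e / y'M0y,
    Rbar^2 = 1 - (n-1)/(n-K) (1 - R^2),
    AIC = ln(e'e/n) + 2K/n,  BIC = ln(e'e/n) + K ln(n)/n.
    """
    n = len(y)
    Z = np.column_stack([np.ones(n), X])      # regressor matrix with constant
    K = Z.shape[1]
    b, *_ = np.linalg.lstsq(Z, y, rcond=None) # least squares coefficients
    e = y - Z @ b                             # residual vector
    sse = e @ e                               # e'e
    sst = ((y - y.mean()) ** 2).sum()         # y'M0y (deviations from mean)
    r2 = 1.0 - sse / sst
    r2_adj = 1.0 - (n - 1) / (n - K) * (1.0 - r2)
    aic = np.log(sse / n) + 2 * K / n
    bic = np.log(sse / n) + K * np.log(n) / n
    return r2, r2_adj, aic, bic
```

Running this with an extra irrelevant regressor appended to X illustrates Remark 2: R^2 never decreases when a regressor is added, while R̄^2, AIC, and BIC can penalize the addition.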