derives a test for heteroscedasticity which consists of comparing the elements of $NS_0 \,(= \sum_{i=1}^N e_i^2 x_i x_i')$ and $s^2(X'X) \,(= s^2 \sum_{i=1}^N x_i x_i')$, thus indicating whether or not the usual OLS formula $s^2(X'X)^{-1}$ is a consistent covariance estimator. Large discrepancies between $NS_0$ and $s^2(X'X)$ support the contention of heteroscedasticity, while small discrepancies support homoscedasticity.

A simple operational version of this test is carried out by obtaining $NR^2$ in the regression of $e_i^2$ on a constant and all unique variables in $x \otimes x$. This statistic is asymptotically distributed as $\chi^2_{P-1}$, where $P$ is the number of regressors in the regression, including the constant.

Exercise: Reproduce the results of Example 11.3 at p. 224.

3 Weighted Least Squares When Ω is Known

Having tested for and found evidence of heteroscedasticity, the logical next step is to revise the estimation technique to account for it. The GLS estimator is
\[
\tilde{\beta} = (X'\Omega^{-1}X)^{-1} X'\Omega^{-1}Y.
\]
Consider the most general case, $\sigma_i^2 = \sigma^2 \omega_i$. Then $\Omega^{-1}$ is a diagonal matrix whose $i$-th diagonal element is $1/\omega_i$. The GLS estimator is obtained by regressing
\[
PY = \begin{bmatrix} y_1/\sqrt{\omega_1} \\ y_2/\sqrt{\omega_2} \\ \vdots \\ y_N/\sqrt{\omega_N} \end{bmatrix}
\quad \text{on} \quad
PX = \begin{bmatrix} x_1/\sqrt{\omega_1} \\ x_2/\sqrt{\omega_2} \\ \vdots \\ x_N/\sqrt{\omega_N} \end{bmatrix}.
\]
Applying OLS to the transformed model, we obtain the weighted least squares (WLS) estimator,
\[
\tilde{\beta} = \left[ \sum_{i=1}^N w_i x_i x_i' \right]^{-1} \left[ \sum_{i=1}^N w_i x_i y_i \right],
\]
where $w_i = 1/\omega_i$. The logic of the computation is that observations with smaller variances receive a larger weight in the computation of the sums and therefore have a greater influence on the estimates obtained.
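As an illustration (not part of the text), here is a minimal sketch in Python of the $NR^2$ computation for the operational test described above. It assumes y and X are NumPy arrays with the constant in the first column of X; the function name white_nr2 and the use of numpy/scipy are my own choices.

```python
# Minimal sketch of the operational White test: NR^2 from regressing the
# squared OLS residuals on a constant and all unique variables in x (x) x.
# Assumes X already contains the constant in its first column.
import numpy as np
from scipy import stats


def white_nr2(y, X):
    N, k = X.shape
    # Squared residuals from the original OLS regression
    b = np.linalg.lstsq(X, y, rcond=None)[0]
    e2 = (y - X @ b) ** 2

    # Auxiliary regressors: all products x_{ij} * x_{il} (constant, levels,
    # squares, cross products), with exact duplicate columns dropped.
    cols = np.column_stack([X[:, i] * X[:, j]
                            for i in range(k) for j in range(i, k)])
    Z = np.unique(cols, axis=1)

    # R^2 of the auxiliary regression of e^2 on Z
    g = np.linalg.lstsq(Z, e2, rcond=None)[0]
    u = e2 - Z @ g
    r2 = 1.0 - (u @ u) / ((e2 - e2.mean()) @ (e2 - e2.mean()))

    stat = N * r2                     # NR^2
    P = Z.shape[1]                    # auxiliary regressors, incl. the constant
    return stat, stats.chi2.sf(stat, df=P - 1)
```

Dropping duplicate columns matters when X contains dummies or an already-squared regressor, since $x \otimes x$ then repeats some variables.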
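Similarly, a minimal sketch of the WLS computation with known weights, assuming y, X and an array omega holding the variance factors $\omega_i$; both forms below (the weighted normal equations and OLS on the transformed data) give the same $\tilde{\beta}$, and the names wls and wls_transformed are mine.

```python
# Minimal sketch of WLS with known omega_i:
#   beta = (sum_i w_i x_i x_i')^{-1} (sum_i w_i x_i y_i),  w_i = 1/omega_i,
# or equivalently OLS on P y and P X with P = diag(1/sqrt(omega_i)).
import numpy as np


def wls(y, X, omega):
    w = 1.0 / omega                     # smaller variance -> larger weight
    Xw = X * w[:, None]                 # each row x_i' scaled by w_i
    XtWX = X.T @ Xw                     # sum_i w_i x_i x_i'
    XtWy = Xw.T @ y                     # sum_i w_i x_i y_i
    return np.linalg.solve(XtWX, XtWy)


def wls_transformed(y, X, omega):
    s = 1.0 / np.sqrt(omega)            # 1/sqrt(omega_i), the diagonal of P
    return np.linalg.lstsq(X * s[:, None], y * s, rcond=None)[0]
```

The transformed-data version is usually preferable numerically, since it applies a least-squares solver directly rather than forming and inverting $X'\Omega^{-1}X$ explicitly.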