Finally, for testing hypotheses we can apply the full set of results in Chapter 6 to the transformed equation. For testing the $m$ restrictions $R\beta = q$, one appropriate statistic is
$$
\frac{(R\tilde{\beta}-q)'\big[\tilde{s}^{2}R\big((PX)'(PX)\big)^{-1}R'\big]^{-1}(R\tilde{\beta}-q)}{m}
=\frac{(R\tilde{\beta}-q)'\big[\tilde{s}^{2}R(X'\Omega^{-1}X)^{-1}R'\big]^{-1}(R\tilde{\beta}-q)}{m}
\sim F_{m,\,T-k}.
$$

Exercise: Derive the other three test statistics (in Chapter 6) of the $F$-ratio form to test the hypothesis $R\beta = q$ when $\Omega \neq I$.

2.2 Maximum Likelihood Estimators

Assume that $\varepsilon \sim N(0, \sigma^{2}\Omega)$. If $X$ is not stochastic, then by results from "functions of random variables" (an $n \Rightarrow n$ transformation) we have $Y \sim N(X\beta, \sigma^{2}\Omega)$. That is, the log-likelihood function is
$$
\begin{aligned}
\ln f(\theta; Y) &= -\frac{T}{2}\ln(2\pi) - \frac{1}{2}\ln\lvert\sigma^{2}\Omega\rvert - \frac{1}{2}(Y-X\beta)'(\sigma^{2}\Omega)^{-1}(Y-X\beta)\\
&= -\frac{T}{2}\ln(2\pi) - \frac{T}{2}\ln\sigma^{2} - \frac{1}{2}\ln\lvert\Omega\rvert - \frac{1}{2\sigma^{2}}(Y-X\beta)'\Omega^{-1}(Y-X\beta),
\end{aligned}
$$
where $\theta = (\beta_{1}, \beta_{2}, \ldots, \beta_{k}, \sigma^{2})'$, since by assumption $\Omega$ is known.

The necessary conditions for maximizing $L$ are
$$
\frac{\partial L}{\partial \beta} = \frac{1}{\sigma^{2}}X'\Omega^{-1}(Y-X\beta) = 0,
\qquad
\frac{\partial L}{\partial \sigma^{2}} = -\frac{T}{2\sigma^{2}} + \frac{1}{2\sigma^{4}}(Y-X\beta)'\Omega^{-1}(Y-X\beta) = 0.
$$

The solutions are
$$
\hat{\beta}_{ML} = (X'\Omega^{-1}X)^{-1}X'\Omega^{-1}Y,
\qquad
\hat{\sigma}^{2}_{ML} = \frac{1}{T}(Y-X\hat{\beta}_{ML})'\Omega^{-1}(Y-X\hat{\beta}_{ML}).
$$
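The following is a minimal numerical sketch (not part of the original notes) of the formulas above, assuming a known, positive definite $\Omega$. The data-generating choices (a diagonal heteroskedastic $\Omega$, the particular $R$ and $q$) are illustrative only.

```python
# Sketch: GLS/ML estimation with known Omega and the F statistic for R beta = q.
# All variable names and the simulated design are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

T, k = 200, 3                                        # sample size, number of regressors
X = np.column_stack([np.ones(T), rng.normal(size=(T, k - 1))])
beta_true = np.array([1.0, 2.0, -0.5])

# Known covariance structure: Omega = diag(w_1, ..., w_T)
w = np.exp(rng.normal(scale=0.5, size=T))
Omega = np.diag(w)
Omega_inv = np.diag(1.0 / w)

sigma2_true = 1.5
eps = rng.multivariate_normal(np.zeros(T), sigma2_true * Omega)
Y = X @ beta_true + eps

# GLS / ML estimator of beta: (X' Omega^{-1} X)^{-1} X' Omega^{-1} Y
XtOiX = X.T @ Omega_inv @ X
beta_gls = np.linalg.solve(XtOiX, X.T @ Omega_inv @ Y)

resid = Y - X @ beta_gls
sigma2_ml = (resid @ Omega_inv @ resid) / T          # ML estimate (divides by T)
s2_tilde = (resid @ Omega_inv @ resid) / (T - k)     # unbiased variant (divides by T - k)

# F statistic for H0: R beta = q with m restrictions
R = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])                      # e.g. test beta_2 = 2, beta_3 = -0.5
q = np.array([2.0, -0.5])
m = R.shape[0]

diff = R @ beta_gls - q
cov_Rb = s2_tilde * R @ np.linalg.inv(XtOiX) @ R.T
F_stat = diff @ np.linalg.solve(cov_Rb, diff) / m    # ~ F(m, T - k) under H0

print("beta_GLS:", beta_gls)
print("sigma2_ML:", sigma2_ml)
print("F statistic:", F_stat)
```

Because $\Omega$ is known, $\hat{\beta}_{ML}$ coincides with the GLS estimator; the only difference from the earlier results is that the ML variance estimator divides by $T$ rather than the degrees-of-freedom correction $T-k$ used in $\tilde{s}^{2}$.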