Dayu Wu — Applied Statistics Lecture Notes

1 Revision

1. $E(\sum x_i) = \sum E(x_i)$
2. $\mathrm{Var}(x+y) = \mathrm{Var}(x) + \mathrm{Var}(y) + 2\,\mathrm{Cov}(x,y)$
3. $\mathrm{Cov}(a_1 x + b_1 y,\, a_2 x + b_2 y) = a_2\,\mathrm{Cov}(a_1 x + b_1 y,\, x) + b_2\,\mathrm{Cov}(a_1 x + b_1 y,\, y)$
4. Hypothesis test: $H_0$, $H_1$, p-value
5. If $x \sim N(\mu, \sigma^2)$, then $\frac{x-\mu}{\sigma} \sim N(0,1)$, and a $(1-\alpha)$ CI is $\left(\mu - \sigma z_{\alpha/2},\; \mu + \sigma z_{\alpha/2}\right)$

2 Linear Regression

Basic Concepts

1. $y_i = f(x_i) + \epsilon_i$
2. Gauss–Markov conditions:
   - $E(\epsilon_i) = 0$
   - $\mathrm{Var}(\epsilon_i) = \sigma^2$
   - $\mathrm{Cov}(\epsilon_i, \epsilon_j) = 0$ for $i \neq j$
3. Regression model: $E(y \mid x) = \beta_0 + \beta_1 x$, with $y = (y_1, \dots, y_n)^T$, $x = (x_1, \dots, x_n)^T$
4. $\epsilon_i = y_i - \beta_0 - \beta_1 x_i$
5. $e_i = y_i - \hat\beta_0 - \hat\beta_1 x_i$
6. $\sum (x_i - \bar x)\,\bar y = 0$, since $\sum (x_i - \bar x) = 0$

Estimation: OLSE

1. Loss function: $Q = \sum e_i^2 = \sum (y_i - \beta_0 - \beta_1 x_i)^2$
2. First-order conditions: $\frac{\partial Q}{\partial \beta_0} = 0$ and $\frac{\partial Q}{\partial \beta_1} = 0$, i.e. $-2 \sum (y_i - \hat\beta_0 - \hat\beta_1 x_i) = 0$ and $-2 \sum x_i (y_i - \hat\beta_0 - \hat\beta_1 x_i) = 0$; equivalently, $\sum e_i = 0$ and $\sum x_i e_i = 0$
3. The fitted line passes through the centre $(\bar x, \bar y)$: $\bar y = \hat\beta_0 + \hat\beta_1 \bar x$
4. $\hat\beta_0 = \bar y - \hat\beta_1 \bar x$
5. $\hat\beta_1 = \frac{L_{xy}}{L_{xx}} = \frac{\sum (x_i - \bar x)(y_i - \bar y)}{\sum (x_i - \bar x)^2} = \frac{\sum (x_i - \bar x) y_i}{\sum (x_i - \bar x)^2} = \frac{\sum x_i y_i - n \bar x \bar y}{\sum x_i^2 - n \bar x^2}$
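The closed-form OLSE above can be sketched in a few lines of Python. This is a minimal illustration, not part of the notes: the function name `ols_fit` and the small data set are made up for the example. It computes $\hat\beta_1 = L_{xy}/L_{xx}$ and $\hat\beta_0 = \bar y - \hat\beta_1 \bar x$, then checks the first-order conditions $\sum e_i = 0$ and $\sum x_i e_i = 0$ on the residuals.

```python
def ols_fit(x, y):
    """Return (beta0_hat, beta1_hat) via the closed-form OLSE."""
    n = len(x)
    x_bar = sum(x) / n
    y_bar = sum(y) / n
    # beta1_hat = Lxy / Lxx = sum((xi - x_bar)(yi - y_bar)) / sum((xi - x_bar)^2)
    l_xy = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))
    l_xx = sum((xi - x_bar) ** 2 for xi in x)
    beta1 = l_xy / l_xx
    # The fitted line passes through (x_bar, y_bar), so beta0_hat = y_bar - beta1_hat * x_bar
    beta0 = y_bar - beta1 * x_bar
    return beta0, beta1


# Hypothetical data, roughly following y = 2x, for illustration only
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 7.8, 10.1]
b0, b1 = ols_fit(x, y)

# Residuals satisfy the first-order conditions up to floating-point error:
# sum(e_i) = 0 and sum(x_i * e_i) = 0
e = [yi - b0 - b1 * xi for xi, yi in zip(x, y)]
print(b0, b1)                                       # approx. 0.05 and 1.99
print(sum(e), sum(xi * ei for xi, ei in zip(x, e))) # both approx. 0
```

Note that the equivalence of the several expressions for $\hat\beta_1$ (centred products, $\sum (x_i - \bar x) y_i$, or raw sums minus $n \bar x \bar y$) means any of them could be used in `ols_fit` with the same result.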