Dayu Wu — Applied Statistics Lecture Notes

1 Revision

1. E(∑ x_i) = ∑ E(x_i)
2. Var(x + y) = Var(x) + Var(y) + 2 Cov(x, y)
3. Cov(a_1 x + b_1 y, a_2 x + b_2 y) = a_2 Cov(a_1 x + b_1 y, x) + b_2 Cov(a_1 x + b_1 y, y)
4. Hypothesis test: H_0, H_1, p-value
5. If x ~ N(μ, σ²), then (x − μ)/σ ~ N(0, 1), and a (1 − α) confidence interval is (μ − σ Z_{α/2}, μ + σ Z_{α/2})

2 Linear Regression

Basic Concepts

1. y_i = f(x_i) + ε_i
2. Gauss–Markov assumptions:
   - E(ε_i) = 0
   - Var(ε_i) = σ²
   - Cov(ε_i, ε_j) = 0 for i ≠ j
3. Regression model: E(y|x) = β_0 + β_1 x, where y = (y_1, …, y_n)^T and x = (x_1, …, x_n)^T
4. ε_i = y_i − β_0 − β_1 x_i
5. e_i = y_i − β̂_0 − β̂_1 x_i
6. ∑ (x_i − x̄) ȳ = 0

Estimation: OLSE

1. Loss function: Q = ∑ e_i² = ∑ (y_i − β_0 − β_1 x_i)²
2. First-order conditions: ∂Q/∂β_0 = 0 and ∂Q/∂β_1 = 0.
   Thus −2 ∑ (y_i − β_0 − β_1 x_i) = 0 and −2 ∑ x_i (y_i − β_0 − β_1 x_i) = 0.
   Equivalently: ∑ e_i = 0 and ∑ x_i e_i = 0
3. The fitted line passes through the center (x̄, ȳ): ȳ = β̂_0 + β̂_1 x̄
4. β̂_0 = ȳ − β̂_1 x̄
5. β̂_1 = L_{xy} / L_{xx} = ∑ (x_i − x̄)(y_i − ȳ) / ∑ (x_i − x̄)² = ∑ (x_i − x̄) y_i / ∑ (x_i − x̄)² = (∑ x_i y_i − n x̄ ȳ) / (∑ x_i² − n x̄²)
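The OLSE formulas above can be checked numerically. The sketch below (with made-up illustrative data) computes β̂_1 and β̂_0 from the closed-form expressions and verifies the first-order conditions ∑ e_i = 0 and ∑ x_i e_i = 0:

```python
# Minimal numerical check of the OLSE formulas; the data is made up for illustration.
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 8.1, 9.8]
n = len(x)

x_bar = sum(x) / n
y_bar = sum(y) / n

# beta1_hat = Lxy / Lxx = sum((x_i - x_bar)(y_i - y_bar)) / sum((x_i - x_bar)^2)
Lxy = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))
Lxx = sum((xi - x_bar) ** 2 for xi in x)
beta1_hat = Lxy / Lxx

# beta0_hat = y_bar - beta1_hat * x_bar (the fitted line passes through (x_bar, y_bar))
beta0_hat = y_bar - beta1_hat * x_bar

# Residuals e_i = y_i - beta0_hat - beta1_hat * x_i satisfy the first-order conditions:
# sum(e_i) = 0 and sum(x_i * e_i) = 0 (up to floating-point error)
e = [yi - beta0_hat - beta1_hat * xi for xi, yi in zip(x, y)]
print(beta1_hat, beta0_hat)
print(abs(sum(e)) < 1e-9, abs(sum(xi * ei for xi, ei in zip(x, e))) < 1e-9)
```

This also illustrates why the two equivalent forms of β̂_1 are convenient: the L_{xy}/L_{xx} version needs only the centered sums, while the ∑ x_i y_i − n x̄ ȳ version can be accumulated in one pass over the data.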