
Econometrics Course Teaching Resources (PPT Lecture Slides, English Version): ch03 Multiple Regression Analysis

Multiple Regression Analysis (Economics 20, Prof. Anderson)
$y = \beta_0 + \beta_1 x_1 + \beta_2 x_2 + \ldots + \beta_k x_k + u$
1. Estimation

Parallels with Simple Regression
◆ $\beta_0$ is still the intercept
◆ $\beta_1$ to $\beta_k$ are all called slope parameters
◆ $u$ is still the error term (or disturbance)
◆ Still need to make a zero conditional mean assumption, so now assume that $E(u \mid x_1, x_2, \ldots, x_k) = 0$
◆ Still minimizing the sum of squared residuals, so we have $k+1$ first order conditions
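A minimal numeric sketch of solving those $k+1$ first order conditions, which are the normal equations $X'X\hat{\beta} = X'y$. The simulated data, variable names, and true coefficient values below are illustrative assumptions, not taken from the slides.

```python
import numpy as np

# Simulated data for illustration only (true coefficients 1.0, 2.0, -0.5 are assumed).
rng = np.random.default_rng(0)
n = 500
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1.0 + 2.0 * x1 - 0.5 * x2 + rng.normal(size=n)   # y = b0 + b1*x1 + b2*x2 + u

# With k = 2 regressors, the k+1 = 3 first order conditions are X'X b = X'y.
X = np.column_stack([np.ones(n), x1, x2])             # column of ones for the intercept
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
print(beta_hat)                                        # roughly [1.0, 2.0, -0.5]
```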

Interpreting Multiple Regression
$\hat{y} = \hat{\beta}_0 + \hat{\beta}_1 x_1 + \hat{\beta}_2 x_2 + \ldots + \hat{\beta}_k x_k$, so
$\Delta\hat{y} = \hat{\beta}_1 \Delta x_1 + \hat{\beta}_2 \Delta x_2 + \ldots + \hat{\beta}_k \Delta x_k$,
so holding $x_2, \ldots, x_k$ fixed implies that $\Delta\hat{y} = \hat{\beta}_1 \Delta x_1$, that is, each $\hat{\beta}$ has a ceteris paribus interpretation
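A small arithmetic illustration of the ceteris paribus reading; the coefficient values here are made up for the example, not estimates from data.

```python
# Illustrative (assumed) coefficient values.
b0, b1, b2 = 1.0, 2.0, -0.5

def y_hat(x1, x2):
    return b0 + b1 * x1 + b2 * x2

# Holding x2 fixed, a one-unit increase in x1 changes y_hat by exactly b1 = 2.0.
print(y_hat(x1=4.0, x2=3.0) - y_hat(x1=3.0, x2=3.0))   # 2.0
```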

A "Partialling Out" Interpretation
Consider the case where $k = 2$, i.e. $\hat{y} = \hat{\beta}_0 + \hat{\beta}_1 x_1 + \hat{\beta}_2 x_2$, then
$\hat{\beta}_1 = \left(\sum \hat{r}_{i1} y_i\right) / \left(\sum \hat{r}_{i1}^2\right)$, where the $\hat{r}_{i1}$ are the residuals from the estimated regression $\hat{x}_1 = \hat{\gamma}_0 + \hat{\gamma}_2 x_2$
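A sketch of this "partialling out" result on simulated data (all names, coefficients, and data are illustrative assumptions): the coefficient on $x_1$ from the full regression matches the formula built from the residuals of $x_1$ regressed on $x_2$.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000
x2 = rng.normal(size=n)
x1 = 0.6 * x2 + rng.normal(size=n)                     # make x1 correlated with x2
y = 1.0 + 2.0 * x1 - 0.5 * x2 + rng.normal(size=n)

# Full multiple regression of y on (1, x1, x2).
X = np.column_stack([np.ones(n), x1, x2])
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)

# Step 1: regress x1 on (1, x2) and keep the residuals r1_hat.
Z = np.column_stack([np.ones(n), x2])
gamma_hat = np.linalg.solve(Z.T @ Z, Z.T @ x1)
r1_hat = x1 - Z @ gamma_hat

# Step 2: the slide's formula, beta1_hat = sum(r1_hat * y) / sum(r1_hat ** 2).
beta1_partial = (r1_hat @ y) / (r1_hat @ r1_hat)
print(beta_hat[1], beta1_partial)                       # the two estimates agree
```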

"Partialling Out" continued
◆ The previous equation implies that regressing $y$ on $x_1$ and $x_2$ gives the same effect of $x_1$ as regressing $y$ on the residuals from a regression of $x_1$ on $x_2$
◆ This means only the part of $x_{i1}$ that is uncorrelated with $x_{i2}$ is being related to $y_i$, so we're estimating the effect of $x_1$ on $y$ after $x_2$ has been "partialled out"

Simple vs Multiple Reg Estimate
Compare the simple regression $\tilde{y} = \tilde{\beta}_0 + \tilde{\beta}_1 x_1$ with the multiple regression $\hat{y} = \hat{\beta}_0 + \hat{\beta}_1 x_1 + \hat{\beta}_2 x_2$.
Generally, $\tilde{\beta}_1 \neq \hat{\beta}_1$, unless:
◆ $\hat{\beta}_2 = 0$ (i.e. no partial effect of $x_2$), OR
◆ $x_1$ and $x_2$ are uncorrelated in the sample
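A sketch contrasting the two slopes on simulated data (data and coefficient values are assumptions for illustration): because $x_1$ and $x_2$ are correlated by construction and $x_2$ has a nonzero partial effect, the simple and multiple regression slopes on $x_1$ differ.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1000
x2 = rng.normal(size=n)
x1 = 0.6 * x2 + rng.normal(size=n)                      # correlated regressors
y = 1.0 + 2.0 * x1 - 0.5 * x2 + rng.normal(size=n)

# Simple regression slope (y on x1 only).
Xs = np.column_stack([np.ones(n), x1])
beta_tilde = np.linalg.solve(Xs.T @ Xs, Xs.T @ y)

# Multiple regression slope (y on x1 and x2).
Xm = np.column_stack([np.ones(n), x1, x2])
beta_hat = np.linalg.solve(Xm.T @ Xm, Xm.T @ y)

# The slopes differ because x2 has a nonzero partial effect and is correlated
# with x1; if either condition failed, the two estimates would coincide.
print(beta_tilde[1], beta_hat[1])
```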

Goodness-of-Fit
We can think of each observation as being made up of an explained part and an unexplained part, $y_i = \hat{y}_i + \hat{u}_i$. We then define the following:
◆ $\sum (y_i - \bar{y})^2$ is the total sum of squares (SST)
◆ $\sum (\hat{y}_i - \bar{y})^2$ is the explained sum of squares (SSE)
◆ $\sum \hat{u}_i^2$ is the residual sum of squares (SSR)
Then SST = SSE + SSR
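A sketch verifying the decomposition numerically on simulated data (the data and model are illustrative assumptions; an intercept is included in the fit, which is what makes the identity hold).

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200
x1, x2 = rng.normal(size=n), rng.normal(size=n)
y = 1.0 + 2.0 * x1 - 0.5 * x2 + rng.normal(size=n)

# OLS fit with an intercept.
X = np.column_stack([np.ones(n), x1, x2])
y_hat = X @ np.linalg.solve(X.T @ X, X.T @ y)
u_hat = y - y_hat

sst = np.sum((y - y.mean()) ** 2)       # total sum of squares
sse = np.sum((y_hat - y.mean()) ** 2)   # explained sum of squares
ssr = np.sum(u_hat ** 2)                # residual sum of squares
print(sst, sse + ssr)                   # equal up to floating point rounding
```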

Goodness-of-Fit (continued)
◆ How do we think about how well our sample regression line fits our sample data?
◆ Can compute the fraction of the total sum of squares (SST) that is explained by the model; call this the R-squared of the regression:
$R^2 = \text{SSE}/\text{SST} = 1 - \text{SSR}/\text{SST}$

Goodness-of-Fit (continued)
We can also think of $R^2$ as being equal to the squared correlation coefficient between the actual $y_i$ and the fitted values $\hat{y}_i$:
$R^2 = \dfrac{\left(\sum (y_i - \bar{y})(\hat{y}_i - \bar{\hat{y}})\right)^2}{\left(\sum (y_i - \bar{y})^2\right)\left(\sum (\hat{y}_i - \bar{\hat{y}})^2\right)}$
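A sketch checking that SSE/SST equals the squared correlation between $y_i$ and $\hat{y}_i$, again on simulated data (data and model are illustrative assumptions).

```python
import numpy as np

rng = np.random.default_rng(4)
n = 200
x1, x2 = rng.normal(size=n), rng.normal(size=n)
y = 1.0 + 2.0 * x1 - 0.5 * x2 + rng.normal(size=n)

X = np.column_stack([np.ones(n), x1, x2])
y_hat = X @ np.linalg.solve(X.T @ X, X.T @ y)

r2_from_sums = np.sum((y_hat - y.mean()) ** 2) / np.sum((y - y.mean()) ** 2)  # SSE/SST
r2_from_corr = np.corrcoef(y, y_hat)[0, 1] ** 2                               # squared correlation
print(r2_from_sums, r2_from_corr)       # the two values agree
```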

More about R-squared
◆ $R^2$ can never decrease when another independent variable is added to a regression, and usually will increase
◆ Because $R^2$ will usually increase with the number of independent variables, it is not a good way to compare models
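A sketch of why raw $R^2$ rewards model size: adding an irrelevant regressor (pure noise) still nudges $R^2$ up. The data, helper function, and names are illustrative assumptions, not part of the slides.

```python
import numpy as np

def r_squared(y, X):
    """R^2 from an OLS fit of y on X (X must include a constant column)."""
    y_hat = X @ np.linalg.solve(X.T @ X, X.T @ y)
    return 1.0 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)

rng = np.random.default_rng(5)
n = 200
x1 = rng.normal(size=n)
y = 1.0 + 2.0 * x1 + rng.normal(size=n)
noise = rng.normal(size=n)                          # an irrelevant regressor

X_small = np.column_stack([np.ones(n), x1])
X_big = np.column_stack([np.ones(n), x1, noise])
print(r_squared(y, X_small), r_squared(y, X_big))   # the second value is never smaller
```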
