• Because we use the standard error about the line so often in regression inference, we just call it s. Notice that s² is an average of the squared deviations of the data points from the line, so it qualifies as a variance. We average the squared deviations by dividing by n − 2, the number of data points less 2. It turns out that if we know n − 2 of the n residuals, the other two are determined. That is, n − 2 is the degrees of freedom of s. We first met the idea of degrees of freedom in the case of the ordinary sample standard deviation of n observations, which has n − 1 degrees of freedom. Now we observe two variables rather than one, and the proper degrees of freedom is n − 2 rather than n − 1.

• Calculating s is unpleasant. You must find the predicted response for each x in our data set, then the residuals, and then s. In practice we will use SPSS, which does this arithmetic instantly. Nonetheless, here is an example to help you understand the standard error s.

Example 2 (continued)

• The first infant had a crying intensity of 10 and a later IQ of 87. The predicted IQ for x = 10 is

  ŷ = 91.27 + 1.493x = 91.27 + 1.493 × 10 = 106.2

  The residual for this observation is

  residual = y − ŷ = 87 − 106.2 = −19.2

  That is, the observed IQ for this infant lies 19.2 points below the least-squares line.

• Repeat this calculation 37 more times, once for each subject. The 38 residuals are:

  -19.20  -31.13  -22.65  -15.18  -12.18  -15.15  -16.63   -6.18
   -1.70  -22.60   -6.68   -6.17   -9.15  -23.58   -9.14    2.80
   -9.14   -1.66   -6.14  -12.60    0.34   -8.62    2.85   14.30
    9.82   10.82    0.37    8.85   10.87   19.34   10.89   -2.55
   20.85   24.35   18.94   32.89   18.47   51.32
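To make the arithmetic concrete, here is a minimal sketch (in Python rather than the SPSS used in the course) that plugs the 38 residuals listed above into the formula described earlier, s = sqrt( Σ(y − ŷ)² / (n − 2) ). The variable names are illustrative, not part of the original slides.

  import math

  # The 38 residuals (y - y_hat) listed on the slide above.
  residuals = [
      -19.20, -31.13, -22.65, -15.18, -12.18, -15.15, -16.63,  -6.18,
       -1.70, -22.60,  -6.68,  -6.17,  -9.15, -23.58,  -9.14,   2.80,
       -9.14,  -1.66,  -6.14, -12.60,   0.34,  -8.62,   2.85,  14.30,
        9.82,  10.82,   0.37,   8.85,  10.87,  19.34,  10.89,  -2.55,
       20.85,  24.35,  18.94,  32.89,  18.47,  51.32,
  ]

  n = len(residuals)                          # n = 38 data points
  ss_resid = sum(r ** 2 for r in residuals)   # sum of squared deviations from the line

  s_squared = ss_resid / (n - 2)              # divide by n - 2 degrees of freedom
  s = math.sqrt(s_squared)                    # standard error about the line

  print(f"n = {n}, sum of squared residuals = {ss_resid:.2f}")
  print(f"s^2 = {s_squared:.2f}, s = {s:.2f}")

For these residuals the result comes out to roughly s ≈ 17.5, the same quantity SPSS produces instantly.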