Multivariate Linear Models in R

An Appendix to An R Companion to Applied Regression, Second Edition

John Fox & Sanford Weisberg

last revision: 28 July 2011

Abstract

The multivariate linear model is

    Y(n×m) = X(n×(k+1)) B((k+1)×m) + E(n×m)

where Y is a matrix of n observations on m response variables; X is a model matrix with columns for k + 1 regressors, typically including an initial column of 1s for the regression constant; B is a matrix of regression coefficients, one column for each response variable; and E is a matrix of errors. This model can be fit with the lm function in R, where the left-hand side of the model comprises a matrix of response variables, and the right-hand side is specified exactly as for a univariate linear model (i.e., with a single response variable). This appendix to Fox and Weisberg (2011) explains how to use the Anova and linearHypothesis functions in the car package to test hypotheses for parameters in multivariate linear models, including models for repeated-measures data.
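To make the abstract's description concrete, the following sketch (not taken from the appendix itself) fits a multivariate linear model with lm and tests it with the car functions discussed below; the data frame dat and the variable names y1, y2, x1, and x2 are assumed purely for illustration.

    # A minimal sketch, assuming a data frame 'dat' with numeric responses
    # y1 and y2 and predictors x1 and x2 (hypothetical names).
    library(car)  # provides Anova() and linearHypothesis()

    # The left-hand side binds the responses into a matrix; the right-hand
    # side is specified exactly as for a univariate linear model.
    mod <- lm(cbind(y1, y2) ~ x1 + x2, data = dat)

    # Multivariate tests for each term in the model.
    Anova(mod)

    # Multivariate test of a linear hypothesis about the coefficients,
    # here that the coefficients of x1 are zero for both responses.
    linearHypothesis(mod, "x1 = 0")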
1 Basic Ideas

The multivariate linear model accommodates two or more response variables. The theory of multivariate linear models is developed very briefly in this section. Much more extensive treatments may be found in the recommended reading for this appendix.

The multivariate general linear model is

    Y(n×m) = X(n×(k+1)) B((k+1)×m) + E(n×m)

where Y is a matrix of n observations on m response variables; X is a model matrix with columns for k + 1 regressors, typically including an initial column of 1s for the regression constant; B is a matrix of regression coefficients, one column for each response variable; and E is a matrix of errors.¹ The contents of the model matrix are exactly as in the univariate linear model (as described in Ch. 4 of An R Companion to Applied Regression, Fox and Weisberg, 2011, hereafter the "R Companion"), and may therefore contain dummy regressors representing factors, polynomial or regression-spline terms, interaction regressors, and so on.

The assumptions of the multivariate linear model concern the behavior of the errors: Let ε′ᵢ represent the ith row of E. Then ε′ᵢ ∼ Nₘ(0, Σ), where Σ is a nonsingular error-covariance matrix, constant across observations; ε′ᵢ and ε′ᵢ′ are independent for i ≠ i′; and X is fixed or independent of E.

¹ A typographical note: B and E are, respectively, the upper-case Greek letters Beta and Epsilon. Because these are indistinguishable from the corresponding Roman letters B and E, we will denote the estimated regression coefficients as B̂ and the residuals as Ê.
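The footnote introduces B̂ for the estimated coefficients and Ê for the residuals, but the excerpt above does not give their formulas. As a point of reference (a standard least-squares result, not quoted from the appendix), the estimates are

    B̂ = (X′X)⁻¹X′Y,    Ê = Y − XB̂,    Σ̂ = Ê′Ê/(n − k − 1),

so each column of B̂ is exactly the coefficient vector from a separate univariate least-squares regression of the corresponding response on X; the multivariate structure matters only for the joint hypothesis tests that Anova and linearHypothesis provide.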