Hi Folks,

Could anyone point me to a good reference on linear regression models? Specifically, I am trying to gain an intuitive feel for how the standard error values are calculated for the parameter estimates. My understanding is that these come from the variance-covariance matrix, which is in turn computed from the input data matrix. Although I think I understand the math, I still don't have a good gut feel for why one parameter ends up with a larger standard error than another.

Also, I am interested in knowing how to test whether two parameters are significantly different from one another.

Thanks in advance for your help.
-James
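A minimal R sketch of the relationship James describes, using simulated data (the variables x1, x2, y and the model object fit are purely illustrative): the standard errors reported by summary(fit) are just the square roots of the diagonal of the estimated variance-covariance matrix vcov(fit), i.e. of sigma^2 (X'X)^{-1}.

    set.seed(1)
    n  <- 100
    x1 <- rnorm(n)
    x2 <- 0.8 * x1 + rnorm(n, sd = 0.3)   # x2 is strongly correlated with x1
    y  <- 1 + 2 * x1 - x2 + rnorm(n)
    fit <- lm(y ~ x1 + x2)

    V <- vcov(fit)                 # estimated sigma^2 * (X'X)^{-1}
    sqrt(diag(V))                  # same values as the "Std. Error" column
    summary(fit)$coefficients[, "Std. Error"]

Because x1 and x2 are highly correlated in this made-up example, both coefficients get larger standard errors than they would with orthogonal predictors, which is one intuitive answer to why one parameter can carry a larger standard error than another.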
Dear James,

A very nice way of understanding these matters intuitively is to express them geometrically, using data and confidence ellipses (for two predictors and their coefficients) and ellipsoids (more generally). The same ideas apply to linear hypotheses, such as the difference between two coefficients.

A good elementary treatment may be found in Georges Monette, "Geometry of multiple regression and 3-D graphics," in Fox and Long (eds.), Modern Methods of Data Analysis, Sage, 1990. Some regression texts also develop the geometry of regression and linear models.

I hope that this helps,
 John

-----------------------------------------------------
John Fox
Department of Sociology
McMaster University
Hamilton, Ontario, Canada L8S 4M4
email: jfox at mcmaster.ca
phone: 905-525-9140x23604
web: www.socsci.mcmaster.ca/jfox
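A sketch of the linear-hypothesis idea in base R, again with simulated data (all names are illustrative, not from the thread): the standard error of the difference b1 - b2 comes from Var(b1) + Var(b2) - 2 Cov(b1, b2), all read off vcov(fit).

    set.seed(1)
    x1 <- rnorm(100)
    x2 <- 0.8 * x1 + rnorm(100, sd = 0.3)
    y  <- 1 + 2 * x1 - x2 + rnorm(100)
    fit <- lm(y ~ x1 + x2)

    b <- coef(fit)
    V <- vcov(fit)

    d      <- unname(b["x1"] - b["x2"])            # estimated difference
    se.d   <- sqrt(V["x1", "x1"] + V["x2", "x2"] - 2 * V["x1", "x2"])
    t.stat <- d / se.d
    p.val  <- 2 * pt(abs(t.stat), df = df.residual(fit), lower.tail = FALSE)
    c(difference = d, se = se.d, t = t.stat, p = p.val)

    ## John Fox's car package offers the same test and the confidence
    ## ellipses he describes; these calls are from recent versions:
    ## library(car)
    ## linearHypothesis(fit, "x1 = x2")
    ## confidenceEllipse(fit, which.coef = c(2, 3))

Comparing t.stat to a t distribution on the residual degrees of freedom gives the test of whether the two coefficients differ significantly.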
John Fox was kind enough to reply, but, IMHO, he didn't recommend the best book on regression models: his own. See John Fox, _An R and S-Plus Companion to Applied Regression_, Sage, 2002.

ap

----------------------------------------------------------------------
Andrew J Perrin - http://www.unc.edu/~aperrin
Assistant Professor of Sociology, U of North Carolina, Chapel Hill
clists at perrin.socsci.unc.edu * andrew_perrin (at) unc.edu