Please consult any basic book on linear regression/linear models; for
example, APPLIED REGRESSION ANALYSIS by Draper and Smith. Or google on
"adjusted R-Squared."
In brief, adjusted R-squared attempts to compensate for the fact that
adding extra regressors -- even random ones -- to a linear model **always**
increases R-squared. It does so by penalizing for the extra regressors:
adjusted R^2 = 1 - (1 - R^2)*(n - 1)/(n - p - 1), where n is the number of
observations and p the number of regressors. So R-squared is almost never
useful as an indication of the quality of fit. Adjusted R-squared might be
(but often isn't either, because of selection bias among other reasons).
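A quick illustration in R (a minimal sketch; all the data below are
simulated noise, so any "fit" is spurious by construction):

```r
## Adding a pure-noise regressor inflates R-squared but not
## (necessarily) adjusted R-squared.
set.seed(42)
n  <- 50
y  <- rnorm(n)   # response: pure noise
x1 <- rnorm(n)   # regressor 1: noise
x2 <- rnorm(n)   # regressor 2: more noise

fit1 <- lm(y ~ x1)
fit2 <- lm(y ~ x1 + x2)

## R-squared can only go up when a regressor is added:
summary(fit1)$r.squared
summary(fit2)$r.squared

## Adjusted R-squared applies the penalty
##   adj.R2 = 1 - (1 - R2) * (n - 1) / (n - p - 1)
## (p = number of regressors), so it need not increase:
summary(fit1)$adj.r.squared
summary(fit2)$adj.r.squared
```

Run it a few times with different seeds: R-squared always rises when x2 is
added, while adjusted R-squared often falls.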
-- Bert Gunter
Genentech Non-Clinical Statistics
South San Francisco, CA
"The business of the statistician is to catalyze the scientific learning
process." - George E. P. Box
> -----Original Message-----
> From: r-help-bounces at stat.math.ethz.ch
> [mailto:r-help-bounces at stat.math.ethz.ch] On Behalf Of
> Benjamin M. Osborne
> Sent: Monday, April 18, 2005 3:52 PM
> To: R help
> Subject: [R] R-squared in summary(lm...)
>
> What is the difference between the two R-squareds returned
> for a linear
> regression by summary(lm...)? When might one report multiple
> vs. adjusted
> R-squared?
> Thank you,
> Ben Osborne
>
> --
> Botany Department
> University of Vermont
> 109 Carrigan Drive
> Burlington, VT 05405
>
> benjamin.osborne at uvm.edu
> phone: 802-656-0297
> fax: 802-656-0440
>
> ______________________________________________
> R-help at stat.math.ethz.ch mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide!
> http://www.R-project.org/posting-guide.html
>