On Thu, 5 Jun 2003 15:40:52 +0000 "Wegmann (LIST)" <mailinglist_wegmann at web.de> wrote:

> Hello R-users,
>
> I want to compute a multiple regression, but I would like to include a
> check for collinearity of the variables. Therefore I would like to use
> ridge regression. I tried lm.ridge(), but I don't yet know how to get
> p-values (individual Pr() values and a p-value for the whole model) out
> of this fit. Can anybody tell me how to get output similar to the
> summary(lm(...)) output? Or is there another way (e.g. options to lm())
> to correct for collinearity?
>
> I hope I was precise enough and included all necessary information;
> otherwise I can add more.
>
> Thanks in advance. Cheers, Martin

This doesn't really answer your question, but the Design package's ols
function is another way to handle penalized least squares. ols has
advantages if you want to penalize different types of terms in the model
differently, or if you have any categorical predictors; in my opinion,
ordinary ridge regression does not scale such variables correctly. The
anova method for ols fits 'works' when you penalize the model, but there
is some controversy over whether we should be testing biased
coefficients. Some believe that hypothesis tests should be done using
the unpenalized model. That brings up other ways to handle collinearity:
test groups of variables in combination so they don't compete with each
other, or collapse them into summary scores (e.g., principal components)
before putting them in the model.

---
Frank E Harrell Jr              Prof. of Biostatistics & Statistics
Div. of Biostatistics & Epidem. Dept. of Health Evaluation Sciences
U. Virginia School of Medicine
http://hesweb1.med.virginia.edu/biostat
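[Editor's note: a minimal simulated sketch of the "test groups of
variables in combination" idea mentioned above. All data and variable
names (x1, x2, z) are invented for illustration; the point is that two
nearly collinear predictors can each look non-significant individually
while a joint F-test for the pair is clearly significant.]

```r
## Simulated data: x1 and x2 are nearly collinear, z is independent.
set.seed(2)
n  <- 100
x1 <- rnorm(n)
x2 <- x1 + rnorm(n, sd = 0.05)        # nearly collinear with x1
z  <- rnorm(n)
y  <- 1 + x1 + x2 + z + rnorm(n)

full    <- lm(y ~ x1 + x2 + z)
reduced <- lm(y ~ z)

## Individual t-tests let x1 and x2 "compete" with each other;
## the nested-model F-test assesses the pair together.
summary(full)
anova(reduced, full)   # joint F-test for x1 and x2 in combination
```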
Hello R-users,

I want to compute a multiple regression, but I would like to include a
check for collinearity of the variables. Therefore I would like to use
ridge regression. I tried lm.ridge(), but I don't yet know how to get
p-values (individual Pr() values and a p-value for the whole model) out
of this fit. Can anybody tell me how to get output similar to the
summary(lm(...)) output? Or is there another way (e.g. options to lm())
to correct for collinearity?

I hope I was precise enough and included all necessary information;
otherwise I can add more.

Thanks in advance. Cheers, Martin
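[Editor's note: a short sketch of lm.ridge() from the MASS package on
simulated data, for readers following this thread. All numbers here are
made up for illustration. Note that lm.ridge() returns no standard
errors, which is why there is no summary()-style table of p-values; a
quick collinearity diagnostic is the condition number of the design
matrix via kappa().]

```r
library(MASS)

## Simulated data with one nearly collinear pair (x1, x2).
set.seed(1)
n  <- 100
x1 <- rnorm(n)
x2 <- x1 + rnorm(n, sd = 0.1)   # nearly collinear with x1
x3 <- rnorm(n)
y  <- 1 + 2 * x1 + 0.5 * x3 + rnorm(n)
d  <- data.frame(y, x1, x2, x3)

## A large condition number signals collinearity.
kappa(model.matrix(y ~ x1 + x2 + x3, data = d))

## Ridge fit over a grid of penalties; select() reports the
## HKB, L-W and GCV choices of lambda.
fit <- lm.ridge(y ~ x1 + x2 + x3, data = d, lambda = seq(0, 10, by = 0.1))
select(fit)
coef(fit)[1, ]   # coefficients at lambda = 0 agree with plain lm()
```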
Hi Frank,

> From: Frank E Harrell Jr [mailto:fharrell at virginia.edu]
[snip]
> The anova method for ols fits 'works' when you penalize the model but
> there is some controversy over whether we should be testing biased
> coefficients. Some believe that hypothesis tests should be done using
> the unpenalized model. That brings up other ways to handle
> collinearity: test groups of variables in combination so they don't
> compete with each other, or collapse them into summary scores (e.g.,
> principal components) before putting them in the model.

I'm not clear about the last point. Suppose three of the variables are
nearly collinear. Are you suggesting replacing those variables with the
first one or two PCs and dropping the rest? If so, doesn't that also
lead to biased estimators?

Best,
Andy

> ---
> Frank E Harrell Jr              Prof. of Biostatistics & Statistics
> Div. of Biostatistics & Epidem. Dept. of Health Evaluation Sciences
> U. Virginia School of Medicine
> http://hesweb1.med.virginia.edu/biostat
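[Editor's note: a simulated sketch of the summary-score scenario Andy
describes: three nearly collinear predictors collapsed into their first
principal component, which is then used as a single regressor. The data
and names (u, x1..x3, score) are invented for illustration only; the
code does not settle the bias question raised above.]

```r
## Three noisy copies of the same underlying signal u.
set.seed(3)
n  <- 100
u  <- rnorm(n)
x1 <- u + rnorm(n, sd = 0.1)
x2 <- u + rnorm(n, sd = 0.1)
x3 <- u + rnorm(n, sd = 0.1)
y  <- 1 + 2 * u + rnorm(n)

## PC1 of the standardized predictors carries almost all the variance.
pc <- prcomp(cbind(x1, x2, x3), scale. = TRUE)
summary(pc)

## Use the first PC as a single summary score in an ordinary lm(),
## so the usual summary() inference applies to the score.
score <- pc$x[, 1]
fit   <- lm(y ~ score)
summary(fit)
```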