
Displaying 20 results from an estimated 50000 matches similar to: "About lm.ridge in the MASS package"

2011 Aug 23
1
obtaining p-values for lm.ridge() coefficients (package 'MASS')
Dear all, I'm familiarising myself with ridge regression in R and the following is bugging me: how does one get p-values for the coefficients obtained from MASS::lm.ridge() output (for a given lambda)? Consider the example below (adapted from PRA [1]):

> require(MASS)
> data(longley)
> gr <- lm.ridge(Employed ~ ., longley, lambda = seq(0, 0.1, 0.001))
> plot(gr)
> select(gr)
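A minimal sketch of one common (and contested) answer, not from the thread: lm.ridge() returns no standard errors, but for a fixed lambda one can form the classical ridge covariance sigma^2 * W X'X W, with W = (X'X + lambda*I)^-1, on centred and scaled data, and derive approximate z-type p-values from it. All names below are illustrative, and the resulting p-values are rough guides at best:

require(MASS)
data(longley)
X <- scale(as.matrix(longley[, -7]))   # predictors, centred and scaled
y <- longley$Employed - mean(longley$Employed)
lambda <- 0.05
W <- solve(crossprod(X) + diag(lambda, ncol(X)))
beta <- W %*% crossprod(X, y)          # ridge coefficients
H <- X %*% W %*% t(X)                  # ridge "hat" matrix
sigma2 <- sum((y - X %*% beta)^2) / (nrow(X) - sum(diag(H)))
se <- sqrt(diag(sigma2 * W %*% crossprod(X) %*% W))
2 * pnorm(abs(beta / se), lower.tail = FALSE)   # approximate p-values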
2009 Jun 04
0
help needed with ridge regression and choice of lambda with lm.ridge!!!
Hi, I'm a beginner in the field. I have to perform ridge regression with lm.ridge for many datasets, and I want to do it in an automatic way. How can I choose lambda automatically? Right now I'm using the lm.ridge function from MASS, which I find quite simple and fast, and I've seen that among the returned values there are the HKB estimate of the ridge constant and the L-W
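A minimal sketch of an automated choice, assuming a grid of lambdas per dataset: lm.ridge() already returns the HKB and Lawless-Wang estimates (kHKB, kLW), and the GCV trace can be minimised over the grid; which criterion to trust is data-dependent:

require(MASS)
fit <- lm.ridge(Employed ~ ., longley, lambda = seq(0, 0.1, 0.001))
fit$kHKB                           # HKB estimate of the ridge constant
fit$kLW                            # Lawless-Wang estimate
fit$lambda[which.min(fit$GCV)]     # lambda minimising GCV on the grid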
2017 May 04
0
lm() gives different results to lm.ridge() and SPSS
Hi Nick, I think the problem here is your use of $coef to extract the coefficients of the ridge regression. The help for lm.ridge states that coef is a "matrix of coefficients, one row for each value of lambda. Note that these are not on the original scale and are for use by the coef method." I ran a small test with simulated data (code copied below), and indeed the output from
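A minimal illustration of that distinction (not the poster's test code, which is truncated above): fit$coef holds coefficients for lm.ridge's internally centred and scaled predictors, while coef(fit) rescales them back to the original units and adds the intercept:

require(MASS)
fit <- lm.ridge(Employed ~ ., longley, lambda = 0)
fit$coef                  # scaled coefficients, no intercept
coef(fit)                 # original-scale coefficients, with intercept
fit$coef / fit$scales     # manual back-scaling recovers the slopes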
2017 May 05
0
lm() gives different results to lm.ridge() and SPSS
I asked you before, but in case you missed it: are you looking at the right place in the SPSS output? The UNstandardized coefficients should be comparable to R, i.e. the "B" column, not "Beta". -pd

> On 5 May 2017, at 01:58, Nick Brown <nick.brown at free.fr> wrote:
>
> Hi Simon,
>
> Yes, if I use coefficients() I get the same results for lm() and
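A minimal sketch of the comparison being suggested, with made-up data: lm() reports unstandardized coefficients (SPSS's "B" column), while SPSS's "Beta" column corresponds to refitting with every variable standardized:

set.seed(1)
d <- data.frame(y = rnorm(50), x = rnorm(50))
coef(lm(y ~ x, data = d))                   # compare to the SPSS "B" column
coef(lm(scale(y) ~ scale(x), data = d))     # compare to the SPSS "Beta" column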
2005 Aug 24
1
lm.ridge
Hello, I posted this mail a few days ago but I did it wrong; I hope it is right now. I have some doubts about lm.ridge from the MASS package, shown here with the Longley example. First: I think the coefficients from lm(Employed~., data=longley) should equal the coefficients from lm.ridge(Employed~., data=longley, lambda=0). Why does this not happen?
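A minimal sketch of the comparison in question: at lambda = 0 the two fits do agree once coef() is used, since coef() undoes lm.ridge's internal centring and scaling; inspecting $coef instead is what makes them look different:

require(MASS)
coef(lm(Employed ~ ., data = longley))
coef(lm.ridge(Employed ~ ., data = longley, lambda = 0))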
2017 May 05
0
lm() gives different results to lm.ridge() and SPSS
I had no problems running regression models in SPSS and R that yielded the same results for these data. The difference you are observing comes from fitting different models. In R, you fitted:

res <- lm(DEPRESSION ~ ZMEAN_PA * ZDIVERSITY_PA, data=dat)
summary(res)

The interaction term is the product of ZMEAN_PA and ZDIVERSITY_PA. This is not a standardized variable itself and not the same as
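A minimal demonstration of that point with simulated data: the product of two standardized variables is generally not itself standardized, so the interaction column need not have standard deviation 1:

set.seed(1)
z1 <- as.numeric(scale(rnorm(100)))
z2 <- as.numeric(scale(rnorm(100)))
sd(z1 * z2)           # usually not 1
sd(scale(z1 * z2))    # exactly 1 by construction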
2017 May 05
0
lm() gives different results to lm.ridge() and SPSS
Dear Nick,

On 2017-05-05, 9:40 AM, "R-devel on behalf of Nick Brown" <r-devel-bounces at r-project.org on behalf of nick.brown at free.fr> wrote:

>> I conjecture that something in the vicinity of
>>
>> res <- lm(DEPRESSION ~ scale(ZMEAN_PA) + scale(ZDIVERSITY_PA) +
>>     scale(ZMEAN_PA * ZDIVERSITY_PA), data=dat)
>> summary(res)
>>
>> would reproduce the
2017 May 04
2
lm() gives different results to lm.ridge() and SPSS
Hi Simon, Yes, if I use coefficients() I get the same results for lm() and lm.ridge(). So that's consistent, at least. Interestingly, the "wrong" number I get from lm.ridge()$coef agrees with the value from SPSS to 5 dp, which is an interesting coincidence if these numbers have no particular external meaning in lm.ridge(). Kind regards, Nick
2011 Aug 06
0
ridge regression - covariance matrices of ridge coefficients
For an application of ridge regression, I need the covariance matrices of the estimated regression coefficients, in addition to the coefficients themselves, for all values of the ridge constant, lambda. I've studied the code in MASS:::lm.ridge, but don't see how to do this because the code is vectorized using one svd calculation. The relevant lines from lm.ridge, using X, Y, are:
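A minimal sketch, not from the thread, that reuses the same one-svd trick: with X = U D V' on the scaled data, Var(beta_ridge) = sigma^2 * V diag(d^2/(d^2+lambda)^2) V', so a single svd yields one covariance matrix per lambda (sigma^2 still has to be estimated separately, e.g. from the residuals):

X <- scale(as.matrix(longley[, -7]))   # illustrative data; scaling only approximates lm.ridge's internal RMS scaling
y <- longley$Employed - mean(longley$Employed)
s <- svd(X)
cov_for_lambda <- function(lambda, sigma2 = 1) {
  shrink <- s$d^2 / (s$d^2 + lambda)^2
  sigma2 * s$v %*% (shrink * t(s$v))   # V diag(shrink) V'
}
covs <- lapply(seq(0, 0.1, 0.01), cov_for_lambda)   # one matrix per lambda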
2017 May 05
1
lm() gives different results to lm.ridge() and SPSS
Thanks. I was about to try this, but got sidetracked by actual work... Your analysis reproduces the SPSS unscaled estimates. It still remains to figure out how Nick got

> coefficients(lm(ZDEPRESSION ~ ZMEAN_PA * ZDIVERSITY_PA, data=s1))
           (Intercept)               ZMEAN_PA          ZDIVERSITY_PA ZMEAN_PA:ZDIVERSITY_PA
            0.07342198            -0.39650356
2010 Apr 26
0
lm.ridge {MASS} intercept questions
I am trying to understand the code for lm.ridge from the MASS package. Here is the part I am having trouble understanding:

if(Inter <- attr(Terms, "intercept")) {
    Xm <- colMeans(X[, -Inter])
    Ym <- mean(Y)
    p <- p - 1
    X <- X[, -Inter] - rep(Xm, rep(n, p))
    Y <- Y - Ym
} else Ym <- Xm <- NA
Xscale <- drop(rep(1/n, n) %*% X^2)^0.5
X <- X/rep(Xscale, rep.int(n,
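A minimal re-enactment of what that fragment does, on concrete data and with tidier idioms (my reading of the code): if the model has an intercept, the intercept column is dropped and X and Y are centred on their means (Xm, Ym, saved so the intercept can be rebuilt later); each centred column is then divided by its root mean square, sqrt(mean(x^2)), note the division by n rather than n-1:

X <- as.matrix(longley[, -7]); Y <- longley$Employed
Xm <- colMeans(X); Ym <- mean(Y)   # kept to reconstruct the intercept
Xc <- sweep(X, 2, Xm)              # centre the columns
Yc <- Y - Ym
Xscale <- sqrt(colMeans(Xc^2))     # root-mean-square scale of each column
Xs <- sweep(Xc, 2, Xscale, "/")    # this is the matrix the svd sees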
2017 May 05
1
lm() gives different results to lm.ridge() and SPSS
Hi John, Thanks for the comment... but that appears to mean that SPSS has a big problem. I have always been told that the only way to include an interaction term in a regression is to do the multiplication by hand. But then it seems to be impossible to stop SPSS from re-standardizing the variable that corresponds to the interaction term. Am I missing something? Is there a way to perform the
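A minimal sketch, on simulated data (s1 from the thread is not public), of my reading of the SPSS behaviour: if SPSS standardizes every entered variable, including a hand-made product, then the R analogue wraps the product in scale():

set.seed(42)
s1 <- data.frame(ZMEAN_PA = rnorm(100), ZDIVERSITY_PA = rnorm(100))
s1$ZDEPRESSION <- with(s1, -0.4 * ZMEAN_PA - 0.4 * ZDIVERSITY_PA + rnorm(100))
coef(lm(ZDEPRESSION ~ ZMEAN_PA * ZDIVERSITY_PA, data = s1))   # R's default
coef(lm(ZDEPRESSION ~ ZMEAN_PA + ZDIVERSITY_PA +
        scale(ZMEAN_PA * ZDIVERSITY_PA), data = s1))          # SPSS-like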
2017 May 04
4
lm() gives different results to lm.ridge() and SPSS
Hello, I hope I am posting to the right place. I was advised to try this list by Ben Bolker (https://twitter.com/bolkerb/status/859909918446497795). I also posted this question to StackOverflow (http://stackoverflow.com/questions/43771269/lm-gives-different-results-from-lm-ridgelambda-0). I am a relative newcomer to R, but I wrote my first program in 1975 and have been paid to program in about
2009 Aug 19
1
ridge regression
Dear all, I considered an ordinary ridge regression problem and followed three different approaches:
1. estimate beta without any standardization
2. estimate standardized beta (standardizing X and y) and then convert back
3. estimate beta using the lm.ridge() function

X <- matrix(c(1,2,9,3,2,4,7,2,3,5,9,1), 4, 3)
y <- t(as.matrix(cbind(2,3,4,5)))
n <- nrow(X)
p <- ncol(X)
#Without
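A minimal completion of the comparison, under my own assumptions (the original code is truncated above, and lambda = 1 is arbitrary): approach 1 on the raw scale and lm.ridge() generally disagree for lambda > 0, because lm.ridge() applies the penalty after its own standardization:

require(MASS)
X <- matrix(c(1,2,9,3,2,4,7,2,3,5,9,1), 4, 3)
y <- c(2, 3, 4, 5)
lambda <- 1
Xc <- scale(X, scale = FALSE); yc <- y - mean(y)                 # centre only
solve(crossprod(Xc) + diag(lambda, ncol(X)), crossprod(Xc, yc))  # approach 1
coef(lm.ridge(y ~ X, lambda = lambda))                           # approach 3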
2017 May 05
6
lm() gives different results to lm.ridge() and SPSS
Hi, here is (I hope) all the relevant output from R.

> mean(s1$ZDEPRESSION, na.rm=T)
[1] -1.041546e-16
> mean(s1$ZDIVERSITY_PA, na.rm=T)
[1] -9.660583e-16
> mean(s1$ZMEAN_PA, na.rm=T)
[1] -5.430282e-15
> lm.ridge(ZDEPRESSION ~ ZMEAN_PA * ZDIVERSITY_PA, data=s1)$coef
              ZMEAN_PA          ZDIVERSITY_PA ZMEAN_PA:ZDIVERSITY_PA
            -0.3962254             -0.3636026
2013 Apr 27
1
Selecting ridge regression coefficients for minimum GCV
Hi all, I have run a ridge regression as follows:

reg <- lm.ridge(final$l ~ final$lag1 + final$lag2 + final$g + final$u, lambda=seq(0,10,0.01))

Then I enter select(reg) and it returns:

modified HKB estimator is 19.3409
modified L-W estimator is 36.18617
smallest value of GCV at 10

I think it means that it is advisable to
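A minimal sketch of the usual follow-up (variable names are the poster's; 'final' is their data frame, not shown here): a GCV minimum sitting at the grid boundary (10 here) suggests widening the lambda range and selecting again:

reg <- lm.ridge(l ~ lag1 + lag2 + g + u, data = final,
                lambda = seq(0, 100, 0.1))   # wider grid
select(reg)                                  # re-check HKB, L-W, GCV
reg$lambda[which.min(reg$GCV)]               # GCV-minimising lambda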
2009 Aug 19
1
Ridge regression [Repost]
Dear all, For an ordinary ridge regression problem, I followed three different approaches:
1. estimate beta without any standardization
2. estimate standardized beta (standardizing X and y) and then again convert back
3. estimate beta using lm.ridge() function

X <- matrix(c(1,2,9,3,2,4,7,2,3,5,9,1), 4, 3)
y <- as.matrix(c(2,3,4,5))
n <- nrow(X)
p <- ncol(X)
#Without standardization
2009 Aug 01
2
Cox ridge regression
Hello, I have questions regarding penalized Cox regression using the survival package (functions coxph() and ridge()). I am using R 2.8.0 on Ubuntu Linux and survival package version 2.35-4. Question 1. Consider the following example from help(ridge):

> fit1 <- coxph(Surv(futime, fustat) ~ rx + ridge(age, ecog.ps, theta=1), ovarian)

As I understand it, this builds a model in which `rx' is
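A minimal runnable version of that example (the ovarian data ship with survival): ridge() penalizes age and ecog.ps with the fixed tuning value theta, while rx stays unpenalized, which can be seen by comparing against a fully unpenalized fit:

library(survival)
fit1 <- coxph(Surv(futime, fustat) ~ rx + ridge(age, ecog.ps, theta = 1),
              data = ovarian)
fit0 <- coxph(Surv(futime, fustat) ~ rx + age + ecog.ps, data = ovarian)
cbind(penalized = coef(fit1), unpenalized = coef(fit0))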
2008 Feb 14
0
GCV in lm.ridge (MASS) (PR#10755)
Full_Name: Andrew Robinson Version: 2.6.2 Patched (2008-02-12 r44439) OS: FreeBSD 6.3-RC1 Submission from: (NULL) (211.28.206.186) I believe that the computation for GCV is incorrect in the lm.ridge function in MASS. From lm.ridge:

GCV <- colSums((Y - X %*% coef)^2) / (n - colSums(matrix(d^2/div, dx)))^2

The denominator does not tally with the formula on p. 141 of Ripley's
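A minimal check of one reading of this report: relative to the textbook form GCV(lambda) = n * RSS / (n - tr(H))^2, the line above differs only by a constant factor, which rescales the GCV trace but cannot move its minimiser over lambda:

require(MASS)
fit <- lm.ridge(Employed ~ ., longley, lambda = seq(0, 0.1, 0.001))
n <- nrow(longley)
fit$lambda[which.min(fit$GCV)]       # argmin with the package's scaling
fit$lambda[which.min(n * fit$GCV)]   # identical argmin under the textbook scaling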
2011 Apr 27
0
treatment of factors and errors in ridge() function with coxph
I am trying to fit a large Cox model with many predictors. Because there are many predictors, I would like to use the ridge() function to get penalized ML estimates for all coefficients. The problems are: 1. When I include a factor (like race) in the ridge() function, dummy variables are not created. The resulting model has a single coefficient for the race variable, and I have
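A minimal sketch of one workaround (not from the thread): build the dummy columns yourself with model.matrix() and pass the resulting numeric matrix to ridge(), so each factor level gets its own penalized coefficient. The lung data here are only illustrative:

library(survival)
lung2 <- na.omit(lung[, c("time", "status", "age", "ph.ecog")])
ecog <- model.matrix(~ factor(ph.ecog), data = lung2)[, -1]   # dummy columns
fit <- coxph(Surv(time, status) ~ ridge(age, ecog, theta = 1), data = lung2)
coef(fit)   # one penalized coefficient per dummy column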