2017 May 05
0
lm() gives different results to lm.ridge() and SPSS
Dear Nick,
On 2017-05-05, 9:40 AM, "R-devel on behalf of Nick Brown"
<r-devel-bounces at r-project.org on behalf of nick.brown at free.fr> wrote:
>> I conjecture that something in the vicinity of
>>   res <- lm(DEPRESSION ~ scale(ZMEAN_PA) + scale(ZDIVERSITY_PA) +
>>             scale(ZMEAN_PA * ZDIVERSITY_PA), data=dat)
>>   summary(res)
>> would reproduce the
2017 May 05
0
lm() gives different results to lm.ridge() and SPSS
I had no problems running regression models in SPSS and R that yielded the same results for these data.
The difference you are observing is from fitting different models. In R, you fitted:
res <- lm(DEPRESSION ~ ZMEAN_PA * ZDIVERSITY_PA, data=dat)
summary(res)
The interaction term is the product of ZMEAN_PA and ZDIVERSITY_PA. This product is not itself a standardized variable, and is not the same as
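A minimal R sketch of that distinction, using simulated stand-ins for the poster's variables (the data frame dat and the coefficient 0.5 are hypothetical):

set.seed(1)
dat <- data.frame(ZMEAN_PA = rnorm(100), ZDIVERSITY_PA = rnorm(100))
dat$DEPRESSION <- 0.5 * dat$ZMEAN_PA + rnorm(100)
# The product of two standardized variables is generally NOT standardized:
prod_raw <- dat$ZMEAN_PA * dat$ZDIVERSITY_PA
sd(prod_raw)                 # generally != 1, even though both factors have SD 1
sd(drop(scale(prod_raw)))    # exactly 1 after rescaling, as SPSS's Beta implies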
2017 May 05
1
lm() gives different results to lm.ridge() and SPSS
Thanks, I was about to try this, but got sidetracked by actual work...
Your analysis reproduces the SPSS unscaled estimates. It still remains to figure out how Nick got
>
coefficients(lm(ZDEPRESSION ~ ZMEAN_PA * ZDIVERSITY_PA, data=s1))
(Intercept) ZMEAN_PA ZDIVERSITY_PA ZMEAN_PA:ZDIVERSITY_PA
0.07342198 -0.39650356
2017 May 05
1
lm() gives different results to lm.ridge() and SPSS
Hi John,
Thanks for the comment... but that appears to mean that SPSS has a big problem. I have always been told that the only way to include an interaction term in a regression is to do the multiplication by hand. But then it seems to be impossible to stop SPSS from re-standardizing the variable that corresponds to the interaction term. Am I missing something? Is there a way to perform the
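For reference, R needs no hand-made product column; a sketch with hypothetical variables y, x1, x2:

set.seed(1)
x1 <- rnorm(50); x2 <- rnorm(50)
y  <- x1 + x2 + 0.5 * x1 * x2 + rnorm(50)
m1 <- lm(y ~ x1 * x2)               # x1 * x2 expands to x1 + x2 + x1:x2
m2 <- lm(y ~ x1 + x2 + I(x1 * x2))  # the same product made "by hand"
all.equal(unname(coef(m1)), unname(coef(m2)))   # TRUE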
2017 May 05
0
lm() gives different results to lm.ridge() and SPSS
I asked you before, but in case you missed it: Are you looking at the right place in SPSS output?
The UNstandardized coefficients should be comparable to R, i.e. the "B" column, not "Beta".
-pd
> On 5 May 2017, at 01:58 , Nick Brown <nick.brown at free.fr> wrote:
>
> Hi Simon,
>
> Yes, if I use coefficients() I get the same results for lm() and
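A sketch of the B-versus-Beta point on a built-in data set (mtcars merely stands in for the poster's data; the rescaling Beta_j = B_j * sd(x_j) / sd(y) is the standard one):

fit  <- lm(mpg ~ wt + hp, data = mtcars)
B    <- coef(fit)[-1]                                   # unstandardized "B"
Beta <- B * sapply(mtcars[c("wt", "hp")], sd) / sd(mtcars$mpg)
# Beta matches lm() run on fully standardized variables:
coef(lm(scale(mpg) ~ scale(wt) + scale(hp), data = mtcars))[-1]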
2011 Aug 23
1
obtaining p-values for lm.ridge() coefficients (package 'MASS')
Dear all
I'm familiarising myself with ridge regression in R, and the following
is bugging me: how does one get p-values for the coefficients obtained
from MASS::lm.ridge() output (for a given lambda)? Consider the
example below (adapted from PRA [1]):
> require(MASS)
> data(longley)
> gr <- lm.ridge(Employed ~ .,longley,lambda = seq(0,0.1,0.001))
> plot(gr)
> select(gr)
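MASS itself reports no standard errors or p-values for lm.ridge(). One textbook-style approximation, as a sketch (not part of MASS, applied to centred and scaled predictors; note that the residual degrees of freedom for ridge are themselves debatable):

data(longley)
lam <- 0.05
X <- scale(as.matrix(longley[, -7]))        # predictors, centred and scaled
y <- longley$Employed - mean(longley$Employed)
W <- solve(crossprod(X) + lam * diag(ncol(X)))
b <- W %*% crossprod(X, y)                  # ridge coefficients
s2 <- sum((y - X %*% b)^2) / (nrow(X) - ncol(X))
se <- sqrt(diag(s2 * W %*% crossprod(X) %*% W))  # Var(b) = s^2 W X'X W
cbind(est = drop(b), se = se, z = drop(b) / se)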
2017 May 04
0
lm() gives different results to lm.ridge() and SPSS
Hi Nick,
I think that the problem here is your use of $coef to extract the coefficients of the ridge regression. The help for lm.ridge states that coef is a "matrix of coefficients, one row for each value of lambda. Note that these are not on the original scale and are for use by the coef method."
I ran a small test with simulated data (code copied below), and indeed the output from
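A sketch along those lines (simulated data; x2 is given a deliberately different scale so the two extractions visibly disagree):

require(MASS)
set.seed(42)
x1 <- rnorm(100); x2 <- rnorm(100, sd = 5)
d  <- data.frame(x1, x2, y = 1 + 2 * x1 + 0.3 * x2 + rnorm(100))
fit <- lm.ridge(y ~ x1 + x2, data = d, lambda = 0)
fit$coef    # internal, scaled coefficients -- not for reporting
coef(fit)   # original scale, matching coef(lm(y ~ x1 + x2, data = d))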
2017 May 04
2
lm() gives different results to lm.ridge() and SPSS
Hi Simon,
Yes, if I use coefficients() I get the same results for lm() and lm.ridge(). So that's consistent, at least.
Interestingly, the "wrong" number I get from lm.ridge()$coef agrees with the value from SPSS to 5dp, which is an interesting coincidence if these numbers have no particular external meaning in lm.ridge().
Kind regards,
Nick
----- Original Message -----
2006 Aug 21
4
question about 'coef' method and fitted_value calculation
Dear all,
I am trying to calculate the fitted values using a ridge model
(lm.ridge(), MASS library). Since the predict() does not work for lm.ridge
object, I want to get the fitted_value from the coefficients information.
The following are the codes I use:
fit <- lm.ridge(myY ~ myX, lambda = lamb, scales = F, coef = T)
coeff <- fit$coef
However, it seems that "coeff" (or "fit$coef") is
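Since predict() has no method for "ridgelm" objects, one workaround (a sketch; myX and myY are simulated here) is to build the fitted values from the original-scale coefficients returned by coef(), not from $coef:

library(MASS)
set.seed(1)
myX <- matrix(rnorm(200), ncol = 2)
myY <- drop(myX %*% c(1, -2)) + rnorm(100)
fit  <- lm.ridge(myY ~ myX, lambda = 0.1)
cf   <- coef(fit)                   # c(intercept, slopes), original scale
yhat <- cf[1] + myX %*% cf[-1]      # fitted values by hand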
2011 Aug 06
0
ridge regression - covariance matrices of ridge coefficients
For an application of ridge regression, I need to get the covariance matrices of the estimated regression coefficients, in addition to the coefficients themselves, for all values of the ridge constant, lambda.
I've studied the code in MASS:::lm.ridge, but don't see how to do this, because the code is vectorized using one svd calculation. The relevant lines from lm.ridge, using X, Y are:
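One workaround, sketched independently of the MASS internals: loop over lambda and apply the standard ridge covariance formula directly (X assumed centred and scaled as in lm.ridge):

# Var(b_lambda) = s^2 * W %*% X'X %*% W, with W = (X'X + lambda * I)^{-1}
ridge_vcov <- function(X, y, lambdas) {
  XtX <- crossprod(X)
  lapply(lambdas, function(lam) {
    W <- solve(XtX + lam * diag(ncol(X)))
    b <- W %*% crossprod(X, y)
    s2 <- sum((y - X %*% b)^2) / (nrow(X) - ncol(X))
    s2 * W %*% XtX %*% W
  })
}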
2010 Apr 26
0
lm.ridge {MASS} intercept questions
I am trying to understand the code for lm.ridge from the MASS package.
Here is the part I am having trouble understanding:
if(Inter <- attr(Terms, "intercept")) {
    Xm <- colMeans(X[, -Inter])            # column means of the predictors
    Ym <- mean(Y)                          # mean of the response
    p <- p - 1                             # drop the intercept column
    X <- X[, -Inter] - rep(Xm, rep(n, p))  # centre each predictor column
    Y <- Y - Ym                            # centre the response
} else Ym <- Xm <- NA
Xscale <- drop(rep(1/n, n) %*% X^2)^0.5    # root-mean-square of each column
X <- X/rep(Xscale, rep.int(n, p))          # scale columns to unit RMS
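What the excerpt amounts to, as a standalone sketch (longley stands in for the user's data): centre X and Y, then scale each predictor by its root mean square, an SD computed with divisor n rather than n - 1:

data(longley)
X <- as.matrix(longley[, -7]); Y <- longley$Employed; n <- nrow(X)
Xc <- sweep(X, 2, colMeans(X))        # centre each predictor column
Yc <- Y - mean(Y)                     # centre the response
Xscale <- sqrt(colSums(Xc^2) / n)     # root-mean-square scale, divisor n
Xs <- sweep(Xc, 2, Xscale, "/")       # the scaled design matrix lm.ridge uses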
2009 Aug 21
1
applying summary() to an object created with ols()
Hello R-list,
I am trying to calculate a ridge regression, first using the *lm.ridge()*
function from the MASS package and then applying the resulting
Hoerl-Kennard-Baldwin (HKB) estimator as a penalty scalar to the *ols()*
function provided by Frank Harrell in his Design package.
It looks like this:
> rrk1<-lm.ridge(lnbcpc ~ lntex + lnbeerp + lnwinep + lntemp + pop,
subset(aa,
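A sketch of that two-step idea (longley stands in for the poster's data frame aa; Design has since become the rms package, and whether the HKB constant and ols()'s penalty are on exactly the same scale is worth verifying):

library(MASS); library(rms)   # rms is the successor to Design
data(longley)
rrk1 <- lm.ridge(Employed ~ GNP + Unemployed + Population, data = longley,
                 lambda = seq(0, 1, 0.01))
hkb  <- rrk1$kHKB             # Hoerl-Kennard-Baldwin ridge constant
fit  <- ols(Employed ~ GNP + Unemployed + Population, data = longley,
            penalty = hkb)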
2005 Aug 24
1
lm.ridge
Hello, I posted this mail a few days ago, but I did it wrong; I hope it
is right now:
I have some doubts about lm.ridge, from the MASS package, which I will
show using the Longley example.
First: I think the coefficients from lm(Employed ~ ., data = longley) should
equal the coefficients from lm.ridge(Employed ~ ., data = longley, lambda = 0).
Why does that not happen?
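A quick check on that example suggests the two do agree, provided the coef() method is used for extraction rather than the internal $coef component:

require(MASS); data(longley)
coef(lm(Employed ~ ., data = longley))
coef(lm.ridge(Employed ~ ., data = longley, lambda = 0))
# Both calls print the same values (up to numerical noise); the apparent
# discrepancy arises only when the scaled $coef component is read directly.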
2017 May 05
6
lm() gives different results to lm.ridge() and SPSS
Hi,
Here is (I hope) all the relevant output from R.
> mean(s1$ZDEPRESSION, na.rm=T)
[1] -1.041546e-16
> mean(s1$ZDIVERSITY_PA, na.rm=T)
[1] -9.660583e-16
> mean(s1$ZMEAN_PA, na.rm=T)
[1] -5.430282e-15
> lm.ridge(ZDEPRESSION ~ ZMEAN_PA * ZDIVERSITY_PA, data=s1)$coef
              ZMEAN_PA          ZDIVERSITY_PA ZMEAN_PA:ZDIVERSITY_PA
            -0.3962254             -0.3636026
2009 Dec 02
1
Ridge regression
Dear list,
I have a couple of questions concerning ridge regression. I am using the
lm.ridge(...) function to fit a model to my microarray data, i.e.
*model=lm.ridge(...)*
I retrieve some coefficients and some scales for each gene. First of all, I
would like to ask: the real coefficients of the model are not included in
the first element of the output, but rather in the result of coef(model),
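How the two components relate, as a sketch following the back-transformation in MASS's coef method (longley stands in for the microarray data):

require(MASS); data(longley)
model  <- lm.ridge(Employed ~ ., data = longley, lambda = 0.01)
slopes <- drop(model$coef) / model$scales       # undo the internal scaling
intercept <- model$ym - sum(model$xm * slopes)  # recover the intercept
cbind(coef(model), c(intercept, slopes))        # the two columns match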
2017 May 04
4
lm() gives different results to lm.ridge() and SPSS
Hello,
I hope I am posting to the right place. I was advised to try this list by Ben Bolker (https://twitter.com/bolkerb/status/859909918446497795). I also posted this question to StackOverflow (http://stackoverflow.com/questions/43771269/lm-gives-different-results-from-lm-ridgelambda-0). I am a relative newcomer to R, but I wrote my first program in 1975 and have been paid to program in about
2009 Aug 15
0
coefficient p-value in ridge regression
Hello. I have a problem with ridge regression.
I've used the lm.ridge function to estimate the coefficients of my model. Why do the t value, Pr(>|t|), and significance stars not appear in the model summary?
How can I calculate the coefficients' p-values in ridge regression?
Thanks!
2013 Apr 27
1
Selecting ridge regression coefficients for minimum GCV
Hi all,
I have run a ridge regression as follows:
reg <- lm.ridge(final$l ~ final$lag1 + final$lag2 + final$g + final$u,
                lambda = seq(0, 10, 0.01))
Then I enter:
select(reg)
and it returns:
modified HKB estimator is 19.3409
modified L-W estimator is 36.18617
smallest value of GCV at 10
I think it means that it is advisable to
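To act on that advice programmatically, a sketch (reusing the reg object and the final data frame from the post above):

best <- reg$lambda[which.min(reg$GCV)]   # lambda minimising GCV
# "smallest value of GCV at 10" means the minimum sits at the END of the
# grid, so the grid is probably too short: widen it and refit, e.g.
reg2 <- lm.ridge(final$l ~ final$lag1 + final$lag2 + final$g + final$u,
                 lambda = seq(0, 50, 0.1))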
2013 Apr 26
1
Regression coefficients
Hi all,
I have run a ridge regression as follows:
reg <- lm.ridge(final$l ~ final$lag1 + final$lag2 + final$g + final$u,
                lambda = seq(0, 10, 0.01))
Then I enter:
select(reg)
and it returns:
modified HKB estimator is 19.3409
modified L-W estimator is 36.18617
smallest value of GCV at 10
I think it means that it is
2003 Jun 05
2
ridge regression
Hello R-user
I want to compute a multiple regression, but I would like to include a check for
collinearity of the variables. Therefore I would like to use a ridge
regression.
I tried lm.ridge(), but I don't know yet how to get p-values (individual Pr() values and the
p-value of the whole model) out of this model. Can anybody tell me how to get
output similar to the summary(lm(...)) output? Or if there is
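As a collinearity check that keeps the familiar lm() inference, one sketch computes variance inflation factors by hand (VIF_j = 1 / (1 - R^2_j), where R^2_j comes from regressing predictor j on the others; longley is a classic collinear example):

data(longley)
X <- longley[, -7]                       # predictors only
vifs <- sapply(names(X), function(v) {
  r2 <- summary(lm(reformulate(setdiff(names(X), v), v), data = X))$r.squared
  1 / (1 - r2)
})
round(vifs, 1)    # values well above 10 flag serious collinearity
summary(lm(Employed ~ ., data = longley))  # usual t values and Pr(>|t|)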