similar to: lm() gives different results to lm.ridge() and SPSS

Displaying 20 results from an estimated 4000 matches similar to: "lm() gives different results to lm.ridge() and SPSS"

2017 May 04
2
lm() gives different results to lm.ridge() and SPSS
Hi Simon, Yes, if I use coefficients() I get the same results for lm() and lm.ridge(). So that's consistent, at least. Interestingly, the "wrong" number I get from lm.ridge()$coef agrees with the value from SPSS to 5dp, which is an interesting coincidence if these numbers have no particular external meaning in lm.ridge(). Kind regards, Nick
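[A minimal sketch of the $coef vs. coef() distinction discussed in this thread, using simulated data since the thread's s1 data frame is not public:]

    library(MASS)                        # lm.ridge() lives in MASS
    set.seed(1)
    x1 <- rnorm(100); x2 <- rnorm(100)
    y  <- 0.5 * x1 - 0.3 * x2 + rnorm(100)
    fit <- lm.ridge(y ~ x1 * x2, lambda = 0)
    fit$coef                 # internal, rescaled coefficients -- not on the original scale
    coef(fit)                # the coef() method converts back to the original scale
    coef(lm(y ~ x1 * x2))    # with lambda = 0 this matches coef(fit)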
2017 May 05
6
lm() gives different results to lm.ridge() and SPSS
Hi, Here is (I hope) all the relevant output from R.

    > mean(s1$ZDEPRESSION, na.rm=T)
    [1] -1.041546e-16
    > mean(s1$ZDIVERSITY_PA, na.rm=T)
    [1] -9.660583e-16
    > mean(s1$ZMEAN_PA, na.rm=T)
    [1] -5.430282e-15
    > lm.ridge(ZDEPRESSION ~ ZMEAN_PA * ZDIVERSITY_PA, data=s1)$coef
                  ZMEAN_PA          ZDIVERSITY_PA ZMEAN_PA:ZDIVERSITY_PA
                -0.3962254             -0.3636026
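[Means of the order of 1e-15 are just floating-point zero: this is what scale() leaves behind after standardizing, as a quick check (not from the thread) shows:]

    z <- scale(rnorm(100))   # standardize a variable
    mean(z)                  # ~1e-17, i.e. zero up to rounding error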
2017 May 05
1
lm() gives different results to lm.ridge() and SPSS
Thanks, I was going to try this, but got sidetracked by actual work... Your analysis reproduces the SPSS unscaled estimates. It still remains to figure out how Nick got

    > coefficients(lm(ZDEPRESSION ~ ZMEAN_PA * ZDIVERSITY_PA, data=s1))
               (Intercept)               ZMEAN_PA          ZDIVERSITY_PA ZMEAN_PA:ZDIVERSITY_PA
                0.07342198            -0.39650356
2017 May 05
1
lm() gives different results to lm.ridge() and SPSS
Hi John, Thanks for the comment... but that appears to mean that SPSS has a big problem. I have always been told that to include an interaction term in a regression, the only way is to do the multiplication by hand. But then it seems to be impossible to stop SPSS from re-standardizing the variable that corresponds to the interaction term. Am I missing something? Is there a way to perform the
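[For reference, in R no hand multiplication is needed: the formula language expands the interaction itself. The two calls below, using the variable names from the thread, fit the same model:]

    lm(ZDEPRESSION ~ ZMEAN_PA * ZDIVERSITY_PA, data = s1)   # '*' = main effects + interaction
    lm(ZDEPRESSION ~ ZMEAN_PA + ZDIVERSITY_PA +
                     ZMEAN_PA:ZDIVERSITY_PA, data = s1)     # the same model, spelled out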
2017 May 04
0
lm() gives different results to lm.ridge() and SPSS
Hi Nick, I think that the problem here is your use of $coef to extract the coefficients of the ridge regression. The help for lm.ridge states that coef is a "matrix of coefficients, one row for each value of lambda. Note that these are not on the original scale and are for use by the coef method." I ran a small test with simulated data (code copied below), and indeed the output from
2017 May 05
0
lm() gives different results to lm.ridge() and SPSS
I asked you before, but in case you missed it: Are you looking at the right place in the SPSS output? The UNstandardized coefficients should be comparable to R, i.e. the "B" column, not "Beta". -pd

> On 5 May 2017, at 01:58, Nick Brown <nick.brown at free.fr> wrote:
>
> Hi Simon,
>
> Yes, if I use coefficients() I get the same results for lm() and
2017 May 05
0
lm() gives different results to lm.ridge() and SPSS
I had no problems running regression models in SPSS and R that yielded the same results for these data. The difference you are observing is from fitting different models. In R, you fitted: res <- lm(DEPRESSION ~ ZMEAN_PA * ZDIVERSITY_PA, data=dat) summary(res) The interaction term is the product of ZMEAN_PA and ZDIVERSITY_PA. This is not a standardized variable itself and not the same as
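[The point is easy to verify: the product of two standardized variables is not itself standardized. A minimal sketch with simulated data, not the thread's data:]

    set.seed(1)
    z1 <- as.vector(scale(rnorm(100)))
    z2 <- as.vector(scale(rnorm(100)))
    sd(z1 * z2)     # generally not 1
    mean(z1 * z2)   # generally not 0 either (it estimates the correlation of z1 and z2)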
2017 May 05
0
lm() gives different results to lm.ridge() and SPSS
Dear Nick,

On 2017-05-05, 9:40 AM, "R-devel on behalf of Nick Brown" <r-devel-bounces at r-project.org on behalf of nick.brown at free.fr> wrote:

>> I conjecture that something in the vicinity of
>>
>> res <- lm(DEPRESSION ~ scale(ZMEAN_PA) + scale(ZDIVERSITY_PA) +
>>           scale(ZMEAN_PA * ZDIVERSITY_PA), data=dat)
>> summary(res)
>>
>> would reproduce the
2003 Jun 05
2
ridge regression
Hello R-users, I want to compute a multiple regression, but I would like to include a check for collinearity of the variables. Therefore I would like to use ridge regression. I tried lm.ridge(), but I don't know yet how to get p-values (individual Pr() values and a p-value for the whole model) out of this model. Can anybody tell me how to get output similar to the summary(lm(...)) output? Or if there is
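[There is no summary() method with p-values for ridgelm objects; what MASS does offer is guidance on choosing lambda. A hedged sketch, assuming a data frame d with response y and the predictors:]

    library(MASS)
    fit <- lm.ridge(y ~ ., data = d, lambda = seq(0, 10, by = 0.1))
    select(fit)   # HKB, L-W and GCV estimates of a sensible lambda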
2012 Jul 11
1
Help needed to tackle multicollinearity problem in count data with the help of R
Dear everyone, I'm a Masters student in Statistics (Actuarial) at the Central University of Rajasthan, India, and I am doing a major project as part of the degree. My project deals with fitting a glm model to car insurance data. I'm facing a multicollinearity problem with these data, which is visible when plotting them, but I'm not able to test for it. In the case
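[One commonly suggested check that also works for GLMs is vif() from the car package. A sketch with hypothetical variable and data names, not the poster's actual data:]

    library(car)
    m <- glm(claims ~ driver_age + vehicle_age + region,
             family = poisson, data = insurance)   # hypothetical model and data
    vif(m)   # values well above 10 are a conventional warning sign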
2004 Feb 01
5
Stepwise regression and PLS
Dear all, I am a newcomer to R. I intend to use R to do stepwise regression and PLS with a data set (a 55x20 matrix, with one dependent and 19 independent variables). Based on the same data set, I have done the same work using SPSS and SAS. However, there is a large difference between the results obtained from R and from SPSS or SAS. In the case of stepwise, SPSS gave a model with 4 independent
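[Part of the difference is expected: R's step() selects by AIC, whereas SPSS's stepwise procedure uses F-test entry/removal p-values, so the two can legitimately pick different models. A sketch mirroring the 55x20 setup with simulated data:]

    set.seed(1)
    d <- data.frame(matrix(rnorm(55 * 20), nrow = 55))
    names(d) <- c("y", paste0("x", 1:19))
    null_fit   <- lm(y ~ 1, data = d)
    full_scope <- as.formula(paste("y ~", paste(paste0("x", 1:19), collapse = " + ")))
    step(null_fit, scope = full_scope, direction = "both")   # AIC-based selection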
2003 Sep 14
3
Re: Logistic Regression
Christoph Lehman had problems with separated data in two-class logistic regression. One useful little trick is to penalize the logistic regression using a quadratic penalty on the coefficients. I am sure there are functions in the R contributed libraries to do this; otherwise it is easy to achieve via IRLS using ridge regressions. Then even though the data are separated, the penalized
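[One concrete option along these lines is a ridge (quadratic) penalized logistic fit, for example via glmnet with alpha = 0. A sketch, not the poster's own code:]

    library(glmnet)
    set.seed(1)
    X <- matrix(rnorm(200), ncol = 2)
    y <- as.numeric(X[, 1] > 0)    # classes perfectly separated on X[,1]
    fit <- glmnet(X, y, family = "binomial", alpha = 0, lambda = 0.1)
    coef(fit)                      # finite coefficients despite the separation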
2009 Feb 26
4
Singularity in a regression?
R friends, In a matrix of 1s and 0s, I'm getting a singularity error. Any helpful ideas?

    lm(formula = activity ~ metaF + metaCl + metaBr + metaI + metaMe +
        paraF + paraCl + paraBr + paraI + paraMe)

    Residuals:
           Min         1Q     Median         3Q        Max
    -4.573e-01 -7.884e-02  3.469e-17  6.616e-02  2.427e-01

    Coefficients: (1 not defined because of singularities)
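[With dummy predictors like these, base R's alias() reports which term is an exact linear combination of the others. A self-contained illustration of the same symptom:]

    set.seed(1)
    d <- data.frame(x1 = rbinom(50, 1, 0.5),
                    x2 = rbinom(50, 1, 0.5),
                    x3 = rbinom(50, 1, 0.5))
    d$x4 <- d$x1 + d$x2 - d$x3       # exact linear dependence among the dummies
    d$y  <- rnorm(50)
    fit  <- lm(y ~ x1 + x2 + x3 + x4, data = d)
    alias(fit)   # names the aliased term behind "not defined because of singularities"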
2008 Mar 24
1
Plotting matrix rows with lattice graphics?
Sorry if this is an FAQ, but I haven't found the answer (yet)... I'm trying to plot each row of a simple numeric matrix in a separate panel using the layout features of lattice, but can't make it work - help would be appreciated! Example:

    > m <- matrix(seq(1:20), nrow=4)
    > m
         [,1] [,2] [,3] [,4] [,5]
    [1,]    1    5    9   13   17
    [2,]    2    6   10   14   18
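[One way that should work (a sketch, not from the thread): reshape the matrix to long form and condition on the row index:]

    library(lattice)
    m <- matrix(1:20, nrow = 4)
    d <- data.frame(value = as.vector(t(m)),
                    col   = rep(seq_len(ncol(m)), times = nrow(m)),
                    row   = rep(seq_len(nrow(m)), each  = ncol(m)))
    xyplot(value ~ col | factor(row), data = d, type = "b", layout = c(1, 4))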
2003 Jul 23
6
Condition indexes and variance inflation factors
Has anyone programmed condition indexes in R? I know that there is a function for variance inflation factors available in the car package; however, Belsley (1991) Conditioning Diagnostics (Wiley) notes that there are several weaknesses of VIFs: e.g. 1) High VIFs are sufficient but not necessary conditions for collinearity 2) VIFs don't diagnose the number of collinearities and 3) No one has
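[Condition indexes with variance decomposition proportions (the Belsley diagnostics) are implemented in colldiag() from the perturb package. A hedged sketch with simulated, deliberately collinear data:]

    library(perturb)
    set.seed(1)
    d <- data.frame(x1 = rnorm(100))
    d$x2 <- d$x1 + rnorm(100, sd = 0.1)   # strongly collinear pair
    d$x3 <- rnorm(100)
    d$y  <- d$x1 + d$x3 + rnorm(100)
    fit <- lm(y ~ x1 + x2 + x3, data = d)
    cd  <- colldiag(fit)
    print(cd, fuzz = 0.3)   # hide variance proportions below 0.3 for readability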
2012 Apr 03
1
how to use condition indexes to test multi-collinearity
Dear Users, I am trying to calculate condition indexes and variance decomposition proportions in order to test for collinearity, using colldiag() in the perturb package. I got a large index and two variables with large variance decomposition proportions, but one of them is the constant (intercept) term. I also checked the VIF for that variable, and the value is small. The result is as follows:

    Index  intercept  V1
2006 Jul 05
2
Colinearity Function in R
Is there a collinearity function implemented in R? I have tried help.search("colinearity") and help.search("collinearity") and have searched for "colinearity" and "collinearity" on http://www.rpad.org/Rpad/Rpad-refcard.pdf, but with no success. Many thanks in advance, Peter Lauren.
2009 Jul 21
2
Collinearity in Linear Multiple Regression
Dear all, How can I test for collinearity in the predictor data set for a multiple linear regression? Thanks, Alex
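[Two quick base-R checks, sketched with a hypothetical data frame whose first column is the response:]

    set.seed(1)
    d <- data.frame(y = rnorm(50), x1 = rnorm(50), x2 = rnorm(50))
    d$x3 <- d$x1 + rnorm(50, sd = 0.01)   # nearly collinear with x1
    round(cor(d[, -1]), 2)                # pairwise correlations among the predictors
    kappa(lm(y ~ ., data = d))            # condition number; large values flag collinearity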
2005 Apr 11
2
dealing with multicollinearity
I have a linear model y~x1+x2 of some data where the coefficient for x1 is higher than I would have expected from theory (0.88 fitted vs. 0.7 expected). I wondered whether this could be an artifact of x1 and x2 being correlated, even though the variance inflation factor is not too high (1.065). I used perturbation analysis to evaluate collinearity:

    library(perturb)
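[A hedged continuation of that approach, with argument names as documented in the perturb package and simulated data standing in for the poster's y ~ x1 + x2:]

    library(perturb)
    set.seed(1)
    d <- data.frame(x1 = rnorm(100), x2 = rnorm(100))
    d$y <- 0.7 * d$x1 + 0.3 * d$x2 + rnorm(100)
    mod <- lm(y ~ x1 + x2, data = d)
    p <- perturb(mod, pvars = c("x1", "x2"), prange = c(1, 1))  # re-fit with random noise added
    summary(p)   # stable coefficients across perturbations suggest collinearity is benign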
2003 Sep 16
2
gam and concurvity
Hello, in the paper "Avoiding the effects of concurvity in GAM's .." by Figueiras et al. (2003) it is mentioned that in GLMs collinearity is taken into account in the calculation of the standard errors, but not in GAMs (resulting in confidence intervals that are too narrow and understated p-values; this refers to the S-Plus version of GAM). I haven't found any references to GAMs and concurvity or collinearity on the R page. And I
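[For what it's worth, the mgcv package has since added a concurvity() measure for fitted GAMs. A minimal sketch with simulated, deliberately concurve smooths:]

    library(mgcv)
    set.seed(1)
    d <- data.frame(x1 = runif(200))
    d$x2 <- d$x1 + rnorm(200, sd = 0.05)   # x2 is nearly a function of x1
    d$y  <- sin(3 * d$x1) + rnorm(200)
    b <- gam(y ~ s(x1) + s(x2), data = d)
    concurvity(b, full = TRUE)             # values near 1 flag severe concurvity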