similar to: coefficient p-value in ridge regression

Displaying 19 results from an estimated 20000 matches similar to: "coefficient p-value in ridge regression"

2009 Aug 01
2
Cox ridge regression
Hello, I have questions regarding penalized Cox regression using the survival package (functions coxph() and ridge()). I am using R 2.8.0 on Ubuntu Linux and survival package version 2.35-4. Question 1. Consider the following example from help(ridge): > fit1 <- coxph(Surv(futime, fustat) ~ rx + ridge(age, ecog.ps, theta=1), ovarian) As I understand it, this builds a model in which `rx' is ...
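A minimal sketch of the quoted example, taken directly from help(ridge) in the survival package:

library(survival)
## rx enters unpenalised; age and ecog.ps share one ridge penalty (theta = 1)
fit1 <- coxph(Surv(futime, fustat) ~ rx + ridge(age, ecog.ps, theta = 1),
              data = ovarian)
summary(fit1)   # rx gets an ordinary Wald test; the ridge terms are shrunk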
2013 Apr 27
1
Selecting ridge regression coefficients for minimum GCV
Hi all, I have run a ridge regression as follows: reg=lm.ridge(final$l~final$lag1+final$lag2+final$g+final$u, lambda=seq(0,10,0.01)) Then I enter select(reg) and it returns: modified HKB estimator is 19.3409, modified L-W estimator is 36.18617, smallest value of GCV at 10. I think it means that it is advisable to ...
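A hedged sketch of pulling the GCV-minimising lambda out of the ridgelm object; 'final' and its columns are the poster's data, not shown here:

library(MASS)
reg <- lm.ridge(l ~ lag1 + lag2 + g + u, data = final,
                lambda = seq(0, 10, 0.01))
lam <- reg$lambda[which.min(reg$GCV)]   # lambda with the smallest GCV
## a minimum at the grid boundary (10 here) usually means the grid is too
## short; widen it and look again
reg2 <- lm.ridge(l ~ lag1 + lag2 + g + u, data = final,
                 lambda = seq(0, 100, 0.1))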
2011 Aug 23
1
obtaining p-values for lm.ridge() coefficients (package 'MASS')
Dear all, I'm familiarising myself with ridge regression in R and the following is bugging me: how does one get p-values for the coefficients obtained from MASS::lm.ridge() output (for a given lambda)? Consider the example below (adapted from PRA [1]): > require(MASS) > data(longley) > gr <- lm.ridge(Employed ~ ., longley, lambda = seq(0, 0.1, 0.001)) > plot(gr) > select(gr)
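There is no built-in answer; one hedged sketch computes approximate standard errors at a fixed lambda from Var(b) = s^2 A^{-1} X'X A^{-1} with A = X'X + lambda*I, working on centred, scaled data. The usual caveat applies: inference after ridge shrinkage is only approximate.

library(MASS)
data(longley)
y  <- longley$Employed - mean(longley$Employed)
Xs <- scale(as.matrix(longley[, names(longley) != "Employed"]))
lambda <- 0.05
A  <- crossprod(Xs) + lambda * diag(ncol(Xs))
b  <- solve(A, crossprod(Xs, y))
df <- sum(diag(Xs %*% solve(A, t(Xs))))          # effective degrees of freedom
s2 <- sum((y - Xs %*% b)^2) / (nrow(Xs) - df)    # one common variance estimate
V  <- s2 * solve(A, crossprod(Xs)) %*% solve(A)  # sandwich covariance
z  <- b / sqrt(diag(V))
p  <- 2 * pnorm(-abs(z))                         # approximate two-sided p-values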
2011 Aug 06
0
ridge regression - covariance matrices of ridge coefficients
For an application of ridge regression, I need the covariance matrices of the estimated regression coefficients, in addition to the coefficients themselves, for all values of the ridge constant lambda. I've studied the code in MASS:::lm.ridge, but don't see how to do this because the code is vectorized using one svd calculation. The relevant lines from lm.ridge, using X, Y, are: ...
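One way around the vectorised svd: a hedged sketch that simply loops over lambda and applies the sandwich formula Var(b_k) = s^2 A_k^{-1} X'X A_k^{-1}, A_k = X'X + k*I, on centred data (Xs, yc, and the grid lambdas stand in for the poster's inputs):

XtX  <- crossprod(Xs)
covs <- lapply(lambdas, function(k) {
  Ainv <- solve(XtX + k * diag(ncol(Xs)))
  b    <- Ainv %*% crossprod(Xs, yc)
  df   <- sum(diag(Xs %*% Ainv %*% t(Xs)))         # effective df at this k
  s2   <- sum((yc - Xs %*% b)^2) / (nrow(Xs) - df)
  s2 * Ainv %*% XtX %*% Ainv                       # covariance matrix at this k
})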
2010 Oct 04
1
Ridge regression and mixed models
Dear R users, The equivalence between the linear mixed model formulation and penalized regression models (including ridge regression and penalized regression splines) has proven very useful in many respects. Examples include using the lme() function in library(nlme) to fit smooth models, including estimation of a smoothing parameter by REML. My question concerns the use of ...
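For concreteness, the algebraic link in a hedged sketch: if y = Xb + e with b ~ N(0, s_b^2 I) and e ~ N(0, s_e^2 I), the BLUP of b is the ridge estimate with lambda = s_e^2 / s_b^2, the ratio that REML estimates. Simulated data here, since the original question is cut off:

set.seed(42)
n <- 60; p <- 5
X <- matrix(rnorm(n * p), n, p)
y <- X %*% rnorm(p, sd = 0.5) + rnorm(n)
lambda  <- 4                         # stand-in for the variance ratio s_e^2 / s_b^2
b_ridge <- solve(crossprod(X) + lambda * diag(p), crossprod(X, y))   # = BLUP of b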
2009 Jun 04
0
help needed with ridge regression and choice of lambda with lm.ridge!!!
Hi, I'm a beginner in the field. I have to perform ridge regression with lm.ridge for many datasets, and I want to do it automatically. How can I choose lambda automatically? Right now I'm using the MASS function lm.ridge, which I find simple and fast, and I've seen that the returned values include the HKB estimate of the ridge constant and the L-W ...
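A hedged sketch of automating the choice, using the three suggestions a ridgelm object already carries (autoRidge is a made-up helper name; longley just illustrates):

library(MASS)
autoRidge <- function(formula, data, lambdas = seq(0, 1, by = 0.001)) {
  fit <- lm.ridge(formula, data, lambda = lambdas)
  c(HKB = fit$kHKB,                          # Hoerl-Kennard-Baldwin estimate
    LW  = fit$kLW,                           # Lawless-Wang estimate
    GCV = fit$lambda[which.min(fit$GCV)])    # GCV minimiser on the grid
}
autoRidge(Employed ~ ., longley)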
2009 Mar 17
1
Likelihood of a ridge regression (lm.ridge)?
Dear all, I want to get the likelihood (or AIC or BIC) of a ridge regression model fitted with lm.ridge from the MASS library, yet I can't find it. Since lm.ridge does not return a standard fit object, it doesn't work with functions such as BIC (nlme package). Is there a way around this? I would calculate it myself, but I'm not sure how to do that for a ridge regression. Thank you in ...
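There is no logLik method for ridgelm, but one common approximation substitutes the effective degrees of freedom tr(H_lambda) for the parameter count; a hedged sketch on centred data (Xs, yc, lambda are stand-ins for the poster's inputs):

A   <- crossprod(Xs) + lambda * diag(ncol(Xs))
H   <- Xs %*% solve(A, t(Xs))               # ridge hat matrix
df  <- sum(diag(H))                         # effective degrees of freedom
n   <- length(yc)
rss <- sum((yc - H %*% yc)^2)
ll  <- -n/2 * (log(2 * pi * rss / n) + 1)   # Gaussian log-likelihood, sigma^2 profiled out
aic <- -2 * ll + 2 * (df + 1)               # +1 for the variance parameter
bic <- -2 * ll + log(n) * (df + 1)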
2007 Apr 17
1
value of complexity parameter in ridge regression
Hi, What is a sensible range in which to search for the complexity parameter lambda in ridge regression? Can or should lambda be greater than 1? I have sources that appear to conflict: they use lambda >= 0 without any upper limit, but that makes the search space infinite, right? So perhaps my question is: is there an upper limit on lambda? Does the value of lambda convey ...
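There is indeed no finite upper bound, but the practical fix is to search on a log scale, which collapses the unbounded range into a few orders of magnitude; a small sketch with a stock dataset:

library(MASS)
fit <- lm.ridge(Employed ~ ., longley, lambda = 10^seq(-4, 4, length.out = 101))
plot(fit)      # coefficient traces flatten towards zero once lambda dominates X'X
select(fit)    # HKB, L-W, and GCV suggestions; widen the grid if GCV sits on an edge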
2009 Aug 19
1
ridge regression
Dear all, I considered an ordinary ridge regression problem. I followed three different approaches:
1. estimate beta without any standardization
2. estimate standardized beta (standardizing X and y) and then convert back
3. estimate beta using the lm.ridge() function
X <- matrix(c(1,2,9,3,2,4,7,2,3,5,9,1), 4, 3)
y <- as.matrix(c(2,3,4,5))
n <- nrow(X)
p <- ncol(X)
#Without ...
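Continuing from the X and y above, a hedged sketch of all three routes; my understanding is that lm.ridge scales by the population sd, sd(x)*sqrt((n-1)/n), so (2) and (3) should agree, while (1) differs because the L2 penalty is not invariant to rescaling the columns:

lambda <- 1
Xc <- scale(X, scale = FALSE); yc <- y - mean(y)
b1 <- solve(crossprod(Xc) + lambda * diag(p), crossprod(Xc, yc))     # (1) raw scale
sx <- apply(X, 2, sd) * sqrt((n - 1) / n)                            # assumed lm.ridge scaling
Z  <- sweep(Xc, 2, sx, "/")
b2 <- solve(crossprod(Z) + lambda * diag(p), crossprod(Z, yc)) / sx  # (2) converted back
b3 <- coef(MASS::lm.ridge(y ~ X, lambda = lambda))[-1]               # (3) lm.ridge
cbind(b1, b2, b3)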
2008 May 07
1
use of sequence on ridge regression
Dear R users, I have a question about supplying a sequence of lambda values to ridge regression. I'm trying to understand how to choose this sequence when the variables are highly linearly correlated. I'm running a model where the variables HtShoes and Ht have high VIF values. My program is written below, but I'm not sure about the correct way of specifying the sequence: library(faraway) data(seatpos) ...
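A hedged reconstruction of the kind of program described, since the original is cut off; seatpos ships with faraway, and the sequence is just a grid for lm.ridge to profile over:

library(faraway)   # seatpos data
library(MASS)
data(seatpos)
fit <- lm.ridge(hipcenter ~ ., data = seatpos, lambda = seq(0, 100, by = 0.5))
plot(fit)          # the HtShoes and Ht traces stabilise as lambda grows
select(fit)        # HKB, L-W, and GCV choices of lambda from the grid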
2009 Aug 19
1
Ridge regression [Repost]
Dear all, For an ordinary ridge regression problem, I followed three different approaches:
1. estimate beta without any standardization
2. estimate standardized beta (standardizing X and y) and then convert back
3. estimate beta using the lm.ridge() function
X <- matrix(c(1,2,9,3,2,4,7,2,3,5,9,1), 4, 3)
y <- as.matrix(c(2,3,4,5))
n <- nrow(X)
p <- ncol(X)
#Without standardization ...
2012 Dec 27
1
Ridge Regression variable selection
Unlike L1 (lasso) regression or the elastic net (a mixture of L1 and L2), L2-norm regression (ridge regression) does not select variables. Selection of variables would not work properly anyway, and it's unclear why you would want to omit "apparently" weak variables in the first place. Frank
maths123 wrote: > I have a .txt file containing a dataset with 500 samples. There are 10 variables. ...
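The contrast is easy to see with glmnet, where alpha = 1 is the lasso and alpha = 0 is ridge; a sketch on simulated data shaped like the poster's 500 x 10 set:

library(glmnet)
set.seed(1)
x <- matrix(rnorm(500 * 10), 500, 10)
y <- x[, 1] - 2 * x[, 2] + rnorm(500)
cv_lasso <- cv.glmnet(x, y, alpha = 1)   # L1: zeroes out weak coefficients
cv_ridge <- cv.glmnet(x, y, alpha = 0)   # L2: shrinks all ten, zeroes none
coef(cv_lasso, s = "lambda.min")
coef(cv_ridge, s = "lambda.min")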
2007 Apr 12
1
Question on ridge regression with R
Hi, I am working on a project about hospital efficiency. Due to the high multicollinearity of the data, I want to fit the model using ridge regression. However, I believe that the data from large hospitals (indicated by the number of patients they treat a year) are more accurate than those from small hospitals, and I want to put more weight on them. How do I do this with lm.ridge? I know I just need ...
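To my knowledge lm.ridge does not expose a weights argument, so one hedged route is to fold the weights into the normal equations by hand, minimising sum_i w_i (y_i - x_i'b)^2 + lambda*||b||^2. Here w, X, y are hypothetical; w could be each hospital's annual patient count, and X should be centred with weighted means if an intercept is wanted:

ridge_w <- function(X, y, w, lambda) {
  A <- crossprod(X, X * w) + lambda * diag(ncol(X))   # X'WX + lambda*I
  drop(solve(A, crossprod(X, w * y)))                 # (X'WX + lambda*I)^{-1} X'Wy
}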
2009 Dec 02
1
Ridge regression
Dear list, I have a couple of questions concerning ridge regression. I am using the lm.ridge(...) function to fit a model to my microarray data, i.e. model = lm.ridge(...). I retrieve some coefficients and some scales for each gene. First of all, I would like to ask: the real coefficients of the model are not those in the first component of the output but those returned by coef(model), ...
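On the first question, my understanding of the ridgelm object is that $coef holds coefficients for the internally scaled predictors, while coef() rescales them back to the original units and adds the intercept; a small check:

library(MASS)
fit <- lm.ridge(Employed ~ ., longley, lambda = 0.05)
fit$coef                      # on the scaled predictors
coef(fit)                     # intercept + original-scale coefficients
all.equal(as.vector(coef(fit)[-1]),
          as.vector(fit$coef / fit$scales))   # TRUE if the reading above is right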
2005 Feb 16
2
R: ridge regression
Hi all, a technical question for the bright statisticians here; it involves ridge regression. Definitions: n = sample size of the data set; X is the data matrix with, say, p variables; Y is the response matrix. Z(i,j) = (X(i,j) - xbar(j)) / [(n-1)^0.5 * std(x(j))] and Y_new(i) = (Y(i) - ybar) / [(n-1)^0.5 * std(Y)] (note that I have scaled the Y matrix as well). k is ...
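A sketch of that standardization in R, with X, Y, and k as the poster defines them (k being the ridge constant); scale() gives (x - xbar)/sd, and the extra 1/sqrt(n-1) makes every column of Z have unit sum of squares:

n <- nrow(X); p <- ncol(X)
Z    <- scale(X) / sqrt(n - 1)               # Z(i,j) as defined above
Ynew <- scale(as.matrix(Y)) / sqrt(n - 1)    # scaled response
bz   <- solve(crossprod(Z) + k * diag(p), crossprod(Z, Ynew))   # ridge on the scaled data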
2017 Oct 31
0
lasso and ridge regression
Dear All, The problem concerns regularization methods in multiple regression when the independent variables are collinear. A modified regularization method is proposed with two tuning parameters l1 and l2 (Lambda 1 and Lambda 2) and their product l1*l2, such that l1 takes care of the ridge property and l2 takes care of the LASSO property. The proposed method is given ...
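The description resembles the (naive) elastic net; if so, a penalty l_lasso*||b||_1 + l_ridge*||b||_2^2 maps onto glmnet's lambda*[alpha*||b||_1 + (1-alpha)/2*||b||_2^2] via alpha = l_lasso/(l_lasso + 2*l_ridge) and lambda = l_lasso + 2*l_ridge, up to glmnet's scaling of the loss; a hedged sketch with made-up tuning values:

library(glmnet)
set.seed(7)
x <- matrix(rnorm(100 * 8), 100, 8)
x[, 2] <- x[, 1] + rnorm(100, sd = 0.01)   # a deliberately collinear pair
y <- x[, 1] + rnorm(100)
l_lasso <- 0.3; l_ridge <- 0.1             # hypothetical tuning values
fit <- glmnet(x, y, alpha = l_lasso / (l_lasso + 2 * l_ridge),
              lambda = l_lasso + 2 * l_ridge)
coef(fit)   # glmnet prefers a lambda sequence; one value is fine to illustrate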
2013 Apr 30
0
Ridge regression
Hi all, I have run a ridge regression on a data set 'final' as follows: reg=lm.ridge(final$l~final$lag1+final$lag2+final$g+final$u, lambda=seq(0,10,0.01)) Then I enter select(reg) and it returns: modified HKB estimator is 19.3409, modified L-W estimator is 36.18617, smallest value of GCV at 10. I think it ...
2010 Jan 06
0
parcor 0.2-2 - Regularized Partial Correlation Matrices with (adaptive) Lasso, PLS, and Ridge Regression
Dear R-users, we are happy to announce the release of our R package parcor. The package contains tools to estimate the matrix of partial correlations based on different regularized regression methods: Lasso, adaptive Lasso, PLS, and Ridge Regression. In addition, parcor provides cross-validation based model selection for Lasso, adaptive Lasso and Ridge Regression. More details can be found
2003 Jun 05
2
ridge regression
Hello R-users, I want to compute a multiple regression, but I would like to include a check for collinearity of the variables. Therefore I would like to use ridge regression. I tried lm.ridge(), but I don't know yet how to get p-values (per-coefficient Pr() and a p-value for the whole model) out of this model. Can anybody tell me how to get output similar to the summary(lm(...)) output? Or if there is ...