similar to: ridge regression

Displaying 20 results from an estimated 100 matches similar to: "ridge regression"

2009 Aug 19
1
Ridge regression [Repost]
Dear all, for an ordinary ridge regression problem I followed three different approaches: 1. estimate beta without any standardization; 2. estimate the standardized beta (standardizing X and y) and then convert back; 3. estimate beta using the lm.ridge() function.
    X <- matrix(c(1,2,9,3,2,4,7,2,3,5,9,1), 4, 3)
    y <- as.matrix(c(2,3,4,5))
    n <- nrow(X)
    p <- ncol(X)
    # Without standardization
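A minimal sketch of how approaches 2 and 3 can be reconciled, assuming the centering and root-mean-square column scaling that lm.ridge applies internally (lambda = 1 is arbitrary):

    library(MASS)
    # Sketch: manual ridge on standardized data, back-transformed,
    # compared against lm.ridge on the small example from the post.
    X <- matrix(c(1,2,9,3,2,4,7,2,3,5,9,1), 4, 3)
    y <- c(2, 3, 4, 5)
    n <- nrow(X); p <- ncol(X)
    lambda <- 1

    # Center X and y, then scale X by root-mean-square column norms
    # (this is lm.ridge's scaling, not sd() with divisor n - 1).
    Xm <- colMeans(X); Ym <- mean(y)
    Xc <- sweep(X, 2, Xm)
    Xscale <- sqrt(colSums(Xc^2) / n)
    Xs <- sweep(Xc, 2, Xscale, "/")

    # Ridge solution on the standardized scale, converted back.
    beta_s <- solve(t(Xs) %*% Xs + lambda * diag(p), t(Xs) %*% (y - Ym))
    beta   <- beta_s / Xscale
    alpha  <- Ym - sum(Xm * beta)

    rbind(manual = c(alpha, beta),
          lm.ridge = coef(lm.ridge(y ~ X, lambda = lambda)))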
2009 Jun 04
0
help needed with ridge regression and choice of lambda with lm.ridge!!!
Hi, I'm a beginner in the field and have to perform ridge regression with lm.ridge for many datasets, so I want to do it automatically. How can I choose lambda automatically? Right now I'm using the lm.ridge function from MASS, which I find quite simple and fast, and I've seen that among the returned values there are the HKB estimate of the ridge constant and the L-W
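One common automatic choice is to minimize the generalized cross-validation (GCV) score that lm.ridge already returns, computed over a grid of lambda values; a sketch (the grid limits and the longley data are only illustrative):

    library(MASS)
    # Sketch: choose lambda by GCV over a log-spaced grid.
    lambdas <- 10^seq(-4, 4, length.out = 200)
    fit <- lm.ridge(GNP.deflator ~ ., data = longley, lambda = lambdas)
    lambdas[which.min(fit$GCV)]   # GCV-minimizing lambda
    select(fit)                   # also prints the HKB, L-W and GCV choices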
2009 Mar 17
1
Likelihood of a ridge regression (lm.ridge)?
Dear all, I want to get the likelihood (or AIC or BIC) of a ridge regression model fitted with lm.ridge from the MASS library, yet I can't really find it. As lm.ridge does not return a standard fit object, it doesn't work with functions such as BIC (nlme package). Is there a way around this? I would calculate it myself, but I'm not sure how to do that for a ridge regression. Thank you in
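lm.ridge does not return a log-likelihood, but under a Gaussian error model an AIC-style criterion can be computed by hand, with the effective degrees of freedom tr H(lambda) standing in for the parameter count; a sketch under that assumption (data and lambda are illustrative):

    library(MASS)
    # Sketch: effective-df AIC for a ridge fit, Gaussian errors assumed.
    X <- scale(as.matrix(longley[, -1]))
    y <- longley$GNP.deflator - mean(longley$GNP.deflator)
    n <- nrow(X); lambda <- 1
    d2  <- svd(X)$d^2
    edf <- sum(d2 / (d2 + lambda))      # trace of the ridge hat matrix
    beta <- solve(t(X) %*% X + lambda * diag(ncol(X)), t(X) %*% y)
    rss  <- sum((y - X %*% beta)^2)
    n * log(rss / n) + 2 * edf          # AIC up to an additive constant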
2011 Aug 06
0
ridge regression - covariance matrices of ridge coefficients
For an application of ridge regression, I need the covariance matrices of the estimated regression coefficients, in addition to the coefficients themselves, for all values of the ridge constant lambda. I've studied the code in MASS:::lm.ridge, but don't see how to do this because the code is vectorized using one svd calculation. The relevant lines from lm.ridge, using X, Y are:
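The textbook covariance of the ridge estimator is sigma^2 W X'X W with W = (X'X + lambda I)^(-1), so one workaround is simply to loop over the lambda values instead of reusing the vectorized svd; a sketch (the sigma^2 estimate via effective degrees of freedom is one choice among several):

    # Sketch: covariance matrix of the ridge coefficients, one per lambda.
    ridge_cov <- function(X, y, lambdas) {
      XtX <- crossprod(X)
      lapply(lambdas, function(lam) {
        W <- solve(XtX + lam * diag(ncol(X)))
        beta <- W %*% crossprod(X, y)
        edf <- sum(diag(W %*% XtX))
        sigma2 <- sum((y - X %*% beta)^2) / (nrow(X) - edf)
        sigma2 * W %*% XtX %*% W        # Var(beta_ridge)
      })
    }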
2010 Feb 16
1
survival - likelihood ratio for ridge coxph()
It seems to me that R returns the unpenalized log-likelihood for the likelihood ratio test when a ridge Cox proportional hazards model is fitted. Is this expected? In the example below, if I am not mistaken, fit$loglik[2] is the unpenalized log-likelihood for the final coefficient estimates, whereas I would expect the penalized log-likelihood. I would like to check whether this is as expected.
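If fit$loglik[2] is indeed unpenalized, the penalized value can be recovered by subtracting the penalty by hand; a sketch, assuming the ridge() penalty is theta/2 * sum(beta^2) as its documentation describes (the ovarian data and theta are only illustrative):

    library(survival)
    # Sketch: subtract the ridge penalty from the reported log-likelihood.
    theta <- 1
    fit <- coxph(Surv(futime, fustat) ~ rx + ridge(age, ecog.ps, theta = theta),
                 data = ovarian)
    b <- coef(fit)[grep("ridge", names(coef(fit)))]  # penalized terms only
    fit$loglik[2] - (theta / 2) * sum(b^2)           # penalized log-likelihood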
2010 Dec 15
0
ridge() function and coxph
I need to add a belated acknowledgement to my prior comments. Thomas Lumley has also been a significant contributor to the survival code. Until I moved my development from Splus to R he was the primary maintainer of the R code; he tightened up a lot of the C code to make sure it would work on multiple architectures, and added several functions. Merging our two code bases back together was a
2010 Dec 02
0
survival - summary and score test for ridge coxph()
It seems to me that summary for ridge coxph() prints a summary but returns NULL. This is not a big issue, because one can calculate the statistics directly from the coxph object. However, for some reason the score test is not calculated for ridge coxph(), i.e. neither the score nor the rscore component is included in the coxph object when ridge is specified. Please find the code below. I use R 2.9.2 with version 2.35-4
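In the meantime, Wald statistics can be computed directly from the returned object, as the post suggests; a sketch (illustrative data):

    library(survival)
    # Sketch: Wald z and p-values straight from the coxph object.
    fit <- coxph(Surv(futime, fustat) ~ rx + ridge(age, theta = 1),
                 data = ovarian)
    z <- coef(fit) / sqrt(diag(fit$var))
    cbind(coef = coef(fit), z = z, p = 2 * pnorm(-abs(z)))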
2006 Nov 03
0
R package/function for ridge logistic regression
Hi, Is there an R function/package that can do ridge regression for logistic regression? Thank you. -- Zheng
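One answer today is glmnet: with alpha = 0 and family = "binomial" it fits an L2-penalized (ridge) logistic regression, with cross-validated lambda; a sketch (the iris subset is just a convenient two-class example):

    library(glmnet)
    # Sketch: ridge-penalized logistic regression; alpha = 0 is pure ridge.
    x <- as.matrix(iris[1:100, 1:4])
    y <- droplevels(iris$Species[1:100])     # two classes only
    cvfit <- cv.glmnet(x, y, family = "binomial", alpha = 0)
    coef(cvfit, s = "lambda.min")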
2008 Jan 28
0
[OT] - standard errors for parameter estimates under ridge regression and lasso?
Dear R community, I'm curious to know how people go about estimating standard errors for parameter estimates after model selection by ridge regression and the lasso. Do you have any practical or theoretical advice? Warmly, Andrew -- Andrew Robinson Department of Mathematics and Statistics Tel: +61-3-8344-9763 University of Melbourne, VIC 3010 Australia Fax:
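One pragmatic (if debated) answer is the bootstrap: refit the penalized model on resampled rows and take the spread of the estimates; a sketch at a fixed lambda with lm.ridge (re-selecting lambda inside each resample would be more honest, at more cost):

    library(MASS)
    # Sketch: bootstrap standard errors for ridge coefficients, fixed lambda.
    X <- as.matrix(longley[, -1]); y <- longley$GNP.deflator
    lambda <- 1; B <- 500
    boots <- replicate(B, {
      i <- sample(nrow(X), replace = TRUE)
      coef(lm.ridge(y[i] ~ X[i, ], lambda = lambda))
    })
    apply(boots, 1, sd)   # bootstrap SE, one per coefficient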
2010 Jun 08
0
About lm.ridge in the MASS package
Hi, I have a question about doing ridge regression in R. When I try it on datasets with more predictors than samples (p > n) using lambda = 0, why does it still find coefficients for all predictors? I thought that with lambda = 0 it should behave like ordinary regression and therefore fail to estimate all coefficients, due to singularity. I would greatly appreciate your help. Thank you, --James K.
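The reason is that lm.ridge solves through the SVD, so with lambda = 0 and p > n it returns a least-squares solution of minimum norm instead of failing; a sketch contrasting it with lm(), which reports NA for aliased coefficients (random data, purely illustrative):

    library(MASS)
    # Sketch: p > n at lambda = 0. lm() flags the singularity with NAs;
    # lm.ridge's svd route returns numbers for every predictor (note that
    # after centering the smallest singular value is numerically ~0, so
    # some of those numbers can be wildly unstable).
    set.seed(1)
    n <- 10; p <- 20
    X <- matrix(rnorm(n * p), n, p)
    y <- rnorm(n)
    coef(lm(y ~ X))                     # NAs beyond the rank
    coef(lm.ridge(y ~ X, lambda = 0))   # all p coefficients, none NA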
2009 Aug 15
0
coefficient p-value in ridge regression
Hello, I have a problem with ridge regression. I've used the lm.ridge function to estimate the coefficients of my model. Why does the model summary not show t values, Pr(>|t|), or significance stars? How can I calculate coefficient p-values in ridge regression? Thanks!
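They are omitted because the ridge estimator is biased, so the usual t-statistics no longer apply exactly; approximate normal-theory p-values can still be formed from the ridge covariance, to be read as descriptive only. A sketch (illustrative data and lambda):

    library(MASS)
    # Sketch: rough z-based p-values for ridge coefficients.
    X <- scale(as.matrix(longley[, -1])); y <- scale(longley$GNP.deflator)
    n <- nrow(X); lambda <- 1
    W <- solve(crossprod(X) + lambda * diag(ncol(X)))
    beta <- W %*% crossprod(X, y)
    edf <- sum(diag(W %*% crossprod(X)))
    sigma2 <- sum((y - X %*% beta)^2) / (n - edf)
    se <- sqrt(diag(sigma2 * W %*% crossprod(X) %*% W))
    2 * pnorm(-abs(beta / se))          # approximate, optimistic p-values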
2010 Jan 06
0
parcor 0.2-2 - Regularized Partial Correlation Matrices with (adaptive) Lasso, PLS, and Ridge Regression
Dear R-users, we are happy to announce the release of our R package parcor. The package contains tools to estimate the matrix of partial correlations based on different regularized regression methods: Lasso, adaptive Lasso, PLS, and Ridge Regression. In addition, parcor provides cross-validation based model selection for Lasso, adaptive Lasso and Ridge Regression. More details can be found
2010 Apr 26
0
lm.ridge {MASS} intercept questions
I am trying to understand the code for lm.ridge from the MASS package. Here is the part I am having trouble understanding:
    if(Inter <- attr(Terms, "intercept")) {
        Xm <- colMeans(X[, -Inter])
        Ym <- mean(Y)
        p <- p - 1
        X <- X[, -Inter] - rep(Xm, rep(n, p))
        Y <- Y - Ym
    } else Ym <- Xm <- NA
    Xscale <- drop(rep(1/n, n) %*% X^2)^0.5
    X <- X/rep(Xscale, rep.int(n,
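In plain terms, when there is an intercept the snippet centers X and Y and then scales each column of X to root-mean-square one; written out directly (assuming X, Y, n as in lm.ridge):

    # Sketch: what the quoted lines do, spelled out.
    Xc <- sweep(X, 2, colMeans(X))      # center the predictors
    Yc <- Y - mean(Y)                   # center the response
    Xscale <- sqrt(colSums(Xc^2) / n)   # RMS column norms, not sd()
    Xs <- sweep(Xc, 2, Xscale, "/")     # every column now has RMS 1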
2017 Oct 31
0
lasso and ridge regression
Dear All, The problem concerns regularization methods in multiple regression when the independent variables are collinear. A modified regularization method is proposed with two tuning parameters l1 and l2 and their product l1*l2 (Lambda 1 and Lambda 2), such that l1 takes care of the ridge property and l2 takes care of the LASSO property. The proposed method is given
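The proposal is close in spirit to the elastic net, where a single mixing parameter blends the L1 and L2 penalties; for comparison, a sketch of the off-the-shelf version in glmnet (not the poster's method):

    library(glmnet)
    # Sketch: elastic net; penalty is lambda * (alpha*L1 + (1-alpha)/2*L2).
    x <- as.matrix(mtcars[, -1]); y <- mtcars$mpg
    cvfit <- cv.glmnet(x, y, alpha = 0.5)   # alpha blends lasso and ridge
    coef(cvfit, s = "lambda.min")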
2013 Apr 16
1
[Ping:] [Patch] iso9660.c did not copy terminating 0 of Rock Ridge name
Hi, I cannot yet see my most recent bug-fix patch applied to http://git.kernel.org/cgit/boot/syslinux/syslinux.git/log/?h=rockridge The fixed problem could lead to memory faults. http://www.syslinux.org/archives/2013-April/019790.html Have a nice day :) Thomas
2013 Apr 25
1
[syslinux:rockridge] iso9660.c did not copy terminating 0 of Rock Ridge name
On 04/25/2013 07:03 AM, syslinux-bot for Thomas Schmitt wrote: > Commit-ID: 5de463f724da515fd6c5ea49ded6dde178362181 > Gitweb: http://www.syslinux.org/commit/5de463f724da515fd6c5ea49ded6dde178362181 > Author: Thomas Schmitt <scdbackup at gmx.net> > AuthorDate: Thu, 4 Apr 2013 20:02:37 +0200 > Committer: Matt Fleming <matt.fleming at intel.com> > CommitDate:
2008 Feb 14
0
GCV in lm.ridge (MASS) (PR#10755)
Full_Name: Andrew Robinson
Version: 2.6.2 Patched (2008-02-12 r44439)
OS: FreeBSD 6.3-RC1
Submission from: (NULL) (211.28.206.186)
I believe that the computation of GCV is incorrect in the lm.ridge function in MASS. From lm.ridge:
    GCV <- colSums((Y - X %*% coef)^2)/(n - colSums(matrix(d^2/div, dx)))^2
The denominator does not tally with the formula on p. 141 of Ripley's
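For comparison, the Golub-Heath-Wahba form is GCV(lambda) = n * RSS(lambda) / (n - tr H(lambda))^2, which differs from the quoted line by a factor of n in the numerator (a factor that does not move the minimizing lambda); a sketch, assuming X is already centered and scaled as in lm.ridge:

    # Sketch: textbook GCV for ridge, for checking against lm.ridge.
    gcv_ridge <- function(X, y, lambda) {
      n <- length(y)
      d2 <- svd(X)$d^2
      trH <- sum(d2 / (d2 + lambda))    # effective degrees of freedom
      beta <- solve(crossprod(X) + lambda * diag(ncol(X)), crossprod(X, y))
      n * sum((y - X %*% beta)^2) / (n - trH)^2
    }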
2007 Apr 17
1
value of complexity parameter in ridge regression
Hi, What is the optimal range in which to search for a value of lambda when doing ridge regression? Can/should lambda be greater than 1? I have sources that (to me) appear to conflict: they use lambda >= 0 without any upper limit, but that makes the search space infinite, right? So perhaps my question is: is there an upper limit to lambda? Does the value of lambda convey
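In theory there is no upper limit: lambda ranges over [0, infinity), and as it grows the coefficients only shrink further toward zero, so values well above 1 are routine with unstandardized or large-n data. The practical answer is a log-spaced grid made wide enough that the GCV (or cross-validation) curve has an interior minimum; a sketch:

    library(MASS)
    # Sketch: widen the grid until the GCV minimum is interior, not at an edge.
    lambdas <- 10^seq(-4, 6, length.out = 300)
    fit <- lm.ridge(GNP.deflator ~ ., data = longley, lambda = lambdas)
    plot(log10(lambdas), fit$GCV, type = "l",
         xlab = "log10(lambda)", ylab = "GCV")
    lambdas[which.min(fit$GCV)]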
2008 May 06
1
ridge regression
Thanks to all of you who helped me with the issues of bootstrapping and downloading packages to a local disk. As a starter I'm on the lower side of the learning curve, but this R software is awesome. What I like most is this kind of forum, where people share their problems and we can find solutions. My new inquiry is short: do any of you have an example (besides the one that appears in
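For what it's worth, a self-contained example along the usual lines; longley is a classic collinear dataset, so the shrinkage shows clearly in the ridge trace:

    library(MASS)
    # Sketch: ridge trace on the collinear longley data.
    fit <- lm.ridge(GNP.deflator ~ ., data = longley,
                    lambda = seq(0, 0.1, by = 0.001))
    plot(fit)        # ridge trace: coefficients as a function of lambda
    select(fit)      # HKB, L-W and GCV choices of lambda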