similar to: Question on ridge regression with R

Displaying 20 results from an estimated 100 matches similar to: "Question on ridge regression with R"

2007 Oct 30
1
Some matrix and sandwich questions
Dear R-help, I have a four-part question about regression, matrices, and the sandwich package. 1) In the sandwich package, I would like to better understand the meat() function. From the bread() documentation, for a simple OLS regression, bread() returns (1/n * X'X)^(-1). That is, for a simple regression (per the documentation on bread()): MyLM <- lm(y ~ x); bread(MyLM)
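A minimal sketch (not part of the original post; x and y are made up) checking the documented pieces on a toy fit:

library(sandwich)

set.seed(1)
x <- rnorm(50)
y <- 1 + 2 * x + rnorm(50)
MyLM <- lm(y ~ x)
X <- model.matrix(MyLM)
n <- nrow(X)

## bread() for lm is (1/n * X'X)^(-1)
all.equal(bread(MyLM), solve(crossprod(X) / n), check.attributes = FALSE)
## meat() is the average outer product of the estimating functions
all.equal(meat(MyLM), crossprod(estfun(MyLM)) / n, check.attributes = FALSE)
## and sandwich() is 1/n * bread %*% meat %*% bread
all.equal(sandwich(MyLM),
          bread(MyLM) %*% meat(MyLM) %*% bread(MyLM) / n,
          check.attributes = FALSE)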
2009 Mar 02
2
R-help
Hi list, When I type my question in the R console using the ? sign (example: ?par, when I want to query par), the following error message pops up: Error in print.help_files_with_topic("C:/PROGRA~1/R/R-24~1.1/library/maps/chm/map") : CHM file could not be displayed. I would appreciate it if anybody could come back to me with a solution. Regards, Alireza
2006 May 03
1
Inverse X'WX matrix from weighted linear regression
Dear list, how can I compute the inverse of the X'WX matrix ("the inverse of the weighted sum of squares and crossproducts matrix") from an object of class "lm" from a weighted linear regression? Thanks, Sven
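A minimal sketch of one way to get it (not from the post; y, x and the weights w are hypothetical, with no missing data assumed): the cov.unscaled component of summary.lm is (X'WX)^(-1) for a weighted fit.

fit <- lm(y ~ x, weights = w)
XtWXinv <- summary(fit)$cov.unscaled    # (X'WX)^(-1)

## check by hand
X <- model.matrix(fit)
all.equal(XtWXinv, solve(t(X) %*% diag(w) %*% X), check.attributes = FALSE)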
2004 Mar 25
1
g-inverse question
I am using the ginv function from MASS and have run across this problem that I do not understand. If I define the matrix A as below, its g-inverse does not satisfy the Moore-Penrose condition A %*% ginv(A) %*% A = A. The matrix A is X'WX in a quadratic regression using some very large dollar values. The much simpler matrix B does satisfy the MP condition. Am I doing something wrong? Is
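A hedged illustration (the dollar values here are invented, not the poster's): with columns as disparate as 1, x and x^2 for x in the millions, A %*% ginv(A) %*% A differs from A by an amount that looks huge in absolute terms but is small relative to the scale of A; rescaling the predictor before forming X'WX restores the Moore-Penrose check.

library(MASS)

x <- c(1, 2, 3, 4) * 1e6              # hypothetical large dollar values
A <- crossprod(cbind(1, x, x^2))      # X'X for a quadratic fit
err <- A %*% ginv(A) %*% A - A
max(abs(err))                         # large in absolute terms
max(abs(err)) / max(abs(A))           # but small relative to A

xs <- x / 1e6                         # rescale the predictor itself
B <- crossprod(cbind(1, xs, xs^2))
max(abs(B %*% ginv(B) %*% B - B))     # MP condition now holds closely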
2007 Aug 14
1
cov.unscaled in gls object
Hi list, can I extract cov.unscaled ("the unscaled covariance matrix") from a gls fit (package nlme), as with summary.lm? Background: in a fixed-effect meta-analysis regression the standard errors of the coefficients can be computed as sqrt(diag(cov.unscaled)), where cov.unscaled is (X'WX)^(-1). I am trying to do this with a gls fit. Thanks, Sven
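A minimal sketch (not from the post; the data, formula and variance weights are hypothetical), assuming the usual scaling varBeta = sigma^2 * (X'WX)^(-1) in a gls object:

library(nlme)

fit <- gls(y ~ x, data = dat, weights = varFixed(~ v))   # hypothetical fit
unscaled <- fit$varBeta / fit$sigma^2    # candidate for the unscaled (X'WX)^(-1)
sqrt(diag(fit$varBeta))                  # standard errors of the coefficients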
2006 Jun 09
1
X'W in Matrix
Hi! I have used the Matrix package (version 0.995-10) successfully to obtain the OLS solution for a problem where the design matrix X is 44000x6000. X is very sparse (about 80000 non-zero elements). Now I want to do WLS: (X'WX)^-1 X'Wy. I tried W = Diagonal(length(w), w) and wX = solve(X, W), but after several minutes R gives a not-enough-memory error (I'm using a 64-bit machine with 16 GB
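A sketch of one way to keep everything sparse (not from the post; X, w and y as described, names assumed): form X'W with crossprod() rather than solve(X, W), which attempts to solve a linear system instead of building X'W and is a likely source of the memory blow-up.

library(Matrix)

W    <- Diagonal(x = w)               # sparse diagonal weight matrix
XtW  <- crossprod(X, W)               # X'W, stays sparse
beta <- solve(XtW %*% X, XtW %*% y)   # (X'WX)^(-1) X'Wy via a sparse solve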
2007 Oct 19
2
In a SLR, Why Does the Hat Matrix Depend on the Weights?
I understand that the hat matrix is a function of the predictor variable alone. So, in the following example why do the values on the diagonal of the hat matrix change when I go from an unweighted fit to a weighted fit? Is the function hatvalues giving me something other than what I think it is? library(ISwR) data(thuesen) attach(thuesen) fit <- lm(short.velocity ~ blood.glucose)
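A hedged sketch continuing the example (the weights 1/blood.glucose are invented): for a weighted fit, hatvalues() returns the diagonal of H = W^(1/2) X (X'WX)^(-1) X' W^(1/2), which is why it changes with the weights.

w    <- 1 / blood.glucose                       # hypothetical weights
fitw <- lm(short.velocity ~ blood.glucose, weights = w)
X    <- model.matrix(fitw)
Xw   <- X * sqrt(weights(fitw))                 # rows scaled by sqrt(w)
Hw   <- Xw %*% solve(crossprod(Xw)) %*% t(Xw)
all.equal(unname(diag(Hw)), unname(hatvalues(fitw)))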
2009 Jun 04
0
help needed with ridge regression and choice of lambda with lm.ridge!!!
Hi, I'm a beginner in the field. I have to perform ridge regression with lm.ridge for many datasets and I want to do it in an automatic way. How can I automatically choose lambda? As said, right now I'm using the lm.ridge function from MASS, which I found quite simple and fast, and I've seen that among the returned values there are the HKB estimate of the ridge constant and the L-W
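A minimal sketch of one automatic choice (not from the post; dat, the formula and the lambda grid are made up): minimise the GCV criterion that lm.ridge returns, or let MASS::select() report the HKB, L-W and GCV choices.

library(MASS)

fit <- lm.ridge(y ~ ., data = dat, lambda = seq(0, 10, by = 0.01))
best.lambda <- fit$lambda[which.min(fit$GCV)]
select(fit)    # prints the HKB, L-W and GCV-minimising values of lambda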
2009 Mar 17
1
Likelihood of a ridge regression (lm.ridge)?
Dear all, I want to get the likelihood (or AIC or BIC) of a ridge regression model using lm.ridge from the MASS library. Yet, I can't really find it. As lm.ridge does not return a standard fit object, it doesn't work with functions like e.g. BIC (nlme package). Is there a way around it? I would calculate it myself, but I'm not sure how to do that for a ridge regression. Thank you in
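A hedged sketch of one way to do it by hand (not from the thread; dat, the formula y ~ . and lambda = 1 are hypothetical), using a Gaussian log-likelihood and the effective degrees of freedom tr(H) = sum d_i^2 / (d_i^2 + lambda):

library(MASS)

lambda <- 1
fit  <- lm.ridge(y ~ ., data = dat, lambda = lambda)
X    <- model.matrix(y ~ ., dat)
res  <- dat$y - drop(X %*% coef(fit))          # coef() is on the original scale
n    <- length(res)

## effective df on the centred/scaled design that lm.ridge works with
Xs   <- sweep(scale(X[, -1], scale = FALSE), 2, fit$scales, "/")
d2   <- svd(Xs)$d^2
edf  <- sum(d2 / (d2 + lambda)) + 1            # + 1 for the intercept

sigma2 <- sum(res^2) / n
ll  <- -n / 2 * (log(2 * pi * sigma2) + 1)     # Gaussian log-likelihood
aic <- -2 * ll + 2 * (edf + 1)                 # + 1 for sigma^2
bic <- -2 * ll + log(n) * (edf + 1)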
2011 Aug 06
0
ridge regression - covariance matrices of ridge coefficients
For an application of ridge regression, I need to get the covariance matrices of the estimated regression coefficients, in addition to the coefficients themselves, for all values of the ridge constant lambda. I've studied the code in MASS:::lm.ridge, but I don't see how to do this because the code is vectorized using one svd calculation. The relevant lines from lm.ridge, using X, Y are:
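The quoted lines from lm.ridge are not included in this excerpt. A hedged sketch for a single lambda (dat, the formula and lambda are hypothetical), using Var(beta_ridge) = sigma^2 (X'X + lambda I)^(-1) X'X (X'X + lambda I)^(-1) on the centred/scaled design that lm.ridge uses, then mapping back to the original scale:

library(MASS)

lambda <- 1
fit <- lm.ridge(y ~ ., data = dat, lambda = lambda)
X   <- model.matrix(y ~ ., dat)[, -1]
Xs  <- sweep(scale(X, scale = FALSE), 2, fit$scales, "/")   # as in lm.ridge
yc  <- dat$y - mean(dat$y)

A   <- solve(crossprod(Xs) + diag(lambda, ncol(Xs)))
b   <- A %*% crossprod(Xs, yc)                   # ridge coefficients (scaled X)
edf <- sum(diag(Xs %*% A %*% t(Xs)))             # effective degrees of freedom
s2  <- sum((yc - Xs %*% b)^2) / (nrow(Xs) - edf)

V      <- s2 * A %*% crossprod(Xs) %*% A         # covariance on the scaled scale
V_orig <- V / tcrossprod(fit$scales)             # back to the original x-scale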
2005 Feb 01
3
polynomials REML and ML in nlme
Hello everyone, I hope this is a fair enough question, but I don't have access to a copy of Bates and Pinheiro. It is probably quite obvious, but the answer might be of general interest. If I fit a fixed effect with an added quadratic term, and then do it as an orthogonal polynomial, using maximum likelihood, I get the expected result: they have the same logLik.
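A minimal sketch of the check on the built-in Orthodont data (data set and formula chosen here only for illustration): the raw quadratic and the orthogonal polynomial span the same fixed-effects space, so their ML log-likelihoods agree even though the coefficients differ.

library(nlme)

f1 <- lme(distance ~ age + I(age^2), random = ~ 1 | Subject,
          data = Orthodont, method = "ML")
f2 <- lme(distance ~ poly(age, 2), random = ~ 1 | Subject,
          data = Orthodont, method = "ML")
logLik(f1)
logLik(f2)    # the same, up to numerical noise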
2010 Feb 16
1
survival - ratio likelihood for ridge coxph()
It seems to me that R returns the unpenalized log-likelihood for the likelihood ratio test when a ridge regression Cox proportional hazards model is fitted. Is this as expected? In the example below, if I am not mistaken, fit$loglik[2] is the unpenalized log-likelihood for the final estimates of the coefficients. I would expect to get the penalized log-likelihood. I would like to check whether this is as expected.
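The poster's example is not shown in this excerpt. A hedged sketch of one way to check the question on the built-in lung data (theta = 1 is arbitrary): evaluate the unpenalised partial log-likelihood at the ridge estimates and compare it with fit$loglik[2].

library(survival)

fit <- coxph(Surv(time, status) ~ ridge(age, ph.ecog, theta = 1), data = lung)
## unpenalised partial log-likelihood at the same coefficients
## (iter.max = 0 keeps the coefficients fixed; a non-convergence warning is expected)
chk <- coxph(Surv(time, status) ~ age + ph.ecog, data = lung,
             init = coef(fit), control = coxph.control(iter.max = 0))
fit$loglik[2]
chk$loglik[1]   # log-likelihood at the initial (i.e. the ridge) coefficients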
2010 Dec 15
0
ridge() function and coxph
I need to add a belated acknowledgement to my prior comments. Thomas Lumley has also been a significant contributor to the survival code. Until I moved my development from S-Plus to R he was the primary maintainer of the R code; he tightened up a lot of the C code to make sure it would work on multiple architectures, and added several functions. Merging our two code bases back together was a
2010 Dec 02
0
survival - summary and score test for ridge coxph()
It seems to me that summary for ridge coxph() prints a summary but returns NULL. It is not a big issue, because one can calculate the statistics directly from the coxph object. However, for some reason the score test is not calculated for ridge coxph(), i.e. neither the score nor the rscore component is included in the coxph object when ridge is specified. Please find the code below. I use R 2.9.2 with version 2.35-4
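The poster's code is not shown in this excerpt. A hedged reproduction sketch on the built-in lung data (theta = 1 is arbitrary; behaviour may differ in survival versions newer than 2.35-4):

library(survival)

fit <- coxph(Surv(time, status) ~ ridge(age, ph.ecog, theta = 1), data = lung)
s <- summary(fit)                      # prints a summary table
is.null(s)                             # the reported return value
c("score", "rscore") %in% names(fit)   # the components the poster finds missing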
2006 Nov 03
0
R package/function for ridge logistic regression
Hi, Is there an R function/package that can do ridge regression for logistic regression? Thank you. -- Zheng
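One commonly used option (not mentioned in the post) is glmnet, where alpha = 0 gives the ridge penalty; x is a numeric predictor matrix and y a binary response, both hypothetical here.

library(glmnet)

cvfit <- cv.glmnet(x, y, family = "binomial", alpha = 0)   # ridge-penalised logistic
coef(cvfit, s = "lambda.min")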
2008 Jan 28
0
[OT] - standard errors for parameter estimates under ridge regression and lasso?
Dear R community, I'm curious to know how people go about estimating standard errors for parameter estimates after model selection by ridge regression and the lasso. Do you have any practical or theoretical advice? Warmly, Andrew -- Andrew Robinson Department of Mathematics and Statistics Tel: +61-3-8344-9763 University of Melbourne, VIC 3010 Australia Fax:
2010 Jun 08
0
About lm.ridge in the MASS package
Hi, I have a question about doing ridge regression in R. Why is it that when I try this on datasets with more predictors than samples (p > n), using lambda = 0, it still finds coefficients for all predictors? I thought that when lambda = 0 it should be like ordinary regression and therefore not be able to find coefficients for all of them, due to singularity? I would greatly appreciate your help. Thank you, --James K.
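A hedged sketch of why this happens (simulated data): lm.ridge solves the problem through an SVD, so with lambda = 0 and p > n it returns the minimum-norm least-squares solution, which assigns a coefficient to every predictor and interpolates y.

library(MASS)

set.seed(1)
n <- 20; p <- 50
X <- matrix(rnorm(n * p), n, p)
colnames(X) <- paste0("x", 1:p)
y <- rnorm(n)
dat <- data.frame(y = y, X)

fit <- lm.ridge(y ~ . - 1, data = dat, lambda = 0)
length(coef(fit))                 # p coefficients, none dropped
max(abs(X %*% coef(fit) - y))     # residuals are (numerically) zero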
2009 Aug 15
0
coefficient p-value in ridge regression
Hello. I have a problem with ridge regression. I've used the lm.ridge function to estimate the coefficients of my model. Why does the summary of the model not show the t values, Pr(>|t|) and significance stars? How can I calculate the coefficients' p-values in ridge regression? Thanks!
2010 Jan 06
0
parcor 0.2-2 - Regularized Partial Correlation Matrices with (adaptive) Lasso, PLS, and Ridge Regression
Dear R-users, we are happy to announce the release of our R package parcor. The package contains tools to estimate the matrix of partial correlations based on different regularized regression methods: Lasso, adaptive Lasso, PLS, and Ridge Regression. In addition, parcor provides cross-validation based model selection for Lasso, adaptive Lasso and Ridge Regression. More details can be found