similar to: GCV in lm.ridge (MASS) (PR#10755)

Displaying 20 results from an estimated 3000 matches similar to: "GCV in lm.ridge (MASS) (PR#10755)"

2013 Apr 27
1
Selecting ridge regression coefficients for minimum GCV
Hi all, I have run a ridge regression as follows: reg=lm.ridge(final$l~final$lag1+final$lag2+final$g+final$u, lambda=seq(0,10,0.01)) Then I enter select(reg) and it returns: modified HKB estimator is 19.3409, modified L-W estimator is 36.18617, smallest value of GCV at 10. I think it means that it is advisable to
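A minimal sketch (mine, not from the thread) of how the GCV-minimising lambda and its coefficients could be pulled out of such a fit; the formula and grid follow the post, with the data frame 'final' passed via data= rather than $:

library(MASS)
reg <- lm.ridge(l ~ lag1 + lag2 + g + u, data = final,
                lambda = seq(0, 10, 0.01))
best <- which.min(reg$GCV)   # index of the GCV-minimising lambda
reg$lambda[best]             # the lambda value itself
coef(reg)[best, ]            # coefficients on the original scale at that lambda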
2011 Aug 23
1
obtaining p-values for lm.ridge() coefficients (package 'MASS')
Dear all, I'm familiarising myself with Ridge Regressions in R and the following is bugging me: How does one get p-values for the coefficients obtained from MASS::lm.ridge() output (for a given lambda)? Consider the example below (adapted from PRA [1]): > require(MASS) > data(longley) > gr <- lm.ridge(Employed ~ .,longley,lambda = seq(0,0.1,0.001)) > plot(gr) > select(gr)
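lm.ridge does not return standard errors, so there is no built-in p-value. One workaround sometimes suggested (my own sketch, not from the thread; the lambda value is hypothetical) is to bootstrap the coefficients at a fixed lambda and report percentile intervals instead:

library(MASS)
library(boot)
data(longley)
lam <- 0.057   # hypothetical: whatever lambda the GCV/select() step points to
ridge_coef <- function(d, idx)
  coef(lm.ridge(Employed ~ ., d[idx, ], lambda = lam))
b <- boot(longley, ridge_coef, R = 999)
boot.ci(b, type = "perc", index = 2)   # percentile interval for the 2nd coefficient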
2009 Jun 04
0
help needed with ridge regression and choice of lambda with lm.ridge!!!
Hi, I'm a beginner in the field. I have to perform ridge regression with lm.ridge for many datasets, and I wanted to do it in an automatic way. How can I automatically choose lambda? As said, right now I'm using the lm.ridge function from MASS, which I found quite simple and fast, and I've seen that among the returned values there are the HKB estimate of the ridge constant and the L-W
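One way to automate the choice (a sketch under my own assumptions; the helper name and the lambda grid are made up, and GCV is used as the criterion rather than the HKB/L-W estimates the poster mentions):

library(MASS)
auto_ridge <- function(formula, data, lambdas = seq(0, 10, 0.01)) {
  fit  <- lm.ridge(formula, data, lambda = lambdas)
  best <- lambdas[which.min(fit$GCV)]          # GCV-minimising lambda
  list(lambda = best, HKB = fit$kHKB, LW = fit$kLW,
       coef = coef(lm.ridge(formula, data, lambda = best)))
}

The kHKB and kLW components returned by lm.ridge hold the two analytic estimates, so the same wrapper could return one of those instead of the GCV choice.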
2013 Apr 30
0
Ridge regression
Hi all, I have run a ridge regression on a data set 'final' as follows: reg=lm.ridge(final$l~final$lag1+final$lag2+final$g+final$u, lambda=seq(0,10,0.01)) Then I enter select(reg) and it returns: modified HKB estimator is 19.3409, modified L-W estimator is 36.18617, smallest value of GCV at 10. I think it
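Note that the reported GCV minimum (10) is the upper endpoint of the lambda grid, so a natural next step (my suggestion, not from the thread) is to widen the grid and see whether the minimum moves to an interior point:

reg2 <- lm.ridge(final$l ~ final$lag1 + final$lag2 + final$g + final$u,
                 lambda = seq(0, 100, 0.1))
reg2$lambda[which.min(reg2$GCV)]   # interior minimum, or still at the boundary?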
2012 Dec 27
1
Ridge Regression variable selection
Unlike L1 (lasso) regression or elastic net (mixture of L1 and L2), L2 norm regression (ridge regression) does not select variables. Selection of variables would not work properly, and it's unclear why you would want to omit "apparently" weak variables anyway. Frank. maths123 wrote: > I have a .txt file containing a dataset with 500 samples. There are 10 variables.
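A hedged illustration of the distinction (glmnet is my choice of tool; it is not mentioned in the thread): on the same simulated data, a lasso fit sets some coefficients exactly to zero while a ridge fit keeps them all non-zero.

library(glmnet)
set.seed(1)
x <- matrix(rnorm(500 * 10), 500, 10)   # 500 samples, 10 variables, as in the post
y <- x[, 1] - 2 * x[, 2] + rnorm(500)
ridge <- cv.glmnet(x, y, alpha = 0)     # L2 penalty: shrinks, never zeroes
lasso <- cv.glmnet(x, y, alpha = 1)     # L1 penalty: some coefficients exactly zero
sum(as.vector(coef(ridge, s = "lambda.1se")) == 0)   # typically 0
sum(as.vector(coef(lasso, s = "lambda.1se")) == 0)   # typically several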
2007 Dec 27
2
Problem of lmer under FreeBSD
I encountered this problem with lmer under FreeBSD, but not under Windows. Does anyone know why? Thanks. > example(lmer) lmer> (fm1 <- lmer(Reaction ~ Days + (Days|Subject), sleepstudy)) Error in UseMethod("as.logical") : no applicable method for "as.logical" > traceback() 9: as.logical(EMverbose) 8: as.logical(EMverbose) 7: lmerControl() 6:
2007 Apr 08
1
Relative GCV - poisson and negbin GAMs (mgcv)
I am using gam in mgcv (1.3-22) and trying to use GCV to help with model selection. However, I'm a little confused by the process of assessing GCV scores based on their magnitude (or on relative changes in magnitude). Differences in GCV scores often seem "obvious" with my Poisson GAMs, but with negative binomial the decision seems less clear. My data represent a similar pattern as
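For what it's worth, GCV/UBRE scores are only meaningful relative to other fits of the same response on the same data; their absolute magnitude depends on the family and the scale of the data. A small sketch of that kind of comparison (my own example with simulated data, not the poster's):

library(mgcv)
set.seed(2)
dat <- gamSim(1, n = 200, dist = "poisson", scale = 0.25)
m1 <- gam(y ~ s(x0) + s(x1) + s(x2), family = poisson, data = dat)
m2 <- gam(y ~ s(x0) + s(x1), family = poisson, data = dat)
c(m1$gcv.ubre, m2$gcv.ubre)   # UBRE here (known scale); lower is better, same data only
AIC(m1, m2)                   # an alternative criterion for the same comparison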
2010 Apr 26
0
lm.ridge {MASS} intercept questions
I am trying to understand the code for lm.ridge from the MASS package. Here is the part I am having trouble understanding:
if(Inter <- attr(Terms, "intercept")) {
    Xm <- colMeans(X[, -Inter])
    Ym <- mean(Y)
    p <- p - 1
    X <- X[, -Inter] - rep(Xm, rep(n, p))
    Y <- Y - Ym
} else Ym <- Xm <- NA
Xscale <- drop(rep(1/n, n) %*% X^2)^0.5
X <- X/rep(Xscale, rep.int(n,
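A plainer re-expression of what that excerpt does (my paraphrase of the idea, assuming an intercept is present): the predictors and response are centred, and each predictor column is then scaled by its root-mean-square, so the ridge penalty is applied on a standardised design and never touches the intercept.

n  <- nrow(X); p <- ncol(X)
Xm <- colMeans(X)                   # column means of the predictors
Ym <- mean(Y)                       # mean of the response
Xc <- sweep(X, 2, Xm)               # centre the columns
Yc <- Y - Ym                        # centre the response
Xscale <- sqrt(colSums(Xc^2) / n)   # root-mean-square scale of each column
Xs <- sweep(Xc, 2, Xscale, "/")     # standardised design that is actually penalised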
2012 Aug 08
1
mgcv and gamm4: REML, GCV, and AIC
Hi, I've been using gamm4 to build GAMMs for exploring environmental influences on genetic ancestry. Things have gone well and I have 2 very straightforward questions: 1. I've used method=REML. Am I correct that this is an alternative method for estimating the smooth functions in GAMMs, rather than the GCV that is often used for GAMs? I've read up on REML and it makes sense, but I'm
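On question 1, yes: REML-based smoothness selection is an alternative to GCV, and it is what the mixed-model representation used by gamm/gamm4 relies on. A minimal mgcv illustration (my own, not from the thread) of fitting the same model under either criterion:

library(mgcv)
set.seed(3)
dat <- gamSim(1, n = 200)
m_gcv  <- gam(y ~ s(x0) + s(x1), data = dat, method = "GCV.Cp")
m_reml <- gam(y ~ s(x0) + s(x1), data = dat, method = "REML")
c(GCV = m_gcv$gcv.ubre, REML = m_reml$gcv.ubre)  # different criteria; not comparable to each other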
2006 Dec 04
1
GAM model selection and dropping terms based on GCV
Hello, I have a question regarding model selection and dropping of terms for GAMs fitted with package mgcv. I am following the approach suggested in Wood (2001), Wood and Augustin (2002). I fitted a saturated model, and I find from the plots that for two of the covariates: 1. the confidence interval includes 0 almost everywhere; 2. the degrees of freedom are NOT close to 1; 3. the partial
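A sketch of the kind of check that approach leads to (my own construction, with simulated data standing in for the poster's): refit without the doubtful terms and compare GCV scores, or let mgcv shrink terms towards zero with select = TRUE and inspect the resulting edf.

library(mgcv)
set.seed(4)
dat <- gamSim(1, n = 300)
full    <- gam(y ~ s(x0) + s(x1) + s(x2) + s(x3), data = dat)
reduced <- gam(y ~ s(x0) + s(x1) + s(x2), data = dat)
c(full$gcv.ubre, reduced$gcv.ubre)   # drop the term if GCV does not worsen
shrunk <- gam(y ~ s(x0) + s(x1) + s(x2) + s(x3), data = dat, select = TRUE)
summary(shrunk)                      # terms with edf near zero are candidates to drop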
2010 Jun 08
0
About lm.ridge in the MASS package
Hi, I had a question about doing ridge regression in R. Why is it that when I try this on datasets with more predictors than samples (p>n) using lambda=0, it still finds coefficients for all predictors? I thought that when lambda=0 it should be like ordinary regression and therefore not find coefficients for all predictors, due to singularity? I would greatly appreciate your help. Thank you, --James K.
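A small reproducible illustration of the behaviour being asked about (my own example): with p > n, lm() returns NA for the aliased coefficients, while lm.ridge() works through an SVD and still returns a full coefficient vector even at lambda = 0 (effectively one of the infinitely many least-squares solutions, so it should be interpreted with care).

library(MASS)
set.seed(5)
n <- 20; p <- 50                              # more predictors than samples
X <- matrix(rnorm(n * p), n, p)
y <- rnorm(n)
d <- data.frame(y, X)
coef(lm(y ~ ., d))                            # many NAs: the design is rank-deficient
head(coef(lm.ridge(y ~ ., d, lambda = 0)))    # all p + 1 coefficients are returned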
2013 Apr 26
1
Regression coefficients
Hi all, I have run a ridge regression as follows: reg=lm.ridge(final$l~final$lag1+final$lag2+final$g+final$u, lambda=seq(0,10,0.01)) Then I enter select(reg) and it returns: modified HKB estimator is 19.3409, modified L-W estimator is 36.18617, smallest value of GCV at 10. I think it means that it is
2004 Mar 12
1
GCV UBRE score in GAM models
Hello to everybody: I would like to know which ranges of GCV or UBRE values can be considered adequate to consider a GAM as correct. Thanks in advance -- David Nogués Bravo, Functional Ecology and Biodiversity Department, Pyrenean Institute of Ecology, Spanish Research Council, Av. Montañana 1005, Zaragoza - CP 50059, 976716030 - 976716019 (fax)
2009 Mar 31
1
CV and GCV for finding smoothness parameter
I received an assignment that I have to do in R, but I'm not very good at it. The task is the following: http://www.nabble.com/file/p22804957/question8.jpg To do this, we also get the following pieces of code (not in the correct order): http://www.nabble.com/file/p22804957/hints.jpg I'm terrible at this and I'm completely stuck. The model I chose can be found here:
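The linked images are not available here, so the exact model is unclear, but as a generic illustration of choosing a smoothness parameter by CV versus GCV (my own example, not necessarily the intended one): smooth.spline() selects its smoothing parameter by GCV when cv = FALSE and by ordinary leave-one-out CV when cv = TRUE.

set.seed(6)
x <- seq(0, 1, length = 200)
y <- sin(2 * pi * x) + rnorm(200, sd = 0.3)
fit_gcv <- smooth.spline(x, y, cv = FALSE)   # generalised cross-validation
fit_cv  <- smooth.spline(x, y, cv = TRUE)    # ordinary (leave-one-out) CV
c(GCV = fit_gcv$lambda, CV = fit_cv$lambda)  # the two chosen smoothing parameters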
2007 Apr 17
1
value of complexity parameter in ridge regression
Hi, What is the optimal range in which to look for a value of lambda when doing ridge regression? Can/should lambda be greater than 1? I have conflicting (or what appear to me to be conflicting) sources that use lambda >= 0 without any upper limit, but that makes the search space infinite, right? So perhaps my question is: is there an upper limit to lambda? Does the value of lambda convey
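There is no general upper limit: lambda >= 0, and values well above 1 can be optimal depending on how the predictors are scaled. A common practical approach (my suggestion, not from the thread) is to search on a logarithmic grid and let GCV pick:

library(MASS)
data(longley)
grid <- c(0, 10^seq(-4, 4, length = 200))            # spans tiny to very large lambda
fit  <- lm.ridge(Employed ~ ., longley, lambda = grid)
fit$lambda[which.min(fit$GCV)]                        # values > 1 are perfectly possible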
2011 Aug 06
0
ridge regression - covariance matrices of ridge coefficients
For an application of ridge regression, I need to get the covariance matrices of the estimated regression coefficients, in addition to the coefficients, for all values of the ridge constant, lambda. I've studied the code in MASS:::lm.ridge, but don't see how to do this because the code is vectorized using one svd calculation. The relevant lines from lm.ridge, using X, Y are:
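For a single lambda, the usual sandwich form is Var(beta_ridge) = sigma^2 (X'X + lambda I)^{-1} X'X (X'X + lambda I)^{-1}. A sketch of that computation (my own, on the centred/scaled X that lm.ridge works with internally; sigma2 would be a residual-variance estimate), to be looped over the lambda values:

ridge_vcov <- function(X, lambda, sigma2) {
  p   <- ncol(X)
  XtX <- crossprod(X)
  A   <- solve(XtX + diag(lambda, p))   # (X'X + lambda I)^{-1}
  sigma2 * A %*% XtX %*% A              # sandwich covariance of the ridge coefficients
}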
2009 Mar 17
1
Likelihood of a ridge regression (lm.ridge)?
Dear all, I want to get the likelihood (or AIC or BIC) of a ridge regression model using lm.ridge from the MASS library. Yet, I can't really find it. As lm.ridge does not return a standard fit object, it doesn't work with functions like BIC (nlme package). Is there a way around it? I would calculate it myself, but I'm not sure how to do that for a ridge regression. Thank you in
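One approximation sometimes used (my sketch, not an established lm.ridge feature): treat the ridge fit as a linear smoother, take the effective degrees of freedom edf(lambda) = tr(X (X'X + lambda I)^{-1} X') on the centred data, and plug the residual sum of squares and edf into a Gaussian AIC-type formula.

ridge_aic <- function(X, Y, lambda) {
  Xc  <- scale(X, center = TRUE, scale = FALSE)   # centre, as lm.ridge does
  Yc  <- Y - mean(Y)
  A   <- solve(crossprod(Xc) + diag(lambda, ncol(Xc)))
  H   <- Xc %*% A %*% t(Xc)                       # ridge hat ("smoother") matrix
  rss <- sum((Yc - H %*% Yc)^2)
  edf <- sum(diag(H)) + 1                         # + 1 for the intercept
  length(Y) * log(rss / length(Y)) + 2 * edf      # Gaussian AIC up to a constant
}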
2010 Dec 02
0
survival - summary and score test for ridge coxph()
It seems to me that summary for ridge coxph() prints a summary but returns NULL. It is not a big issue because one can calculate statistics directly from a coxph.object. However, for some reason the score test is not calculated for ridge coxph(), i.e. neither the score nor the rscore component is included in the coxph object when ridge is specified. Please find the code below. I use R 2.9.2 with the 2.35-4 version
2009 Aug 14
1
Permutation test and R2 problem
Hi, I have optimized the shrinkage parameter (GCV) for ridge and got an r2 value of 70%. To check the sensitivity of the result, I did a permutation test: I permuted the response vector, ran the fit 1000 times, and drew a distribution. But now I get r2 values as high as 98%, and some of them are more than 70%. Is this expected from this type of test? I was under the impression that r2 with the real data set
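For reference, a generic version of such a test (my own sketch, with simulated data standing in for the poster's): the whole fitting procedure, including the GCV choice of lambda, is repeated on each permuted response, and the permutation p-value is the fraction of permuted r2 values at or above the observed one. Permuted r2 values should not routinely exceed the observed r2; if they do, it usually points at in-sample r2 being inflated by the flexibility of the fitting procedure.

set.seed(7)
n <- 100; p <- 10
X <- matrix(rnorm(n * p), n, p)
y <- X[, 1] + rnorm(n)
r2_ridge <- function(X, y, lambdas = 10^seq(-3, 3, length = 50)) {
  fit  <- MASS::lm.ridge(y ~ ., data.frame(y, X), lambda = lambdas)
  b    <- coef(fit)[which.min(fit$GCV), ]          # coefficients at the GCV lambda
  yhat <- b[1] + as.matrix(X) %*% b[-1]
  drop(cor(y, yhat))^2                             # in-sample r2
}
r2_obs  <- r2_ridge(X, y)                          # observed r2
r2_perm <- replicate(1000, r2_ridge(X, sample(y))) # null distribution under permutation
mean(r2_perm >= r2_obs)                            # permutation p-value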
2009 Aug 01
2
Cox ridge regression
Hello, I have questions regarding penalized Cox regression using the survival package (functions coxph() and ridge()). I am using R 2.8.0 on Ubuntu Linux and survival package version 2.35-4. Question 1. Consider the following example from help(ridge): > fit1 <- coxph(Surv(futime, fustat) ~ rx + ridge(age, ecog.ps, theta=1), ovarian) As I understand it, this builds a model in which `rx' is