similar to: questions with library lars()

Displaying results from an estimated 10000 matches similar to: "questions with library lars()"

2011 May 24
1
seeking help on using LARS package
Hi, I am writing to seek some guidance regarding using Lasso regression with the R package LARS. I have an introductory statistics background but I am trying to learn more. Right now I am trying to duplicate the results in a paper on shRNA prediction, "An accurate and interpretable model for siRNA efficacy prediction", Jean-Philippe Vert et al., Bioinformatics, for a Bioinformatics project
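For readers new to the package, a minimal sketch of a lasso fit with lars (x and y are placeholders for a numeric predictor matrix and response vector, not data from the paper):

library(lars)
# x: numeric matrix of predictors, y: numeric response vector
fit <- lars(x, y, type = "lasso")
plot(fit)    # coefficient paths along the lasso sequence
coef(fit)    # coefficients at every step of the path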
2007 May 17
0
New version 0.9-7 of lars package
I uploaded a new version of the lars package to CRAN, which incorporates some nontrivial changes. 1) lars now has normalize and intercept options, both defaulted to TRUE, which means the variables are scaled to have unit Euclidean norm, and an intercept is included in the model. Either or both can be set to FALSE. 2) lars has an additional type = "stepwise" option; now the list is
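A hedged illustration of the options described above (x and y stand in for the user's own data):

library(lars)
# defaults: predictors scaled to unit Euclidean norm, intercept included
fit1 <- lars(x, y, type = "lasso")
# turn off scaling and/or the intercept
fit2 <- lars(x, y, type = "lasso", normalize = FALSE, intercept = FALSE)
# the stepwise variant added in this release
fit3 <- lars(x, y, type = "stepwise")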
2007 Aug 02
2
lasso/lars error
I'm having the exact problem outlined in a previous post from 2005 - unfortunately the post was never answered: http://tolstoy.newcastle.edu.au/R/help/05/10/15055.html When running: lm2=lars(x2,y,type="lasso",use.Gram=F) I get an error: Error in if (zmin < gamhat) { : missing value where TRUE/FALSE needed ...when running lasso via lars() on a 67x3795 set of predictors. I
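For context, a self-contained call in the same wide-data setting; the data here are simulated, since the original 67x3795 matrix is not available:

library(lars)
set.seed(1)
x2 <- matrix(rnorm(67 * 3795), nrow = 67)   # p >> n, as in the post
y  <- rnorm(67)
# use.Gram = FALSE avoids forming the 3795 x 3795 Gram matrix
lm2 <- lars(x2, y, type = "lasso", use.Gram = FALSE)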
2003 Apr 30
0
Least Angle Regression packages for R
Least Angle Regression software: LARS "Least Angle Regression" ("LAR") is a new model selection algorithm; a useful and less greedy version of traditional forward selection methods. LAR is described in detail in a paper by Brad Efron, Trevor Hastie, Iain Johnstone and Rob Tibshirani, soon to appear in the Annals of Statistics. The paper, as well as R and Splus packages, are
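Once the package is installed, a minimal run looks like this (the diabetes example data ship with lars):

library(lars)
data(diabetes)
fit <- lars(diabetes$x, diabetes$y, type = "lar")
plot(fit)       # coefficient paths as variables enter the model
summary(fit)    # Df, RSS and Cp along the path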
2007 Jun 12
1
LASSO coefficients for a specific s
Hello, I have a question about the lars package. I am using this package to get the coefficients at a specific LASSO parameter s.
data(diabetes)
attach(diabetes)
object <- lars(x, y, type = "lasso")
cvres <- cv.lars(x, y, K = 10, fraction = seq(from = 0, to = 1, length = 100))
fits <- predict.lars(object, type = "coefficients", s = 0.1, mode = "fraction")
Can I assign
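The truncated question appears to concern extracting the coefficient vector; assuming the list returned by predict.lars, something along these lines should work (the CV component name varies across lars versions):

beta_at_s <- fits$coefficients                 # coefficient vector at s = 0.1
# or use the cross-validated fraction instead of a fixed s:
s_best <- cvres$fraction[which.min(cvres$cv)]  # may be cvres$index in newer lars
fits_cv <- predict(object, type = "coefficients", s = s_best, mode = "fraction")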
2008 Oct 23
1
lars
I am trying to use the lars package in R to carry out lasso analysis. However, I am having some problems. Please could you help me with the following questions: 1) Exactly what format do x and y need to be in for cv.lars(x, y) and lars(x, y)? And what information do x and y need to contain exactly? I have tried testing with just a simple matrix of numeric values for x and a simple vector of
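As a general sketch (not from the original post): x should be a numeric matrix with one column per predictor, and y a numeric vector with one entry per row of x, for example:

library(lars)
set.seed(42)
x <- matrix(rnorm(100 * 5), nrow = 100,
            dimnames = list(NULL, paste0("x", 1:5)))   # numeric matrix, 5 predictors
y <- rnorm(100)                                        # numeric vector, length nrow(x)
fit <- lars(x, y, type = "lasso")
cvfit <- cv.lars(x, y, K = 10)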
2012 Jun 16
0
Selecting correlated predictors with LASSO
I'm using the package 'lars' in R with the following code:
> library(lars)
> set.seed(3)
> n <- 1000
> x1 <- rnorm(n)
> x2 <- x1 + rnorm(n)*0.5
> x3 <- rnorm(n)
> x4 <- rnorm(n)
> x5 <- rexp(n)
> y <- 5*x1 + 4*x2 + 2*x3 + 7*x4 + rnorm(n)
> x <- cbind(x1, x2, x3, x4, x5)
> cor(cbind(y, x))
y x1 x2
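A sketch of the fit that presumably follows (not part of the excerpt above, which is cut off at the correlation matrix):

fit <- lars(x, y, type = "lasso")
fit$actions   # order in which the predictors enter the path
coef(fit)     # coefficients at each step; note how the correlated pair x1, x2 is handled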
2007 Mar 15
1
Model selection in LASSO (cross-validation)
Hi, I know how to use LASSO for model selection based on the Cp criterion. I heard that we can also use cross-validation as a criterion. I used cv.lars to give me the lowest predicted error & fraction. But I'm short of a step to arrive at the number of variables to be included in the final model. How do we do that? Is it the predict.lars function? I tried >
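One hedged way to go from the cv.lars output to a variable count (x and y are placeholders, and the CV component may be $fraction or $index depending on the lars version):

library(lars)
fit <- lars(x, y, type = "lasso")
cvres <- cv.lars(x, y, K = 10)
s_best <- cvres$fraction[which.min(cvres$cv)]           # fraction at the CV minimum
beta <- predict(fit, type = "coefficients",
                s = s_best, mode = "fraction")$coefficients
sum(beta != 0)                                          # number of selected variables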
2011 Jul 12
7
FW: lasso regression
Hi, I am trying to do a lasso regression using the lars package with the following data (see attached):
FastestTime WinPercentage PlacePercentage ShowPercentage BreakAverage FinishAverage Time7Average Time3Average Finish
116.90      0.14          0.14            0.29           4.43         3.29          117.56       117.77       5.00
116.23      0.29          0.43            0.14           6.14         2.14          116.84       116.80       2.00
116.41      0.00          0.14            0.29           5.71         3.71          117.24
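A sketch of how such a table could be fed to lars; the file name and the choice of Finish as the response are assumptions, not stated in the excerpt:

library(lars)
dat <- read.table("races.txt", header = TRUE)            # hypothetical file name
y <- dat$Finish
x <- as.matrix(dat[, setdiff(names(dat), "Finish")])     # all other columns as predictors
fit <- lars(x, y, type = "lasso")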
2009 Jul 12
0
Plotting problem [lars()/elasticnet()]
Dear all, I am using the modified LARS algorithm (ref: The Adaptive Lasso and Its Oracle Properties, Zou 2006) for adaptive-lasso-penalized linear regression.
1. w(j) <- |beta_ols(j)|^(-gamma), gamma > 0 and j = 1,...,p
2. define x_new(j) <- x(j)*w(j)
3. apply LARS to solve the modified lasso problem:
out.adalasso <- lars(X_new, y, type="lasso") or enet(X_new,
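A minimal sketch of those three steps in R; the OLS starting fit and gamma = 1 are illustrative assumptions:

library(lars)
# 1. weights from an ordinary least squares fit (assumes n > p)
beta_ols <- coef(lm(y ~ X))[-1]                # drop the intercept
gamma <- 1
w <- abs(beta_ols)^(-gamma)
# 2. rescale each column of X by its weight: x_new(j) = x(j) * w(j)
X_new <- scale(X, center = FALSE, scale = 1 / w)
# 3. solve the lasso on the rescaled design
out.adalasso <- lars(X_new, y, type = "lasso")
# coefficients on the original x scale are w * the coefficients returned above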
2010 Dec 06
2
How to get lasso fit coefficients (given penalty tuning parameter \lambda) using the lars package
Hi all, I am using the lars package for lasso estimation. So I get a lasso fit first:
lassofit = lars(x, y, type = "lasso", normalize = T, intercept = T)
Then I want to get the coefficients with respect to a certain value of \lambda (the tuning parameter). I know lars has three mode options c("step", "fraction", "norm"), but can I use the \lambda value instead
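If memory serves, more recent releases of lars also accept mode = "lambda" in predict.lars/coef, which would let you pass the penalty value directly; treat this as an assumption to check against your installed version:

lambda0 <- 0.5                                 # hypothetical penalty value
beta <- coef(lassofit, s = lambda0, mode = "lambda")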
2005 May 31
3
lars / lasso with glm
We have been using Least Angle Regression (lars) to help identify predictors in models where the outcome is continuous. To do so we have been relying on the lars package. Theoretically, it should be possible to use the lars procedure within a generalized linear model (glm) framework - we are particularly interested in a logistic regression model. Does anyone have examples of using lars with logistic
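lars itself fits the least-squares path; for an L1 path in a logistic model, the packages that come up elsewhere in these threads (glmpath, glmnet) are the usual route. A sketch with glmnet, assuming a binary response:

library(glmnet)
# x: numeric matrix of predictors, y: 0/1 (or two-level factor) response
fit <- glmnet(x, y, family = "binomial")        # L1-penalized logistic path
cvfit <- cv.glmnet(x, y, family = "binomial")
coef(cvfit, s = "lambda.min")                   # coefficients at the CV-chosen penalty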
2007 Sep 19
1
Strange behaviour of lars method
Hi! When I apply the lars (least-angle-regression) method to my data (3655 features, only 355 data points, no I did not mistype), I observe a strange behaviour: 1) The beta values tend to grow to very high values quite fast, up to a point where they overflow and go negative. The overflow is not a problem; I don't need the last part of the analysis anyway, but why do they just
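Since the end of the path is not needed, one possible workaround (assuming the max.steps argument in your lars version) is to stop the algorithm early; the cutoff of 100 steps is an arbitrary illustration, and x and y are placeholders:

library(lars)
# stop after 100 steps so the unstable late part of the p >> n path
# is never computed
fit <- lars(x, y, type = "lasso", max.steps = 100)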
2011 May 28
1
Questions regarding the lasso and glmnet
Hi all. Sorry for the long email. I have been trying to find someone local to work on this with me, without much luck. I went in to our local stats consulting service here, and the guy there told me that I already know more about model selection than he does. :-< He pointed me towards another professor that can perhaps help, but that prof is busy until mid-June, so I want to get as much
2013 Jul 17
1
glmnet on Autopilot
Dear List, I'm running simulations using the glmnet package. I need to use an 'automated' method for model selection at each iteration of the simulation. The cv.glmnet function in the same package is handy for that purpose. However, in my simulation I have p >> N, and in some cases the selected model from cv.glmnet is essentially shrinking all coefficients to zero. In this case,
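For reference, the two automated choices cv.glmnet exposes are lambda.min and the more conservative lambda.1se; a minimal sketch (x and y are placeholders):

library(glmnet)
cvfit <- cv.glmnet(x, y)             # 10-fold CV over the lambda path
coef(cvfit, s = "lambda.min")        # model at the CV-minimizing penalty
coef(cvfit, s = "lambda.1se")        # sparser model within one SE of the minimum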
2006 Mar 02
0
glmpath (new version 0.91)
We have uploaded to CRAN a new version of glmpath, a package which fits the L1 regularization path for generalized linear models. The revision includes:
- coxpath, a function for fitting the L1-regularization path for the Cox PH model;
- bootstrap functions for analyzing sparse solutions;
- the ability to mix in L2 regularization along with L1 (elastic net).
We have also completed a report that
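If I recall the interface correctly, basic usage looks roughly like the following (heart.data is an example dataset I believe ships with glmpath; verify against the package documentation):

library(glmpath)
data(heart.data)                                      # assumed example data with $x and $y
fit <- glmpath(heart.data$x, heart.data$y, family = binomial)
plot(fit)                                             # L1 regularization path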
2014 Jun 23
0
LASSO coefficients for a specific s
Hi Bruno, You advised the following (no need to find the best s value; CV does it for you):
cvres <- cv.lars(X, Y, K = 10, type = 'lasso')
sAtBest <- cvres$fraction[which.min(cvres$cv)]
fits <- predict.lars(object, type = "coefficients", s = sAtBest, mode = "fraction")
in thread https://stat.ethz.ch/pipermail/r-help/2007-June/133982.html
My problem is that cvres$cv is non empty