Displaying 20 results from an estimated 8000 matches similar to: "Any package which takes the likelihood function and do Lasso fitting?"
2010 Dec 06
2
How to get lasso fit coefficient(given penalty tuning parameter \lambda) using lars package
Hi, all,
I am using the lars package for lasso estimation, so I first obtain a lasso
fit:
lassofit = lars(x, y, type = "lasso", normalize = TRUE, intercept = TRUE)
Then I want to get the coefficients for a certain value of \lambda
(the tuning parameter). I know lars has three mode options, c("step",
"fraction", "norm"), but can I use the \lambda value instead
2012 May 05
0
penalized quantile regression (rq.fit.lasso)
Dear all:
I have a question about how to get the optimal estimate of the coefficients
using penalized quantile regression (the LASSO penalty in quantile
regression defined in Koenker 2005).
In R, I found that both
rq(y ~ x, method = "lasso", lambda = 30) and
rq.fit.lasso(x, y, tau = 0.5, lambda = 1, beta = .9995, eps = 1e-06)
can give the estimates, but I didn't find a way using either of
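A minimal sketch of the formula interface, assuming the quantreg package; tau = 0.5 and lambda = 30 are the illustrative values from the post, not recommendations.
library(quantreg)
# penalized (lasso) quantile regression at the median
fit <- rq(y ~ x, tau = 0.5, method = "lasso", lambda = 30)
coef(fit)   # penalized coefficient estimates at this lambda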
2012 Jun 16
0
Selecting correlated predictors with LASSO
I'm using the package 'lars' in R with the following code:
> library(lars)
> set.seed(3)
> n <- 1000
> x1 <- rnorm(n)
> x2 <- x1+rnorm(n)*0.5
> x3 <- rnorm(n)
> x4 <- rnorm(n)
> x5 <- rexp(n)
> y <- 5*x1 + 4*x2 + 2*x3 + 7*x4 + rnorm(n)
> x <- cbind(x1,x2,x3,x4,x5)
> cor(cbind(y,x))
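With x1 and x2 as strongly correlated as in this simulation, the lasso tends to pick one of the pair more or less arbitrarily. One commonly suggested alternative, not mentioned in the post, is the elastic net via glmnet; a sketch using the same x and y:
library(glmnet)
# alpha = 0.5 mixes the ridge and lasso penalties, which tends to keep
# correlated predictors together rather than dropping one of them
cvfit <- cv.glmnet(x, y, alpha = 0.5)
coef(cvfit, s = "lambda.min")   # coefficients at the CV-chosen lambda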
2011 Jun 06
1
Lasso for k-subset regression
Dear R-users
I'm trying to use the lasso in the lars package for subset regression. I have a
large matrix of size 1000x100 and my aim is to select a subset of k of the 100
variables.
Is there any way in lars to fix the number k (i.e. to select the best 10
variables)?
library(lars)
aa=lars(X,Y,type="lasso",max.steps=200)
plot(aa,plottype="Cp")
aa$RSS
which.min(aa$RSS)
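lars() has no argument that fixes k directly; a sketch of one workaround, assuming the goal is the first point on the path with exactly k nonzero coefficients (k = 10 here):
library(lars)
aa <- lars(X, Y, type = "lasso", max.steps = 200)
k <- 10
nnz <- rowSums(coef(aa) != 0)   # number of nonzero coefficients at each step
step_k <- which(nnz == k)[1]    # first step with exactly k selected variables
coef(aa)[step_k, ]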
2010 Jan 06
0
parcor 0.2-2 - Regularized Partial Correlation Matrices with (adaptive) Lasso, PLS, and Ridge Regression
Dear R-users,
we are happy to announce the release of our R package parcor.
The package contains tools to estimate the matrix of partial
correlations based on different regularized regression methods: Lasso,
adaptive Lasso, PLS, and Ridge Regression. In addition, parcor provides
cross-validation based model selection for Lasso, adaptive Lasso and
Ridge Regression.
More details can be found
2017 Jul 28
0
Need help on the Lasso cox model with discrete time
Hi everyone,
We have been trying to construct a lasso-Cox model with discrete time. We conducted follow-up examinations for epileptic attacks after surgical tumor resection among glioma patients. The patients are followed up at 6/12/24 months after resection, which makes the epilepsy-free time discrete (6/12/24 months). We calculated many features from the T2 images
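One common workaround, not taken from the original post, is to expand each patient into person-period records and fit a penalized discrete-time hazard (logistic) model with glmnet; dat$time, dat$event, and feature_cols below are hypothetical names for the follow-up time, event indicator, and imaging-feature columns.
library(glmnet)
periods <- c(6, 12, 24)
# one row per patient per follow-up visit, up to the event/censoring time
long <- do.call(rbind, lapply(seq_len(nrow(dat)), function(i) {
  visits <- periods[periods <= dat$time[i]]
  cbind(dat[rep(i, length(visits)), feature_cols, drop = FALSE],
        period = factor(visits, levels = periods),
        y = as.integer(visits == dat$time[i] & dat$event[i] == 1))
}))
x <- model.matrix(~ . - y, data = long)[, -1]   # features plus period dummies
fit <- cv.glmnet(x, long$y, family = "binomial", alpha = 1)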
2017 Oct 31
0
lasso and ridge regression
Dear All
The problem is about regularization methods in multiple regression when the
independent variables are collinear. A modified regularization method is
proposed, with two tuning parameters l1 and l2 (lambda 1 and lambda 2) and
their product l1*l2, such that l1 takes care of the ridge property and l2
takes care of the LASSO property.
The proposed method is given
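For comparison, the standard two-penalty combination of ridge and lasso is the elastic net, available in glmnet; a sketch with placeholder x and y, where alpha controls the ridge/lasso mix and lambda the overall penalty strength:
library(glmnet)
fit <- glmnet(x, y, alpha = 0.5)   # alpha = 0 is pure ridge, alpha = 1 is pure lasso
plot(fit, xvar = "lambda")         # coefficient paths against log(lambda)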
2011 May 28
1
Questions regarding the lasso and glmnet
Hi all. Sorry for the long email. I have been trying to find someone local to work on this with me, without much luck. I went to our local stats consulting service here, and the guy there told me that I already know more about model selection than he does. :-< He pointed me towards another professor who can perhaps help, but that prof is busy until mid-June, so I want to get as much
2007 Jun 12
1
LASSO coefficients for a specific s
Hello,
I have a question about the lars package. I am using this package to get the coefficients at a specific LASSO parameter s.
data(diabetes)
attach(diabetes)
object <- lars(x,y,type="lasso")
cvres<-cv.lars(x,y,K=10,fraction = seq(from = 0, to = 1, length = 100))
fits <- predict.lars(object, type="coefficients", s=0.1, mode="fraction")
Can I assign
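A sketch of picking s from the cross-validation output and then reading off the coefficients, assuming a lars version in which cv.lars() returns the fractions in $index and the CV errors in $cv:
library(lars)
data(diabetes)
object <- lars(diabetes$x, diabetes$y, type = "lasso")
cvres <- cv.lars(diabetes$x, diabetes$y, K = 10, fraction = seq(0, 1, length = 100))
s_best <- cvres$index[which.min(cvres$cv)]   # fraction with minimal CV error
fits <- predict(object, type = "coefficients", s = s_best, mode = "fraction")
fits$coefficients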
2012 Jun 05
1
Piecewise Lasso Regression
Hi All,
I am trying to fit a piecewise lasso regression, but the segmented package does not work with lars objects.
Does anyone know of a package or implementation of piecewise lasso regression?
Thanks,
Lucas
2007 Aug 02
2
lasso/lars error
I'm having the exact problem outlined in a previous post from 2005 -
unfortunately the post was never answered:
http://tolstoy.newcastle.edu.au/R/help/05/10/15055.html
When running:
lm2=lars(x2,y,type="lasso",use.Gram=F)
I get an error:
Error in if (zmin < gamhat) { : missing value where TRUE/FALSE needed
...when running lasso via lars() on a 67x3795 set of predictors. I
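The message itself means the if() condition inside lars evaluated to NA; one possible diagnostic (not a confirmed fix for this post) is to check the 67x3795 predictor matrix for non-finite entries and zero-variance columns before the call:
library(lars)
any(!is.finite(x2))          # missing or infinite predictor values?
sum(apply(x2, 2, sd) == 0)   # constant (zero-variance) columns?
keep <- apply(x2, 2, sd) > 0
lm2 <- lars(x2[, keep], y, type = "lasso", use.Gram = FALSE)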
2012 Mar 16
0
How to interpret glmnet lasso error
I get an error when I try to use glmnet to fit a lasso model on some data.
My code:
> lasso <- glmnet(predictorPartitionTrainingM, targetPartitionTraining,
alpha=1)
The error that is returned:
Error in elnet(x, is.sparse, ix, jx, y, weights, offset, type.gaussian, :
NA/NaN/Inf in foreign function call (arg 5)
Some potentially important details:
- 50 predictor variables
- 300
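That particular error usually indicates missing or non-finite values reaching glmnet's compiled code; a quick pre-check on the inputs named in the post:
library(glmnet)
x <- as.matrix(predictorPartitionTrainingM)
y <- targetPartitionTraining
any(!is.finite(x))   # TRUE means NA/NaN/Inf somewhere in the predictors
any(!is.finite(y))   # same check for the response
ok <- complete.cases(x) & is.finite(y)
lasso <- glmnet(x[ok, ], y[ok], alpha = 1)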
2006 Nov 23
0
lasso for AFT model
Hi all,
I want to apply the lasso method in an AFT (accelerated failure time) model. Can anybody
help me with how to get a lasso estimate using an AFT model?
Hossain
2017 Jul 08
0
Zero inflated Binomial Lasso
Hi R helpers,
I have a problem involving a zero-inflated binomial distribution.
I have many regressors and few observations, to which I want to apply
LASSO regression.
Is there a package that supports a ZIB lasso?
Thank you very much!
2006 Aug 14
1
lasso for variable selection
For "importance" it's probably best to stick with absolute values of
coefficients, instead of value of the penalty parameter for which the
coefficients changed to non-zero.
Friedman skipped a lot of details on his rule ensemble in that talk, due to
time constraint. In his implementation he was using his own algorithm,
PathSeeker, for which paper and software are available on his
2009 Dec 16
0
lasso regression coefficients
Dear list,
I have been trying to apply a simple lasso regression on a 10-element
vector, just to see how this method works so as to later implement it on
larger datasets. I thus create an input vector x:
x = rnorm(10)
I add some noise
noise = runif(n = 10, min = -0.1, max = 0.1)
and I create a simple linear model which calculates my output vector y
y = 2*x + 1 + noise
I then do
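A sketch of finishing this toy example with glmnet, which requires a predictor matrix with at least two columns, so a second pure-noise column is added purely for illustration (that column, and the lambda value, are not from the post):
library(glmnet)
set.seed(1)
x <- rnorm(10)
noise <- runif(n = 10, min = -0.1, max = 0.1)
y <- 2*x + 1 + noise
X <- cbind(x, junk = rnorm(10))   # glmnet needs >= 2 predictor columns
fit <- glmnet(X, y, alpha = 1)
coef(fit, s = 0.01)               # coefficients at a small illustrative lambda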
2013 May 04
2
Lasso Regression error
Hi all,
I have a data set containing variables LOSS, GDP, HPI and UE.
(I have attached it in case it is required).
Having renamed the variables as l, g, h and u, I wish to run a lasso
regression with l as the dependent variable and the other three as the
independent variables.
data=read.table("data.txt", header=T)
l=data$LOSS
h=data$HPI
u=data$UE
g=data$GDP
matrix=data.frame(l,g,h,u)
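The snippet ends with the predictors collected in a data frame; lars() expects a numeric matrix, so if the error comes from passing the data frame directly, a sketch of the fix (using the variable names from the post) would be:
library(lars)
data <- read.table("data.txt", header = TRUE)
X <- as.matrix(data[, c("GDP", "HPI", "UE")])   # predictors as a numeric matrix
y <- data$LOSS
fit <- lars(X, y, type = "lasso")
coef(fit)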
2009 Aug 21
1
LASSO: glmpath and cv.glmpath
Hi,
Perhaps you can help me find out how to find the best lambda in a
LASSO model.
I have a feature selection problem with 150 proteins potentially
predicting cancer or non-cancer. With a lasso model
fit.glm <- glmpath(x = as.matrix(X), y = target, family = "binomial")
(target is coded 0/1 for cancer vs. non-cancer; X contains the proteins as
numerical expression values), I get the following path (PICTURE
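One common way to choose lambda here, not taken from the post itself, is k-fold cross-validation; a sketch with cv.glmnet, which reports the lambda minimizing the cross-validated binomial deviance:
library(glmnet)
cvfit <- cv.glmnet(as.matrix(X), target, family = "binomial", alpha = 1)
cvfit$lambda.min                # lambda with minimal cross-validated error
coef(cvfit, s = "lambda.min")   # proteins retained at that lambda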
2012 Mar 27
2
lasso constraint
In the package lasso2 there is a Prostate data set. To find the coefficients in the
prostate cancer example we can impose an L1 constraint on the parameters.
The code is:
data(Prostate)
p.mean <- apply(Prostate, 2, mean)
pros <- sweep(Prostate, 2, p.mean, "-")
p.std <- apply(pros, 2, var)
pros <- sweep(pros, 2, sqrt(p.std), "/")
pros[, "lpsa"] <-