Displaying 20 results from an estimated 70000 matches similar to: "ANOVA for glmnet"
2012 May 28
0
GLMNET AUC vs. MSE
Hello -
I am using glmnet to generate a model for multiple cohorts i. For each i, I
run 5 separate models, each with a different x variable. I want to compare
the fit statistic for each i and x combination.
When I use auc, the output in some cases is < .5 (.49). In addition, if
I compare mean MSE (with upper and lower bounds) ... there is no difference
across my various x variables, but
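An out-of-fold AUC at or just below 0.5 usually just means the model has essentially no discriminative signal at that lambda. A minimal sketch of the comparison described above, with simulated stand-ins for x and y; type.measure switches the loss that cv.glmnet reports:

library(glmnet)
set.seed(1)
x <- matrix(rnorm(200 * 10), 200, 10)                      # stand-in predictors
y <- rbinom(200, 1, plogis(x[, 1]))                        # stand-in binary outcome
cv.auc <- cv.glmnet(x, y, family = "binomial", type.measure = "auc")
cv.mse <- cv.glmnet(x, y, family = "binomial", type.measure = "mse")
max(cv.auc$cvm)   # best cross-validated AUC (larger is better)
min(cv.mse$cvm)   # best cross-validated MSE (smaller is better)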
2011 Jul 22
4
glmnet with binary logistic regression
Hi all,
I am using the glmnet R package to run LASSO with binary logistic
regression. I have over 290 samples with outcome data (0 for alive, 1 for
dead) and over 230 predictor variables. I am currently using LASSO to reduce
the number of predictor variables.
I am using the cv.glmnet function to do 10-fold cross validation on a
sequence of lambda values which I let glmnet determine. I then take
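A minimal sketch of that workflow, with simulated stand-ins for the 290 x 230 predictor matrix and the 0/1 outcome:

library(glmnet)
set.seed(1)
x <- matrix(rnorm(290 * 230), 290, 230)       # stand-in for the predictor matrix
y <- rbinom(290, 1, 0.5)                      # stand-in for the alive/dead outcome
cvfit <- cv.glmnet(x, y, family = "binomial", alpha = 1, nfolds = 10)  # lasso; glmnet picks the lambda sequence
plot(cvfit)                                   # CV deviance across the lambda path
sel <- coef(cvfit, s = "lambda.1se")          # sparse coefficients at the 1-SE lambda
sum(sel != 0) - 1                             # number of predictors retained (excluding the intercept)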
2013 Jul 17
1
glmnet on Autopilot
Dear List,
I'm running simulations using the glmnet package. I need to use an
'automated' method for model selection at each iteration of the simulation.
The cv.glmnet function in the same package is handy for that purpose.
However, in my simulation I have p >> N, and in some cases the selected
model from cv.glmnet is essentially shrinking all coefficients to zero. In
this case,
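A hedged sketch of one possible 'autopilot' rule for such a simulation loop; falling back from lambda.1se to lambda.min when everything is shrunk to zero is my assumption, not advice from the thread:

library(glmnet)
set.seed(1)
n <- 50; p <- 1000                              # p >> N, as in the simulation
x <- matrix(rnorm(n * p), n, p)
y <- drop(x[, 1:3] %*% c(2, -2, 2)) + rnorm(n)
cvfit <- cv.glmnet(x, y)                        # gaussian lasso by default
b <- coef(cvfit, s = "lambda.1se")
if (sum(b != 0) <= 1)                           # only the intercept survived
  b <- coef(cvfit, s = "lambda.min")            # fall back to the less conservative lambda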
2011 May 01
1
Different results of coefficients by packages penalized and glmnet
Dear R users:
Recently, I have been learning to use penalized logistic regression. Two packages
(penalized and glmnet) provide the lasso.
So I wrote the code below. However, I got different coefficient estimates. Can someone
kindly explain.
# lasso using penalized
library(penalized)
pena.fit2 <- penalized(HRLNM, penalized = ~ CN + NoSus, lambda1 = 1,
                       model = "logistic", standardize = TRUE)
pena.fit2
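One likely source of the discrepancy (my reading, not confirmed in the thread) is that penalized() penalizes the unscaled log-likelihood while glmnet() scales the log-likelihood by 1/n, so lambda1 = 1 in penalized corresponds roughly to lambda = 1/n in glmnet; the standardization defaults also differ. A sketch with simulated stand-ins for HRLNM, CN and NoSus:

library(penalized)
library(glmnet)
set.seed(1)
n <- 200
CN <- rnorm(n); NoSus <- rnorm(n)
HRLNM <- rbinom(n, 1, plogis(CN - NoSus))                  # stand-in response
pen.fit <- penalized(HRLNM, penalized = ~ CN + NoSus, lambda1 = 1,
                     model = "logistic", standardize = TRUE)
glmn.fit <- glmnet(cbind(CN, NoSus), HRLNM, family = "binomial",
                   alpha = 1, lambda = 1 / n, standardize = TRUE)
coefficients(pen.fit, "all")
coef(glmn.fit)

Even then the estimates need not agree exactly, because the two packages apply standardization to the penalty in different ways.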
2011 May 28
1
Questions regrading the lasso and glmnet
Hi all. Sorry for the long email. I have been trying to find someone local to work on this with me, without much luck. I went in to our local stats consulting service here, and the guy there told me that I already know more about model selection than he does. :-< He pointed me towards another professor that can perhaps help, but that prof is busy until mid-June, so I want to get as much
2009 Mar 17
1
- help - predicting with glmnet/lars for dataframes with different nrow than the train set
Hello
I'm having trouble using the lars and glmnet functions to predict on a new data
set with a different nrow than the original:
for instance:
=============
log.1 <- glm(TL ~ ., data = temp.data, family = binomial, x = TRUE, y = TRUE)
nrow(test.data) != nrow(temp.data)  # == TRUE
Val.frame <- model.frame(log.1, test.data)  # a data frame with the variables needed to use log.1
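For glmnet (or lars) the usual route is to build the design matrices for the training and test sets from the same formula and pass the test matrix as newx to predict(). A sketch with hypothetical stand-ins for temp.data and test.data (both assumed to contain the response TL):

library(glmnet)
set.seed(1)
temp.data <- data.frame(TL = rbinom(100, 1, 0.5), a = rnorm(100), b = rnorm(100))  # stand-in train set
test.data <- data.frame(TL = rbinom(40, 1, 0.5),  a = rnorm(40),  b = rnorm(40))   # different nrow
x.train <- model.matrix(TL ~ ., temp.data)[, -1]      # drop the intercept column
x.test  <- model.matrix(TL ~ ., test.data)[, -1]      # same columns, different number of rows
fit <- glmnet(x.train, temp.data$TL, family = "binomial")
predict(fit, newx = x.test, s = 0.01, type = "response")   # s is an arbitrary lambda for illustration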
2011 Aug 10
2
glmnet
Hi All,
I have been trying to use the glmnet package to do LASSO linear regression. My x data is an n_row by n_col matrix and y is a response vector of length n_row. n_col is much larger than n_row. I do the following:
fits <- glmnet(x, y, family = "multinomial")
I have been following this
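If y is a continuous response, the lasso for linear regression is family = "gaussian" (the default); "multinomial" is meant for a categorical outcome with more than two classes. A minimal sketch with simulated data:

library(glmnet)
set.seed(1)
x <- matrix(rnorm(50 * 500), 50, 500)                   # n_col much larger than n_row
y <- drop(x[, 1:5] %*% rep(1, 5)) + rnorm(50)
fit   <- glmnet(x, y, family = "gaussian", alpha = 1)   # lasso path for linear regression
cvfit <- cv.glmnet(x, y)                                # pick lambda by cross-validation
coef(cvfit, s = "lambda.min")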
2013 Jul 06
1
problem with BootCV for coxph in pec after feature selection with glmnet (lasso)
Hi,
I am attempting to evaluate the prediction error of a coxph model that was
built after feature selection with glmnet.
In the preprocessing stage I used na.omit(dataset) to remove NAs.
I reconstructed all my factor variables into binary variables with dummies
(using model.matrix)
I then used glmnet lasso to fit a cox model and select the best performing
features.
Then I fit a coxph model
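A hedged sketch of that pipeline with simulated placeholders; glmnet's Cox model takes the response as a two-column matrix with columns named time and status:

library(glmnet)
library(survival)
set.seed(1)
n <- 200; p <- 30
x <- matrix(rnorm(n * p), n, p, dimnames = list(NULL, paste0("v", 1:p)))  # e.g. output of model.matrix()
y <- cbind(time = rexp(n, exp(0.4 * x[, 1])), status = rbinom(n, 1, 0.7))
cvfit <- cv.glmnet(x, y, family = "cox")
cf    <- as.matrix(coef(cvfit, s = "lambda.min"))
keep  <- rownames(cf)[cf != 0]                          # features selected by the lasso
dat   <- data.frame(time = y[, "time"], status = y[, "status"], x[, keep, drop = FALSE])
cox.fit <- coxph(Surv(time, status) ~ ., data = dat)    # refit with coxph, e.g. for use in pec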
2011 Dec 13
0
bug in glmnet 1.7.1 for multinomial when alpha=0?
Dear all,
If I am not mistaken, I think that I have found a bug in glmnet 1.7.1 (latest version) for multinomial when alpha=0. Here is the code
> library(glmnet)
Loading required package: Matrix
Loading required package: lattice
Loaded glmnet 1.7.1
> x=matrix(rnorm(40*500),40,500)
> g4=sample(1:7,40,replace=TRUE)
> fit=glmnet(x,g4,family="multinomial",alpha=0)
>
2012 May 07
1
estimating survival times with glmnet and coxph
Dear all,
I am using glmnet (Coxnet) to build a Cox model and
to make actual predictions, i.e. to estimate the survival function S(t, Xn) for a
new subject Xn. If I am not mistaken, glmnet (coxnet) returns beta, beta*X and
exp(beta*X), which on their own cannot generate S(t, Xn). We are missing the baseline
survival function S0(t).
Below is my code which takes beta coefficients from
glmnet and creates coxph
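One way to recover S(t, Xn), sketched here under my own assumptions rather than as the thread's exact code, is to refit coxph() with the glmnet coefficients held fixed (init = together with iter.max = 0) and let survfit() supply the baseline survival:

library(glmnet)
library(survival)
set.seed(1)
n <- 200
x <- matrix(rnorm(n * 3), n, 3, dimnames = list(NULL, c("x1", "x2", "x3")))
y <- cbind(time = rexp(n, exp(0.5 * x[, 1])), status = rbinom(n, 1, 0.8))
cvfit <- cv.glmnet(x, y, family = "cox")
beta  <- as.numeric(as.matrix(coef(cvfit, s = "lambda.min")))
d <- data.frame(time = y[, "time"], status = y[, "status"], x)
# hold the glmnet coefficients fixed; coxph may warn about non-convergence, which is expected here
frozen <- coxph(Surv(time, status) ~ x1 + x2 + x3, data = d,
                init = beta, control = coxph.control(iter.max = 0))
Xn <- data.frame(x1 = 0.2, x2 = -1, x3 = 0.5)           # hypothetical new subject
sf <- survfit(frozen, newdata = Xn)                      # estimated S(t, Xn)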
2011 Feb 17
1
cv.glmnet errors
Hi,
I am trying to do multinomial regression using the glmnet package, but the
following gives me an error (for no reason apparent to me):
library(glmnet)
cv.glmnet(x = matrix(c(1,2,3,4,5,6,1,2,3,4,5,6), nrow = 6),
          y = as.factor(c(1,2,1,2,3,3)),
          family = 'multinomial', alpha = 0.5, nfolds = 2)
The error i get is:
Error in if (outlist$msg != "Unknown error") return(outlist) :
argument is of
2011 Mar 25
2
A question on glmnet analysis
Hi,
I am trying to do logistic regression on data from 104 patients, which
have one outcome (yes or no) and 15 variables (9 categorical factors
[yes or no] and 6 continuous variables). The number of yes outcomes is 25.
Twenty-five events and 15 variables mean the events-per-variable ratio is much
less than 10. Therefore, I tried to analyze the data with a penalized
regression method. I would like, please, some of the
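A hedged sketch of how such data could be handed to glmnet, with simulated stand-ins for the 9 yes/no factors, 6 continuous variables and roughly 25 events; model.matrix() turns the factors into a numeric design matrix:

library(glmnet)
set.seed(1)
n <- 104
dat <- data.frame(matrix(rnorm(n * 6), n, 6))            # 6 continuous variables
names(dat) <- paste0("c", 1:6)
for (j in 1:9)                                           # 9 yes/no factors
  dat[[paste0("f", j)]] <- factor(sample(c("no", "yes"), n, replace = TRUE))
y <- rbinom(n, 1, 25 / 104)                              # roughly 25 events
x <- model.matrix(~ ., dat)[, -1]                        # numeric design matrix, intercept dropped
cvfit <- cv.glmnet(x, y, family = "binomial", alpha = 1) # lasso; alpha = 0 would give ridge instead
coef(cvfit, s = "lambda.1se")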
2011 Aug 23
1
Glmnet lambda value choice
Hi,
When using the glmnet() function of the glmnet package, a series of coefficients is returned for a sequence of descending lambda values.
I am unable to locate anything in the documentation that explains HOW this lambda sequence is chosen. (There is documentation about how to choose my own, but I want to understand how the authors are doing it.)
Any ideas?
--
Noah Silverman
UCLA
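Per the glmnet documentation, the largest lambda is the smallest value at which all (penalized) coefficients are zero, and the rest of the sequence is nlambda values equally spaced on the log scale down to lambda.min.ratio times that maximum. The closed-form entry value below is my reading of the gaussian/lasso case, so treat it as an assumption:

library(glmnet)
set.seed(1)
x <- matrix(rnorm(100 * 20), 100, 20)
y <- drop(x[, 1:2] %*% c(1, -1)) + rnorm(100)
fit <- glmnet(x, y)                           # defaults: nlambda = 100, standardize = TRUE
fit$df[1]                                     # 0: at the largest lambda every coefficient is zero
range(diff(log(fit$lambda)))                  # essentially constant: equally spaced on the log scale
# entry value for the gaussian lasso: max_j |x_j' y| / n, with glmnet-style (1/n) standardization
sx <- scale(x, center = TRUE, scale = sqrt(colMeans(scale(x, scale = FALSE)^2)))
max(abs(crossprod(sx, y - mean(y)))) / length(y)   # should (approximately) match fit$lambda[1]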
2009 Oct 30
0
different L2 regularization behavior between lrm, glmnet, and penalized? (original question)
Dear Robert,
The differences have to do with different scaling defaults.
lrm by default standardizes the covariates to unit sd before applying
penalization. penalized by default does not do any standardization, but
if asked standardizes on unit second central moment. In your example:
x = c(-2, -2, -2, -2, -1, -1, -1, 2, 2, 2, 3, 3, 3, 3)
z = c(0, 0, 0, 1, 0, 0, 1, 0, 1, 1, 0, 1, 1, 1)
You
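A hedged sketch of how one might see this on the example data; lambda2 = 1 is an arbitrary ridge penalty rather than a value from the thread, and glmnet is left out because it requires at least two predictor columns:

library(rms)        # lrm
library(penalized)
x <- c(-2, -2, -2, -2, -1, -1, -1, 2, 2, 2, 3, 3, 3, 3)
z <- c(0, 0, 0, 1, 0, 0, 1, 0, 1, 1, 0, 1, 1, 1)
coef(lrm(z ~ x, penalty = 1))                                  # standardizes x to unit sd internally
fit.raw <- penalized(z, ~ x, lambda2 = 1, model = "logistic")  # no standardization by default
fit.std <- penalized(z, ~ x, lambda2 = 1, model = "logistic", standardize = TRUE)  # unit second central moment
coefficients(fit.raw, "all")
coefficients(fit.std, "all")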
2009 Apr 07
1
R segfaulting with glmnet on some data, not other
Hello R-help list,
I have a piece of code written by a grad student here at BU which will segfault when using one data set, but complete just fine using another. Both sets are just text files full of real numbers.
It seems like a bug within R. It could be an issue with her data, but again,
the data are just a bunch of floats, so at worst they are triggering a bug
within R. I have tried this
2010 Jul 08
1
glmnet - choosing the number of features
Hi,
I am trying to use the glmnet package to do some simple feature selection.
However, I would ideally like to be able to specify the number of features
to return (the glmnet package, as far as I can tell, only allows
specification of a regularization parameter, lambda, that in turn returns a
model with a specific number of non-zero features).
Is there a straightforward way of calculating the
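One common workaround, offered here as an assumption rather than something from the thread, is to fit the whole path and pick the lambda whose active-set size (fit$df) is closest to the desired count; because several variables can enter between grid points, the exact count is not always attainable:

library(glmnet)
set.seed(1)
x <- matrix(rnorm(100 * 50), 100, 50)
y <- drop(x[, 1:5] %*% rep(1, 5)) + rnorm(100)
fit <- glmnet(x, y, alpha = 1)
k   <- 10                                            # desired number of features
lam <- fit$lambda[which.min(abs(fit$df - k))]        # lambda whose active set is closest to k
sum(coef(fit, s = lam) != 0) - 1                     # selected features at that lambda (may differ slightly from k)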
2009 Oct 14
1
different L2 regularization behavior between lrm, glmnet, and penalized?
The following R code using different packages gives the same results for a
simple logistic regression without regularization, but different results
with regularization. This may just be a matter of different scaling of the
regularization parameters, but if anyone familiar with these packages has
insight into why the results differ, I'd appreciate hearing about it. I'm
new to
2013 Mar 02
0
glmnet 1.9-3 uploaded to CRAN (with intercept option)
This update adds an intercept option (by popular request) - now one can fit a model without an intercept.
Glmnet is a package that fits the regularization path for a number of generalized linear models, with "elastic net"
regularization (a tunable mixture of L1 and L2 penalties). Glmnet uses pathwise coordinate descent, and is very fast.
The current list of models covered is:
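A minimal illustration of the new option on simulated data:

library(glmnet)
set.seed(1)
x <- matrix(rnorm(100 * 5), 100, 5)
y <- drop(x %*% rnorm(5)) + 3 + rnorm(100)
with.int    <- glmnet(x, y)                      # default: intercept = TRUE
without.int <- glmnet(x, y, intercept = FALSE)   # new in 1.9-3: force the intercept to zero
coef(with.int,    s = 0.1)[1]                    # fitted intercept
coef(without.int, s = 0.1)[1]                    # 0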
2012 Mar 21
1
glmnet() vs. lars()
dear all,
It appears that glmnet(), when "selecting" the covariates entering the
model, skips from K covariates, say, to K+2 or K+3. Thus 2 or 3
variables are "added" at the same time and it is not possible to obtain
a ranking of the covariates according to their importance in the model.
On the other hand lars() "adds" the covariates one at a time.
My question
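A hedged sketch of the contrast: lars() computes the exact lasso path, with one variable entering or leaving at each breakpoint, while glmnet() evaluates the path only on a discrete lambda grid, so the active set can grow by 2 or 3 between adjacent grid points; a denser grid (larger nlambda, smaller lambda.min.ratio) reduces, but does not eliminate, the jumps:

library(glmnet)
library(lars)
set.seed(1)
x <- matrix(rnorm(100 * 30), 100, 30)
y <- drop(x[, 1:5] %*% (1:5)) + rnorm(100)
g <- glmnet(x, y, alpha = 1)
diff(g$df)                            # occasionally 2 or 3: several variables appear between grid points
l <- lars(x, y, type = "lasso")
unlist(l$actions)                     # exact entry/exit order, one variable per step
g2 <- glmnet(x, y, nlambda = 1000)    # a denser grid tracks the entry order more closely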