similar to: predict.gl1ce question

Displaying 9 results from an estimated 9 matches similar to: "predict.gl1ce question"

2007 Jul 25
1
question on using "gl1ce" from "lasso2" package
Hi, I tried several settings using "family=gaussian" in "gl1ce", but none of them works, whereas "glm" works for the same case. Here is the error message I got: > glm(Petal.Width~Sepal.Length+Sepal.Width+Petal.Length ,data=iris,family=gaussian()) > gl1ce(Petal.Width~Sepal.Length+Sepal.Width+Petal.Length ,data=iris,family=gaussian()) Error in eval(expr, envir,
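A minimal, self-contained sketch of the usual lasso2 setup for this kind of model, assuming the lasso2 package is installed and that l1ce() (the L1-constrained least-squares fitter) is the intended tool for a Gaussian response, with gl1ce() aimed at the non-Gaussian GLM families:

  library(lasso2)
  data(iris)

  # ordinary least-squares GLM, for comparison
  glm(Petal.Width ~ Sepal.Length + Sepal.Width + Petal.Length,
      data = iris, family = gaussian())

  # L1-constrained least squares; 'bound' limits sum(|beta|) relative to
  # the unconstrained fit (the bound is relative by default)
  l1.fit <- l1ce(Petal.Width ~ Sepal.Length + Sepal.Width + Petal.Length,
                 data = iris, bound = 0.5)
  coef(l1.fit)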
2007 Nov 09
1
help with lasso2 package
X is a matrix and F is a vector.
F2 <- data.frame(cbind(X,F))
F2
            V1         V2        V3         F
1 -0.250536332 -1.4755883 1.9580974 -2.136487
2 -0.009856084  0.4953269 0.5486092 -2.744482
3 -0.406962682  0.7729631 0.1861905 -2.891821
4  1.938780097  0.7469251 1.2537781 -1.212992
5 -0.332370358  1.1943637 0.7114278 -1.830441
modF <- formula(F ~ V1 + V2 + V3)  # no error message
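The excerpt cuts off before the question itself, so the following is only a hedged reconstruction of the setup with toy data, showing one way to build the data frame and pass the formula on to l1ce() (as a general caution, F also abbreviates FALSE in R, so a different response name avoids ambiguity):

  library(lasso2)
  set.seed(1)

  X <- matrix(rnorm(15), ncol = 3)   # stand-in for the poster's X (5 x 3)
  F <- rnorm(5)                      # stand-in for the poster's F
  F2 <- data.frame(cbind(X, F))      # columns come out as V1, V2, V3, F
  modF <- formula(F ~ V1 + V2 + V3)  # no error message, as in the post

  fit <- l1ce(modF, data = F2, bound = 0.5)
  coef(fit)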
2004 May 11
1
How to use C routines in an existing package?
Hi all, I want to know some details about the C routine "lasso" used by the "gl1ce()" function. However, I have the following troubles. First, I cannot find the routine in the local directories of this function (or package). Second, if I find the routine, can I call it in my own functions just like this: fit <- .C("lasso", …, PACKAGE = "lasso2")? My system is
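Two general points, offered as a sketch rather than the thread's actual answer: the C sources of a CRAN package live in the src/ directory of the package *source* tarball (an installed binary library only carries the compiled shared object), and .C() does accept a PACKAGE argument as shown. The argument list of lasso2's "lasso" routine is not shown here, so the call below is left commented with purely illustrative arguments:

  library(lasso2)
  is.loaded("lasso")        # is the C symbol available in the loaded DLL/so?

  # General .C calling pattern -- the real interface must be read off
  # the C source in lasso2/src/ before filling in arguments:
  # fit <- .C("lasso",
  #           x = as.double(x), n = as.integer(length(x)),
  #           PACKAGE = "lasso2")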
2011 Sep 19
1
Constrained regressions (suggestions welcome)
All, Could anyone recommend a package that allows the user to constrain the coefficients from a multiple regression equation? I tried using the gl1ce function in lasso2, but couldn't get it to work. I created a contrived example to illustrate my starting point. data(cars) fmla <- formula(dist ~ speed) gl1c.E <- gl1ce(fmla, data = cars) gl1c.E gl1c.E <- gl1ce(fmla, data =
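For a Gaussian response such as dist ~ speed, one plausible reading (a sketch, not the thread's confirmed resolution) is that the L1-constrained fitter l1ce() with an explicit bound does what gl1ce() was being asked to do here:

  library(lasso2)
  data(cars)

  fmla <- formula(dist ~ speed)

  # constrain sum(|beta|) to 50% of its unconstrained value
  # (l1ce's bound is relative by default)
  gl1c.E <- l1ce(fmla, data = cars, bound = 0.5)
  coef(gl1c.E)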
2006 May 09
1
Question about match.fun()
Dear all, I was recently contacted by a user about an alleged problem/bug in the latest version of lasso2. After some investigation, we found out that it was a user error which boils down to the following: > x <- matrix(rnorm(200), ncol=2) > var <- "fred" > apply(x, 2, var) Error in get(x, envir, mode, inherits) : variable "fred" of mode "function"
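The error occurs because the local object var <- "fred" is what reaches apply(), and match.fun() then looks for a function named "fred". A short sketch of the two usual ways around the masking, assuming column variances were the goal:

  x <- matrix(rnorm(200), ncol = 2)
  var <- "fred"              # masks stats::var in this environment

  apply(x, 2, stats::var)    # refer to the function explicitly
  rm(var)
  apply(x, 2, var)           # or drop the masking object first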
2009 Apr 02
2
all subsets for glm
Dear R-users, For the purpose of model selection I am looking for a way to exhaustively (and efficiently) search for best subsets of predictor variables for a logistic regression model. I am looking for something like leaps() but that works with glm. Any feedback highly appreciated. -- Harald von Waldow <hvwaldow at chem.ethz.ch> Safety and Environmental Technology Group Institute for
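Two packages often suggested for exhaustive subset search over GLMs are bestglm and glmulti; below is a sketch with bestglm, assuming it is installed and that the data sit in a data frame with the predictors first and the 0/1 response as the last column (bestglm's expected Xy layout):

  library(bestglm)

  set.seed(1)                    # toy data with illustrative names
  Xy <- data.frame(x1 = rnorm(100), x2 = rnorm(100), x3 = rnorm(100),
                   y  = rbinom(100, 1, 0.5))

  fit <- bestglm(Xy, family = binomial, IC = "AIC")
  fit$BestModel                  # best logistic regression found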
2007 May 18
0
Cross-validation for logistic regression with lasso2
Hello, I am trying to shrink the coefficients of a logistic regression for a sparse dataset. I am using the lasso (lasso2) and am trying to determine the shrinkage factor by cross-validation. I would please like some of the experts here to tell me whether I'm doing it correctly or not. Below is my dataset and the functions I use:
w =
  a b c d e P   A
  0 0 0 0 0 1 879
  1 0 0 0 0 1   3
  0 1 0 0 0 7   7
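The poster's own cross-validation code is cut off above, so the following is only a generic k-fold sketch of choosing the bound for gl1ce() with a binomial family; it uses a hypothetical data frame dat with a 0/1 response y, and it assumes predict() on a gl1ce fit returns the linear predictor for newdata (check ?predict.gl1ce for the actual return scale):

  library(lasso2)

  cv.bound <- function(dat, bounds, k = 5) {
    folds <- sample(rep(seq_len(k), length.out = nrow(dat)))
    sapply(bounds, function(b) {
      dev <- 0
      for (i in seq_len(k)) {
        fit <- gl1ce(y ~ ., data = dat[folds != i, ],
                     family = binomial(), bound = b)
        eta <- predict(fit, newdata = dat[folds == i, ])  # assumed linear predictor
        p   <- plogis(eta)
        yi  <- dat$y[folds == i]
        dev <- dev - 2 * sum(yi * log(p) + (1 - yi) * log(1 - p))
      }
      dev                        # held-out binomial deviance for this bound
    })
  }
  # usage, e.g.: cv.bound(dat, bounds = seq(0.1, 0.9, by = 0.2))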
2005 Nov 04
1
small bug in gl1ce, package lasso2 (PR#8280)
Full_Name: Grant Izmirlian Version: 2.2.0 OS: SuSe Linux version 9.2 Submission from: (NULL) (156.40.34.177) Sorry about the last submission; my bug-fix had an error in it because ifelse doesn't vectorize. I'll repost with the correct bug-fix. ------------------------------------------------------------------------------- The option exists to include all parameters, including the
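For readers unfamiliar with the pitfall the report alludes to: ifelse() shapes its result after the test argument, so a length-one test silently keeps only the first element of the chosen branch, unlike a plain if/else:

  ifelse(TRUE, 1:3, 4:6)    # returns 1 -- result has the shape of the test
  if (TRUE) 1:3 else 4:6    # returns 1 2 3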
2010 Apr 21
1
Best subset of models for glm.nb()
Dear List, I am looking for a function that will find the best subset of negative binomial models. I have a large data set with 15 variables that I am interested in. I want an easy way to run all possible models and find a subset of the "best" models that I can then look at in more detail. I have found two functions that seem to provide what I am looking for, but am not sure which
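One option often pointed to for this kind of search is MuMIn::dredge() over a fitted global model, which also handles MASS::glm.nb fits; the sketch below assumes both packages are installed, uses a hypothetical data frame dat with count response y and predictors x1..x15, and notes that 15 candidate variables already mean 2^15 = 32768 main-effects models:

  library(MASS)    # glm.nb
  library(MuMIn)   # dredge, get.models

  options(na.action = "na.fail")   # dredge insists on a complete-case global model

  # global  <- glm.nb(y ~ ., data = dat)     # full model with all 15 predictors
  # allfits <- dredge(global)                # enumerates all 2^15 subsets
  # head(allfits)                            # ranked by AICc by default
  # best    <- get.models(allfits, subset = 1)[[1]]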