Displaying 20 results from an estimated 30000 matches similar to: "building a formula for glm() with 30,000 independent variables"
2002 Nov 13
1
building a formula for glm() with 30,000 independent variables
Dear Prof. Ripley,
you mention the theory of perceptrons.
Could you please point me to an introduction paper or book?
Thanks in advance,
Dominik
> -----Original Message-----
> From: ripley at stats.ox.ac.uk [mailto:ripley at stats.ox.ac.uk]
> Sent: Sunday, 10 November 2002 18:55
> To: Ben Liblit
> Cc: r-help at stat.math.ethz.ch
> Subject: Re: [R] building a formula for
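The question in this thread's subject comes up often: with thousands of predictor columns, the formula should be built programmatically rather than typed out. A minimal base-R sketch (the toy data frame `d` and response name `y` are assumptions; `reformulate()` is base R):

```r
# Sketch: build a glm() formula from column names instead of typing
# 30,000 terms by hand. Toy data stand in for the real 30,000 columns.
set.seed(1)
d <- data.frame(y  = rbinom(50, 1, 0.5),
                x1 = rnorm(50), x2 = rnorm(50), x3 = rnorm(50))
preds <- setdiff(names(d), "y")          # every column except the response
f <- reformulate(preds, response = "y")  # y ~ x1 + x2 + x3
fit <- glm(f, family = binomial, data = d)
```

With very many columns, `glm(y ~ ., data = d)` is an even shorter equivalent, but `reformulate()` also lets you select a subset of predictors.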
2004 Sep 22
5
Issue with predict() for glm models
[This email is either empty or too large to be displayed at this time]
2003 Apr 07
2
log-linear
Hello,
I have spatial data containing the number of landslide-presence cells and
the number of landslide-absence cells with respect to the same set of
landslide predictors. The predictors are essentially categorical data.
I tried logistic regression, but because the log-linear method can provide
interactions between predictors, I want to use it instead. I am unsure
which way I should use
2005 May 27
1
logistic regression
Hi
I am working on corpora of automatically recognized utterances, looking
for features that predict error in the hypothesis the recognizer is
proposing.
I am using the glm functions to do logistic regression. I do this type
of thing:
logistic.model <- glm(formula = similarity ~ ., family = binomial,
                      data = data)
and end up with a model:
> summary(logistic.model)
Call:
2009 Jun 30
1
fitting in logistic model
I would like to know how R computes the probability of an event in a
logistic model (P(y=1)) from the score s, the linear combination of x and
beta.
I noticed that there are small differences (less than 1e-16) between the
fitted values automatically computed by R in the glm procedure and the
values I computed "manually" by applying the inverse formula
p = e^s / (1 + e^s); moreover I noticed
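The tiny discrepancy the poster describes is ordinary floating-point noise: `glm()`'s fitted values come from the binomial inverse link, which is algebraically identical to p = e^s / (1 + e^s). A sketch with toy data (the data frame `d` is an assumption):

```r
# Sketch: reproduce glm()'s fitted probabilities "by hand" with the
# inverse logit p = e^s / (1 + e^s), where s is the linear predictor.
set.seed(1)
d <- data.frame(x = rnorm(100))
d$y <- rbinom(100, 1, plogis(0.5 + 1.2 * d$x))
fit <- glm(y ~ x, family = binomial, data = d)
s <- fit$linear.predictors          # s = beta0 + beta1 * x
p_manual <- exp(s) / (1 + exp(s))
max(abs(p_manual - fitted(fit)))    # machine-precision difference only
```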
2005 Nov 08
1
Interpretation of output from glm
I am fitting a logistic model to binary data. The response variable is a
factor (0 or 1) and all predictors are continuous variables. The main
predictor is LT (I expect a logistic relation between LT and the
probability of being mature) and the others are variables I expect to modify
this relation.
I want to test whether all predictors contribute significantly to the fit or not.
I fit the full
2017 Jun 29
0
Help : glm p-values for a factor predictor
It might help if you provided the code you used. It's possible that
you didn't use direction="backward" in stepAIC(). Or if you did, it
was still running, so whatever else you try will still be slow. The
statement "R provides only the pvalues for each level" is wrong: look
at the anova() function.
Bob
On 29 June 2017 at 11:13, Benoît PELE <benoit.pele at
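Bob's point about `anova()` can be seen in a small sketch (toy data frame `d` and factor `f` are assumptions): `summary()` reports one Wald p-value per non-reference level of a factor, while `anova()` tests the factor as a single term.

```r
# Sketch: per-level p-values from summary() vs one per-factor test
# from anova() on a binomial glm.
set.seed(4)
d <- data.frame(y = rbinom(150, 1, 0.5),
                f = factor(sample(letters[1:4], 150, TRUE)))
fit <- glm(y ~ f, family = binomial, data = d)
nrow(summary(fit)$coefficients)   # 4 rows: intercept + levels b, c, d
a <- anova(fit, test = "Chisq")   # a single deviance test for f as a whole
</imports>
```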
2017 Jun 29
3
Help : glm p-values for a factor predictor
Hello,
I am a newbie in R and I am trying to run a backward selection on a
binomial-logit glm over a large dataset (69,000 rows with 145 predictors).
After 3 days of running, the stepAIC function had not terminated. I do not
know if that is normal, but I would like to try computing a "homemade"
backward selection with repeated glm fits; at each step, the predictor with
the max p-value would be
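The "homemade" backward pass the poster describes can be sketched as below. The toy data and the 0.05 threshold are assumptions; `drop1()` with a likelihood-ratio test is used so that a factor is dropped as a whole term rather than level by level. With 145 real predictors, each round costs one `glm()` fit.

```r
# Sketch: repeatedly refit a binomial glm, each round removing the term
# with the largest likelihood-ratio p-value until all terms are "significant".
set.seed(2)
d <- data.frame(y  = rbinom(200, 1, 0.5),
                x1 = rnorm(200), x2 = rnorm(200),
                f1 = factor(sample(letters[1:3], 200, TRUE)))
form <- y ~ x1 + x2 + f1
repeat {
  fit <- glm(form, family = binomial, data = d)
  tests <- drop1(fit, test = "LRT")
  pv <- tests[["Pr(>Chi)"]][-1]        # row 1 is <none>; one p per term
  if (length(pv) == 0 || max(pv) < 0.05) break
  worst <- rownames(tests)[-1][which.max(pv)]
  form <- update(form, as.formula(paste(". ~ . -", worst)))
}
```

With pure-noise predictors this typically strips the model down to the intercept; on real data it stops when every remaining term passes the threshold.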
2005 May 31
3
lars / lasso with glm
We have been using Least Angle Regression (lars) to help identify
predictors in models where the outcome is continuous. To do so we have
been relying on the lars package. Theoretically, it should be possible
to use the lars procedure within a general linear model (glm) framework
- we are particularly interested in a logistic regression model. Does
anyone have examples of using lars with logistic
2010 Oct 22
2
Random Forest AUC
Guys,
I used Random Forest with a couple of data sets I had, to predict a binary
response. In every case, the AUC on the training set comes out to be 1.
Is this always the case with random forests? Can someone please clarify
this?
I have given a simple example, first using logistic regression and then
using random forests to explain the problem. AUC of the random forest is
coming out to be
2002 Nov 10
1
binomial glm for relevant feature selection?
As suggested in my earlier message, I have a large population of
independent variables and a binary dependent outcome. It is expected
that only a few of the independent variables actually contribute to the
outcome, and I'd like to find those.
If it wasn't already obvious, I am *not* a statistician. Not even
close. :-) Statistician colleagues have suggested that I use logistic
2008 Mar 17
2
stepAIC and polynomial terms
Dear all,
I have a question regarding the use of stepAIC and polynomial (quadratic, to be specific) terms in a binary logistic regression model. I read in McCullagh and Nelder (1989, p. 89), and as far as I remember from my statistics courses, that higher-degree polynomial effects should not be included without the main effects. If I understand this correctly, following a stepwise model selection based
2006 May 22
1
Ordinal Independent Variables
When I run "lrm" from the Design package, I get a warning about
contrasts when I include an ordinal variable:
Warning message:
Variable ordfac is an ordered factor.
You should set
options(contrasts=c("contr.treatment","contr.treatment"))
or Design will not work properly. in: Design(eval(m, sys.parent()))
I don't get this message if I use glm with
2006 Nov 20
4
for help about logistic regression model
I have a dataset like this:
      p   aa index        x       y         z      sdx     sdy      sdz   delta     as    ms        cur       sc
1  821p  MET     1 -5.09688 32.8830 -5.857620 1.478200 1.73998 0.825778 13.7883 126.91 92.37 -0.1320180 111.0990
2  821p  THR     2 -4.07357 28.6881 -4.838430 0.597674 1.37860 1.165780 13.7207  64.09 50.72 -0.0977129  98.5319
3  821p  GLU     3 -5.86733 30.4759
2011 Nov 25
1
variable types - logistic regression
Hello,
Is there an example out there that shows how to treat each of the predictor
variable types when doing logistic regression in R? Something like this:
glm(y~x1+x2+x3+x4, data=mydata, family=binomial(link="logit"),
na.action=na.pass)
I'm drawing mostly from:
http://www.ats.ucla.edu/stat/r/dae/logit.htm
...but there are only two types of variable in the example given. I'm
2007 Jun 12
3
Appropriate regression model for categorical variables
Dear users,
In my psychometric test I have applied logistic regression to my data. My
data consist of 50 predictors (22 continuous and 28 categorical) plus a
binary response.
Using glm() and stepAIC() I didn't get satisfactory results, as the
misclassification rate is too high. I think the categorical variables are
responsible for this debacle. Some of them have more than 6 levels (one has 10 levels).
2004 Oct 11
3
logistic regression
Hello,
I have a problem concerning logistic regressions. When I add a quadratic
term to my linear model, I cannot draw the line through my scatterplot
anymore, which is no problem without the quadratic term.
In this example my binary response variable is "incidence", the explanatory
variable is "sun":
> model0 <- glm(incidence ~ 1, binomial)
>
2011 Nov 20
3
logistic regression by glm
HI
I use glm in R to do logistic regression. and treat both response and
predictor as factor
In my first try:
*******************************************************************************
Call:
glm(formula = as.factor(diagnostic) ~ as.factor(7161521) +
as.factor(2281517), family = binomial())
Deviance Residuals:
Min 1Q Median 3Q Max
-1.5370 -1.0431 -0.9416 1.3065 1.4331
Coefficients:
2011 Oct 25
1
Glmnet Logistic Variable Questions
We are working on building a logistic regression using
1. We are doing a logistic regression with binary outcome variable
using a set of predictors that include 8 continuous and 8 category
predictors
2. We are trying to implement interaction between two variables
(continuous and category or just continuous)
The dataset has 200,000 rows and we are using glmnet; how can we model
these two points?
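Both points reduce to the same step: glmnet takes a numeric matrix, so the factor coding and the interaction must be built up front, typically with `model.matrix()`. A sketch (names `d`, `x1`, `g` are toy assumptions; the `glmnet()` call is left commented because it needs the glmnet package):

```r
# Sketch: encode a continuous-by-categorical interaction as columns of a
# numeric design matrix suitable for glmnet.
set.seed(3)
d <- data.frame(y  = rbinom(100, 1, 0.5),
                x1 = rnorm(100),
                g  = factor(sample(c("a", "b"), 100, TRUE)))
X <- model.matrix(~ x1 + g + x1:g, data = d)[, -1]  # drop intercept column
colnames(X)                                         # "x1" "gb" "x1:gb"
# fit <- glmnet::glmnet(X, d$y, family = "binomial")
```

For two continuous predictors, `~ x1 * x2` in the `model.matrix()` call produces the main effects plus the product column in the same way.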
2011 May 05
7
Draw a nomogram after glm
Hi all R users
I did a logistic regression with my binary variable Y (0/1) and 2
explanatory variables.
Now I am trying to draw my nomogram with predicted values. I consulted the
R help pages but have trouble understanding the example. When I use the glm
function I have a problem, so I use lrm instead. My code is:
modele <- lrm(Y ~ L + P, data = donnee)
fun <- function(x) plogis(x - modele$coef[1] + modele$coef[2])