Displaying 20 results from an estimated 10000 matches similar to: "R Code for fitting Logistic Regression for Multivariate Data"
2009 Sep 25
1
Logistic Regression for Multinomial Data using R
Hi
I want to do logistic regression for multinomial data.
How can I do it in R?
Thanks a lot
Nimal Fernando
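A minimal hedged sketch of one common route (not necessarily what the respondents suggested): multinom() in the nnet package, assuming a data frame dat with a factor response y and placeholder predictors x1 and x2.

library(nnet)                       # multinom() fits multinomial log-linear models
fit <- multinom(y ~ x1 + x2, data = dat)
summary(fit)                        # coefficients relative to the reference level of y
head(predict(fit, type = "probs"))  # fitted class probabilities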
2010 Jun 06
2
fitting multinomial logistic regression
Sir,
I want to fit a multinomial logistic regression in R. I think mlogit() is the
function for doing this. mlogit() is in the package globaltest. But I cannot
install this package. I use the following:
install.packages("globaltest")
Can you help me?
Regards,
Suman Dhara
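As a hedged aside: mlogit() is usually associated with the 'mlogit' package rather than globaltest, and nnet::multinom() is an alternative that ships with a standard R installation. A minimal sketch, with dat, choice, x1 and x2 as placeholder names:

install.packages("mlogit")                      # mlogit() lives in the 'mlogit' package
library(nnet)                                   # or use the recommended nnet package
fit <- multinom(choice ~ x1 + x2, data = dat)   # choice is a factor response
summary(fit)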
2009 Aug 19
3
Fitting a logistic regression
Hello,
I have this data:
Time AMP
0 0.2000000
10 0.1958350
20 0.2914560
40 0.6763628
60 0.8494534
90 0.9874526
120 1.0477692
where AMP is the concentration of this metabolite over time. If you plot
the data, you can see that it could be fitted with a logistic
curve. For this purpose, I used this code:
AMP.nls <- nls(AMP~SSlogis(Time,Asym, xmid, scal), data
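A complete, hedged version of that call, reconstructing a data frame from the values posted above (the original data frame name is not shown in the excerpt):

amp <- data.frame(Time = c(0, 10, 20, 40, 60, 90, 120),
                  AMP  = c(0.2000000, 0.1958350, 0.2914560, 0.6763628,
                           0.8494534, 0.9874526, 1.0477692))
AMP.nls <- nls(AMP ~ SSlogis(Time, Asym, xmid, scal), data = amp)
summary(AMP.nls)
plot(AMP ~ Time, data = amp)
tt <- seq(0, 120, length.out = 200)
lines(tt, predict(AMP.nls, newdata = data.frame(Time = tt)))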
2011 Jan 03
1
Logistic Regression Fitting with EM-Algorithm
Hi all,
is there any package that can fit logistic regression coefficients with
an EM algorithm when only the explanatory variables are given? I tried
to do this with the Design package, but I didn't find a way.
Thanks a lot & Kind regards
Robin Aly
2009 Nov 16
2
fitting a logistic regression with mixed type of variables
Hi,
I am trying to fit a logistic regression using glm, but my explanatory
variables are of mixed type: some are numeric, some are ordinal, some are
categorical. Say x1 is numeric, x2 is ordinal, and x3 is categorical;
is the following formula OK?
model <- glm(y~x1+x2+x3, family=binomial(link="logit"), na.action=na.pass)

Thanks,

-Jack
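A hedged sketch of the usual pattern: declare the variable types before calling glm(), so that the ordinal and categorical predictors are coded as (ordered) factors; dat and the column names are placeholders.

dat$x2 <- ordered(dat$x2)     # ordinal predictor: ordered factor (polynomial contrasts by default)
dat$x3 <- factor(dat$x3)      # nominal predictor: unordered factor (dummy coding)
model <- glm(y ~ x1 + x2 + x3, data = dat,
             family = binomial(link = "logit"))  # default na.action (na.omit) drops incomplete rows
summary(model)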
2012 Nov 09
1
Logistic curve fitting with y values >1 (R version 2.15.2, OS is OS X 10.6.8)
Hello,
I'm trying to fit a logistic curve to data but I'm having a hard time
discovering how. Every tutorial I've come across either assumes the
logistic curve has 0<y<1 or assumes I have multiple categories of data.
I simply have two vectors, a and b, of equal length with no missing
data, and I suspect they follow a logistic curve.
The vectors are
a<-c(39609, 39643,
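One hedged possibility, assuming b is a sigmoid function of a: SSlogis() does not require the response to lie in (0, 1), because it estimates the upper asymptote Asym as a parameter.

d <- data.frame(a = a, b = b)                     # the two posted vectors (truncated above)
fit <- nls(b ~ SSlogis(a, Asym, xmid, scal), data = d)
plot(b ~ a, data = d)
lines(sort(d$a), fitted(fit)[order(d$a)])         # fitted logistic curve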
2008 Nov 08
3
Fitting a modified logistic with glm?
Hi all,
Where f(x) is a logistic function, I have data that follow:
g(x) = f(x)*.5 + .5
How would you suggest I modify the standard glm(..., family='binomial')
function to fit this? Here's an example of a clearly ill-advised attempt to
simply use the standard glm(..., family='binomial') approach:
########
# First generate some data
########
#define the scale and location of
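A hedged alternative to coercing this through glm(): maximise the binomial likelihood directly with the modified response function, assuming a 0/1 response y and a numeric predictor x.

negll <- function(par, x, y) {
  p <- plogis(par[1] + par[2] * x) * 0.5 + 0.5    # g(x) = f(x)*.5 + .5, so p lies in (0.5, 1)
  -sum(dbinom(y, size = 1, prob = p, log = TRUE)) # negative binomial log-likelihood
}
fit <- optim(c(0, 1), negll, x = x, y = y, hessian = TRUE)
fit$par                                           # intercept and slope of the latent logistic
sqrt(diag(solve(fit$hessian)))                    # approximate standard errors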
2004 Nov 30
2
Package for multivariate binary logistic regression?
I am trying to find out if someone has implemented a (McFadden-type) multivariate
binary logistic regression package for R? From what I can tell, this is not available for R.
Thank you,
Lynne Baker
2011 Feb 01
4
Fitting ELISA measurements "unknowns" to 4 parameter logistic model
Hello,
I am trying to fit my ELISA results (absorbance readings) to a standard
curve. To create the standard curve model, I performed a 4-parameter
logistic fit using the 'drc' package (ExpectedConc~Absorbance). This gave me
the following:
> FourP
A 'drc' model.
Call:
drm(formula = Response ~ Expected, data = SC, fct = LL.4())
Coefficients:
b:(Intercept) c:(Intercept)
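A hedged sketch of one way to back-calculate unknowns, assuming the standard curve is fitted as absorbance on concentration and recalling that drc's LL.4() is f(x) = c + (d - c)/(1 + exp(b*(log(x) - log(e)))); 'std' and 'unknown_abs' are placeholder objects.

library(drc)
std_fit <- drm(Abs ~ Conc, data = std, fct = LL.4())
cf <- coef(std_fit)                                   # order: b, c, d, e
b <- cf[1]; lo <- cf[2]; hi <- cf[3]; mid <- cf[4]
conc_from_abs <- function(y) mid * ((hi - y) / (y - lo))^(1 / b)  # analytic inverse of the 4PL
conc_from_abs(unknown_abs)                            # absorbances of the unknown wells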
2010 Dec 08
3
Confidence Intervals for Odds Ratios in multivariate logistic regression
Hi all,
I am trying to fit a logistic regression for a bivariate response using five
independent variables in a stepwise procedure. My outputs look okay but does
anyone know (or is there any literature on) how the confidence intervals
are calculated for the reported odds ratios?
Thanks!
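A hedged illustration of the two standard ways such intervals are computed from a fitted glm; which one a particular routine reports depends on the software. The model below is a placeholder.

fit <- glm(y ~ x1 + x2, family = binomial, data = dat)
exp(confint(fit))          # profile-likelihood intervals for the odds ratios
exp(confint.default(fit))  # Wald intervals: exp(beta_hat +/- 1.96 * SE)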
2004 Oct 29
3
missing values in logistic regression
Dear R help list,
I am trying to do a logistic regression
where I have a categorical response variable Y
and two numerical predictors X1 and X2. There
are quite a lot of missing values for predictor X2.
e.g.,
Y X1 X2
red 0.6 0.2 *
red 0.5 0.2 *
red 0.5 NA
red 0.5 NA
green 0.2 0.1 *
green 0.1 NA
green 0.1 NA
green 0.05 0.05 *
I am wondering can I combine X1 and
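A minimal hedged sketch, assuming a data frame dat with the columns shown above (Y a two-level factor, X1 and X2 numeric): glm() silently drops rows with missing X2 under the default na.action.

fit_full <- glm(Y ~ X1 + X2, family = binomial, data = dat)  # complete cases only
fit_x1   <- glm(Y ~ X1,      family = binomial, data = dat)  # uses every row
sum(is.na(dat$X2))                                           # observations lost by the full model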
2023 Aug 01
2
Plotting Fitted vs Observed Values in Logistic Regression Model
Dear friends,
I hope this email finds you all well. This is the dataset I am working
with:
dput(random_mod12_data2)
structure(list(Index = c(1L, 5L, 11L, 3L, 2L, 8L, 9L, 4L), x = c(5,
13, 25, 9, 7, 19, 21, 11), n = c(500, 500, 500, 500, 500, 500,
500, 500), r = c(100, 211, 391, 147, 122, 310, 343, 176), ratio = c(0.2,
0.422, 0.782, 0.294, 0.244, 0.62, 0.686, 0.352)), row.names = c(NA,
-8L),
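A hedged sketch using the posted values (rebuilt here as a plain data frame): fit the binomial model on successes out of trials and overlay the fitted curve on the observed proportions r/n.

dat <- data.frame(x = c(5, 13, 25, 9, 7, 19, 21, 11),
                  n = rep(500, 8),
                  r = c(100, 211, 391, 147, 122, 310, 343, 176))
dat$ratio <- dat$r / dat$n
fit <- glm(cbind(r, n - r) ~ x, family = binomial, data = dat)
plot(ratio ~ x, data = dat, pch = 16, ylab = "Observed proportion")
xx <- seq(min(dat$x), max(dat$x), length.out = 100)
lines(xx, predict(fit, newdata = data.frame(x = xx), type = "response"))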
2012 May 02
2
Problem with 'nls' fitting logistic model (5PL)
Dear R-Helpers,
I'm working with immunoassay data and a 5PL logistic model. I wanted to
experiment with different forms of weighting and parameter selection,
which is not possible in instrument software, so I turned to R.
I am using R 2.14.2 under Win7 64bit, and the 'nls' function to fit the
model - I started with the same model and weighting type (1/y) as in the
instrument to see
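A hedged sketch of a 5PL fit with nls(), assuming a data frame assay with columns conc and resp; the parameterisation and starting values below are rough guesses, not the instrument's settings.

fit5pl <- nls(resp ~ d + (a - d) / (1 + (conc / cc)^b)^g,   # 5PL: a = zero-dose, d = infinite-dose asymptote
              data = assay,
              start = list(a = min(assay$resp), d = max(assay$resp),
                           cc = median(assay$conc), b = 1, g = 1),
              weights = 1 / assay$resp)                     # 1/y weighting, as in the instrument software
summary(fit5pl)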
2009 Nov 09
1
Percentage effects in logistic regression
Dear ALL,
I'm trying to figure out what the percentage effects are in a logistic
regression. To be more clear, I'm not interested in the effect on y of a
1-unit increase in x, but on the percentage effect on y of a 1% increase in
x (in economics this is also often called an "elasticity").
For example, if my independent variables are in logs, the betas can be
directly
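A hedged worked example of where such an elasticity comes from: if p = plogis(b0 + b1*log(x)), then d log(p)/d log(x) = b1*(1 - p), which can be evaluated at the fitted probabilities. The model below is a placeholder.

fit   <- glm(y ~ log(x), family = binomial, data = dat)
p_hat <- fitted(fit)
elas  <- coef(fit)["log(x)"] * (1 - p_hat)   # elasticity at each observation
mean(elas)                                   # average elasticity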
2009 Jun 30
1
fitting in logistic model
I would like to know how R computes the probability of an event in a
logistic model (P(y=1)) from the score s, the linear combination of x and
beta.
I noticed that there are small differences (less than 1e-16) between the
fitted values automatically computed by R's glm procedure and the
values I computed "manually" by applying the reverse formula
p=e^s/(1+e^s); moreover I noticed
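A hedged sketch of the comparison being described, assuming a fitted binomial glm called fit; the tiny discrepancies come down to floating-point rounding along the two computational routes.

s <- predict(fit, type = "link")      # the score: linear combination of x and beta
p_manual <- exp(s) / (1 + exp(s))     # the reverse formula; plogis(s) is the numerically stabler form
max(abs(p_manual - fitted(fit)))      # typically of the order of machine epsilon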
2010 Aug 19
1
logistic regression tree
Hello everyone,
I sampled 100 stands at 20 restoration sites and recorded the presence of 3 different
invasive plant species.
I came across logistic regression trees and wonder if this is suited to my
purpose: predicting the presence of these problematic invasive plant species
(one by one) from a set of recorded ecological / geographical parameters.
I'd be glad if someone would comment on applying this
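One hedged possibility (not necessarily what the respondents recommended): a model-based logistic regression tree via glmtree() in the partykit package; the data frame and covariate names below are invented for illustration.

library(partykit)
tree <- glmtree(presence ~ elevation | soil_type + slope + distance_to_road,
                data = stands, family = binomial)   # regressor | partitioning variables
plot(tree)
print(tree)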
2007 Apr 26
3
Reduced Error Logistic Regression, and R?
This news item in a data mining newsletter makes various claims for a technique called "Reduced Error Logistic Regression": http://www.kdnuggets.com/news/2007/n08/12i.html
In brief, are these (ambitious) claims justified and if so, has this technique been implemented in R (or does anyone have any plans to do so)?
Tim C
2008 Aug 17
0
Error fitting overdispersed logistic regression: package dispmod
Hi all,
First, a quick thank you for R; it's amazing.
I am trying to fit models for a count dataset following the overdispersed logistic regression approach outlined in Baggerly et al. (BMC Bioinformatics, 5:144; annotated R code is given at the end of the paper), but R is returning an error with the data below. Any help in understanding or overcoming this obstacle is appreciated.
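As a hedged aside (this is not the dispmod / Baggerly et al. approach from the thread), a quick way to allow for overdispersion in a binomial fit is the quasibinomial family; the data frame and column names are placeholders.

fit <- glm(cbind(r, n - r) ~ group, family = quasibinomial, data = counts)
summary(fit)$dispersion   # estimated dispersion parameter (1 would mean no overdispersion)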
2006 Jun 20
1
Bayesian logistic regression?
Hi all.
Are there any R functions around that do quick logistic regression with
a Gaussian prior distribution on the coefficients? I just want
posterior mode, not MCMC. (I'm using it as a step within an iterative
imputation algorithm.) This isn't hard to do: each step of a glm
iteration simply linearizes the derivative of the log-likelihood, and,
at this point, essentially no
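A hedged pointer along these lines: bayesglm() in the arm package finds an approximate posterior mode for glm coefficients under independent t priors, with prior.df = Inf giving a Gaussian prior; arguments should be checked against the package documentation, and the model below is a placeholder.

library(arm)
fit <- bayesglm(y ~ x1 + x2, family = binomial, data = dat,
                prior.mean = 0, prior.scale = 2.5, prior.df = Inf)  # Gaussian prior on coefficients
coef(fit)   # posterior mode estimates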
2011 Aug 17
4
How to use PC1 of PCA and dim1 of MCA as a predictor in logistic regression model for data reduction
Hi all,
I'm trying to do model reduction for logistic regression. I have 13
predictors (4 continuous variables and 9 binary variables). Using subject
matter knowledge, I selected 4 important variables. For the remaining 9
variables, I tried to perform data reduction by principal component
analysis (PCA). However, 8 of the 9 variables were binary and only one
was continuous. I transformed the data by
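A hedged sketch of the general idea being described (all object and variable names here are made up): take the first principal component of the continuous block, the first MCA dimension of the binary block, and use them alongside the pre-selected variables in the logistic model.

library(FactoMineR)                                             # assumed available for MCA()
dat$pc1  <- prcomp(dat[, cont_vars], scale. = TRUE)$x[, 1]      # PC1 of the continuous predictors
dat$dim1 <- MCA(dat[, bin_vars], graph = FALSE)$ind$coord[, 1]  # dim1; bin_vars assumed coded as factors
fit <- glm(outcome ~ pc1 + dim1 + v1 + v2 + v3 + v4,            # v1..v4: the four pre-selected variables
           family = binomial, data = dat)
summary(fit)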