similar to: Multicollinearity with brglm?

Displaying 20 results from an estimated 300 matches similar to: "Multicollinearity with brglm?"

2013 Feb 27
1
Separation issue in binary response models - glm, brglm, logistf
Dear all, I am encountering some issues with my data and need some help. I am trying to run a glm analysis with a presence/absence variable as the response and several explanatory variables (time, location, presence/absence data, abundance data). First I tried the glm() function, but I got two warnings from glm.fit(): # 1: glm.fit: algorithm did not converge # 2:
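The question above is truncated, but a minimal hedged sketch of the usual options for separation follows: compare a plain glm() fit with a bias-reduced brglm() fit and a Firth-penalized logistf() fit. The data frame 'dat' and the variables 'presence', 'time', 'location' and 'abund' are hypothetical placeholders, not the poster's actual data.

    ## Placeholder sketch: 'dat', 'presence', 'time', 'location', 'abund' are invented names.
    library(brglm)    # bias-reduced (Firth-type) binomial GLM
    library(logistf)  # penalized-likelihood logistic regression

    f <- presence ~ time + location + abund

    m_glm   <- glm(f, family = binomial, data = dat)    # may warn about non-convergence / fitted 0 or 1
    m_br    <- brglm(f, family = binomial, data = dat)  # finite estimates even under separation
    m_firth <- logistf(f, data = dat)                   # penalized fit with profile-likelihood CIs

    ## Compare the three sets of coefficients
    cbind(glm = coef(m_glm), brglm = coef(m_br), logistf = coef(m_firth))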
2012 Jul 11
1
Help needed to tackle a multicollinearity problem in count data using R
Dear everyone, I'm a Master's student in Statistics (Actuarial) at the Central University of Rajasthan, India, doing a major project as part of the degree. My project involves fitting a glm to car insurance data. I'm facing a multicollinearity problem in this data, which is visible when plotting the data, but I'm not able to test for it. In the case
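A small hedged sketch of how one might test for multicollinearity in a claim-count GLM with car::vif(); the data frame 'insurance' and the variables 'claims', 'exposure', 'age', 'vehicle_value' and 'density' are placeholders standing in for the poster's insurance data.

    ## Hypothetical sketch; only the VIF step is the point here.
    library(car)  # for vif()

    fit <- glm(claims ~ age + vehicle_value + density + offset(log(exposure)),
               family = poisson, data = insurance)

    vif(fit)  # values well above ~5-10 are a common rule of thumb for trouble
    cor(insurance[, c("age", "vehicle_value", "density")])  # pairwise correlations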
2009 Aug 16
1
How to deal with multicollinearity in mixed models (with lmer)?
Dear R users, I have a problem with multicollinearity in mixed models and I am using lmer in package lme4. From a previous mailing list thread, I learned of a reply "http://www.mail-archive.com/r-help at stat.math.ethz.ch/msg38537.html" which states that if the goal is prediction rather than interpretation, multicollinearity does not matter much. However, I am using a mixed model to interpret something,
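Since the VIF depends only on the fixed-effects design matrix, one hedged option is to compute it directly from model.matrix() of the fitted lmer model; 'd', 'y', 'x1'..'x3' and 'site' below are placeholder names, not the poster's variables.

    ## Sketch: VIFs for the fixed effects of an lmer fit, computed from its design matrix.
    library(lme4)

    fit <- lmer(y ~ x1 + x2 + x3 + (1 | site), data = d)

    X <- model.matrix(fit)[, -1, drop = FALSE]   # drop the intercept column
    vifs <- sapply(seq_len(ncol(X)), function(j) {
      r2 <- summary(lm(X[, j] ~ X[, -j]))$r.squared
      1 / (1 - r2)
    })
    setNames(vifs, colnames(X))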
2009 Mar 26
1
Extreme AIC in glm(), perfect separation, svm() tuning
Dear List, With regard to the question I previously raised, here is the result I have obtained: brglm() does help, but there are two situations: 1) Classifiers with extremely high AIC (over 200), no perfect separation, coefficients converge. In this case, using brglm() does help! It stabilizes the AIC, and the classification power is better. Code and output: (need to install package:
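The poster's code is cut off above; as a hedged, simulated illustration of the same idea, the sketch below builds a completely separated predictor so that plain glm() produces huge, unstable coefficients while brglm() returns finite bias-reduced estimates. The data are simulated, not the poster's.

    ## Simulated illustration (not the poster's data).
    library(brglm)

    set.seed(1)
    x <- c(runif(30, 0, 1), runif(30, 1.1, 2))  # the two groups do not overlap -> complete separation
    y <- rep(0:1, each = 30)

    m_glm <- glm(y ~ x, family = binomial)    # warns: fitted probabilities numerically 0 or 1
    m_br  <- brglm(y ~ x, family = binomial)  # bias-reduced fit stays finite

    rbind(glm = coef(m_glm), brglm = coef(m_br))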
2016 Apr 15
1
Multicollinearity & Endogeneity: PLSPM
Hi, I need a bit of guidance on tests and methods for detecting multicollinearity and endogeneity while using plspm. Please help. ------------------ T&R ... Deva
2007 Jul 18
0
multicollinearity in nlme models
I am working on an nlme model that has multiple fixed effects (linear and nonlinear) with a nonlinear (asymptotic) random effect.
asymporig  <- function(x, th1,  th2)  th1  * (1 - exp(-exp(th2)  * x))
asymporigb <- function(x, th1b, th2b) th1b * (1 - exp(-exp(th2b) * x))
mod.vol.nlme <- nlme(fa20 ~ (ah*habdiv + ads*ds + ads2*ds2 + at*trout) + asymporig(da.p, th1, th2) + asymporigb(vol, th1b, th2b),
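The model call above is truncated. As a hedged sketch, one way to gauge collinearity among the fixed effects of an nlme fit is to inspect the estimated correlation matrix of the fixed-effect estimates; the example below uses the standard Loblolly example from ?nlme rather than the poster's variables.

    ## Sketch on the documented Loblolly example, not the poster's model.
    library(nlme)

    fm <- nlme(height ~ SSasymp(age, Asym, R0, lrc),
               data = Loblolly,
               fixed = Asym + R0 + lrc ~ 1,
               random = Asym ~ 1,
               start = c(Asym = 103, R0 = -8.5, lrc = -3.3))

    ## Correlations of the fixed-effect estimates: values near +/- 1 flag
    ## fixed effects that the data cannot separate well.
    cov2cor(vcov(fm))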
2006 Oct 23
0
Methods of addressing multicollinearity in multiple linear regression with R
In searching the R help archives I find a number of postings from April 2005, but nothing since then. If readers are aware of more recent contributions addressing the problems arising from multicollinearity (such as the bootstrap, jackknife, or other techniques) I would appreciate a reference. Thank you, Ben Fairbank
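One standard remedy that keeps coming up in those archive threads is ridge regression; a hedged sketch with MASS::lm.ridge on the built-in longley data (a classic multicollinear example, chosen here only for illustration) follows.

    ## Sketch of ridge regression on the built-in 'longley' data, which is a
    ## textbook example of severe multicollinearity.
    library(MASS)

    rr <- lm.ridge(Employed ~ ., data = longley, lambda = seq(0, 0.1, 0.001))
    plot(rr)     # ridge trace of the coefficients
    select(rr)   # GCV / HKB / LW suggestions for lambda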
2011 Oct 13
1
binomial GLM quasi separation
Hi all, I have run a glm analysis where the dependent variable is gender (family = binomial) and the predictors are percentages. I get a warning saying "fitted probabilities numerically 0 or 1 occurred", indicating that quasi-separation or separation is occurring. This makes sense given that one of these predictors has a very influential effect that depends on a
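A small hedged sketch of how one might locate the observations and the predictor behind that warning; 'fit' is a placeholder for the fitted binomial glm, and 'gender' and 'pct1' are invented names for the response and the suspect percentage predictor.

    ## Placeholder sketch, not the poster's objects.
    eps <- 1e-8
    extreme <- fitted(fit) < eps | fitted(fit) > 1 - eps
    sum(extreme)                # how many observations are fitted at exactly 0 or 1
    summary(fit)$coefficients   # separated terms show huge estimates and standard errors

    ## Does the suspect predictor split the response (almost) perfectly?
    with(model.frame(fit), table(gender, cut(pct1, breaks = 4)))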
2004 Aug 16
2
multicollinearity and MM-regression
Dear R users, Usually the variance inflation factor, which is based on R^2, is used as a measure of multicollinearity. But, in contrast to OLS regression, there is no robust R^2 available for MM-regressions in R. Do you know whether an equivalent or alternative measure of multicollinearity is available for MM-regression in R? With best regards, Carsten Colombier Dr. Carsten Colombier Economist
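One point worth adding: the VIF for predictor j is 1 / (1 - R_j^2), where R_j^2 comes from regressing x_j on the other predictors, so it never uses the response or the fit's own R^2 and applies unchanged to an MM-regression. A hedged sketch with a placeholder predictor matrix 'X' (simulated here):

    ## VIFs computed directly from the predictor matrix, so they are the same
    ## whether the final fit is OLS, MASS::rlm or robustbase::lmrob.
    vif_from_X <- function(X) {
      setNames(sapply(seq_len(ncol(X)), function(j) {
        r2 <- summary(lm(X[, j] ~ X[, -j]))$r.squared
        1 / (1 - r2)
      }), colnames(X))
    }

    ## Example with simulated correlated predictors:
    set.seed(1)
    x1 <- rnorm(100); x2 <- x1 + rnorm(100, sd = 0.3); x3 <- rnorm(100)
    vif_from_X(cbind(x1 = x1, x2 = x2, x3 = x3))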
2011 Dec 29
2
3d plotting alternatives. I like persp, but regret the lack of plotmath.
I have been making simple functions to display regressions in a new package called "rockchalk". For 3d illustrations, my functions use persp, and I've grown to like working with it. As an example of the kind of things I like to do, you might consult my lecture on multicollinearity, which is by far the most detailed illustration I've prepared.
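For readers who have not used persp() for this, a hedged minimal sketch of the basic pattern (fit a two-predictor regression, evaluate it on a grid, draw the surface) follows; the data are simulated and the code is not taken from rockchalk.

    ## Minimal persp() sketch: a fitted plane from a two-predictor regression.
    set.seed(42)
    d <- data.frame(x1 = runif(100), x2 = runif(100))
    d$y <- 1 + 2 * d$x1 - 1.5 * d$x2 + rnorm(100, sd = 0.3)

    fit <- lm(y ~ x1 + x2, data = d)
    g1  <- seq(0, 1, length.out = 25)
    g2  <- seq(0, 1, length.out = 25)
    z   <- outer(g1, g2, function(a, b) predict(fit, newdata = data.frame(x1 = a, x2 = b)))

    persp(g1, g2, z, theta = 30, phi = 20, ticktype = "detailed",
          xlab = "x1", ylab = "x2", zlab = "predicted y")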
2013 Nov 21
1
Regression model
Hi, I'm trying to fit a regression model, but there is something wrong with it. The dataset contains 85 observations for 85 students. Those observations are counts of several actions, and the dependent variable is the final score. More precisely, I have 5 IVs and one DV. I'm trying to build a regression model to check whether those variables can predict the final score. I'm attaching output of
2008 Dec 15
5
OT: (quasi-?) separation in a logistic GLM
Dear List, Apologies for this off-topic post but it is R-related in the sense that I am trying to understand what R is telling me with the data to hand. ROC curves have recently been used to determine a dissimilarity threshold for identifying whether two samples are from the same "type" or not. Given the bashing that ROC curves get whenever anyone asks about them on this list (and
2009 Mar 22
0
multicollinearity
Dear R users, I'm analysing some data using an lme function. I have a problem with choosing the right order for three of my explanatory variables, which show collinearity. Are there any rules for making this decision (r.squared?), or is it better to choose the order that I think gives the most information about the data? Say x1 is the variable with the highest r.squared, x3
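A hedged side note on why the order matters at all: with collinear terms the sequential (Type I) anova() table depends on the order in which the variables enter, while the fitted model and its simultaneous coefficient tests do not. The sketch below illustrates this with lm() on simulated data for simplicity; the same logic carries over to anova() on an lme fit.

    ## With correlated x1/x2/x3 the sequential anova() table changes when the
    ## order of terms changes, although the fitted model is the same.
    set.seed(7)
    x1 <- rnorm(100)
    x2 <- x1 + rnorm(100, sd = 0.4)
    x3 <- x1 + rnorm(100, sd = 0.4)
    y  <- 1 + x1 + 0.5 * x2 + rnorm(100)

    anova(lm(y ~ x1 + x2 + x3))   # x1 entered first soaks up the shared variance
    anova(lm(y ~ x3 + x2 + x1))   # same model, different sequential attribution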
2011 Apr 18
1
regression and lmer
Dear all, I hope this is the right place to ask this question. I am reviewing a study in which the analyst(s) use a linear regression model. The dependent variable (DV) is a continuous measure. The independent variables (IVs) are a mixture of continuous and categorical variables. The author investigates whether performance (DV - continuous) is a function of age (continuous IV1 -
2005 Apr 11
2
dealing with multicollinearity
I have a linear model y ~ x1 + x2 for some data where the coefficient for x1 is higher than I would have expected from theory (0.7 vs 0.88). I wondered whether this could be an artifact of x1 and x2 being correlated, even though the variance inflation factor is not especially high (1.065). I used perturbation analysis to evaluate collinearity: library(perturb)
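For readers following along, a hedged sketch of the two diagnostics the perturb package offers, continuing the poster's model (here 'fit' is a placeholder for the lm object fitted to y ~ x1 + x2):

    ## Placeholder sketch; 'fit' stands for the poster's fitted lm.
    library(perturb)

    cd <- colldiag(fit)   # condition indexes and variance-decomposition proportions
    cd                    # condition indexes above roughly 30 usually signal trouble

The package's perturb() function itself re-fits the model after adding small random noise to the collinear predictors and reports how much the coefficients move; see ?perturb for its exact arguments.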
2010 Jan 20
2
simulation of binary data
Hi, could someone help me with a question about simulating logistic regression data with a multicollinearity effect and a high-leverage point? Thank you
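A hedged sketch of one way to do this: draw correlated predictors with MASS::mvrnorm, generate a Bernoulli response from a logit model, and then append one deliberately extreme row to act as a high-leverage point. All of the numbers below are arbitrary choices.

    ## Correlated predictors (multicollinearity), logistic response, plus one
    ## manually added extreme row as a high-leverage point.
    library(MASS)

    set.seed(123)
    n     <- 200
    Sigma <- matrix(c(1, 0.9, 0.9, 1), 2, 2)   # strong correlation between x1 and x2
    X     <- mvrnorm(n, mu = c(0, 0), Sigma = Sigma)
    eta   <- -0.5 + 1.5 * X[, 1] - 1.0 * X[, 2]
    y     <- rbinom(n, 1, plogis(eta))

    dat <- data.frame(y = y, x1 = X[, 1], x2 = X[, 2])
    dat <- rbind(dat, data.frame(y = 1, x1 = 6, x2 = -6))  # the high-leverage point

    fit <- glm(y ~ x1 + x2, family = binomial, data = dat)
    which.max(hatvalues(fit))   # the added row should dominate the leverage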
2004 Jun 11
1
Regression query: steps for model building
Hi, I have a set of data with both quantitative and categorical predictors. After scaling the response variable, I looked for multicollinearity (VIF values) among the predictors and removed the predictors that were hiding some of the other significant predictors. I'm curious to know whether the predictors that are not significant in a simple 'lm' will be involved in
2012 Mar 07
2
Problems with generalized linear model (glm) coefficients.
Hello everyone. I'm writing because I'm feeling a bit frustrated with my work. My work consists of finding the relation between the number of fires and the weather, so my response variable is the number of fires in a fire season and the explanatory variables are temperature, the amount of precipitation, and some others. My problem is this: I keep getting the wrong sign in the
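A hedged, simulated illustration of one common cause of a "wrong" sign: when two predictors are strongly correlated, the marginal and the adjusted coefficients for the same variable can differ in sign. The weather-like names and all numbers below are invented, not the poster's data.

    ## Simulated illustration: temperature and precipitation are negatively
    ## correlated; fires increase with temperature and decrease with
    ## precipitation, yet the single-predictor fit can flip the apparent sign.
    set.seed(99)
    n      <- 300
    temp   <- rnorm(n)
    precip <- -0.8 * temp + rnorm(n, sd = 0.6)
    fires  <- rpois(n, exp(0.5 + 0.4 * temp + 0.6 * precip))

    coef(glm(fires ~ temp,          family = poisson))  # marginal effect of temp
    coef(glm(fires ~ temp + precip, family = poisson))  # adjusted effect of temp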
2012 Mar 05
1
Nagelkerke R2
Dear R community, I'm working with a generalized linear model in which the response variable is categorical and the predictive variables are weather conditions. I have 250 different places where I need to fit the model. In some of these places I have strong correlations between some of the variables, so I need to deal with this problem. I found a study similar to mine where they use tha
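Since the thread title asks about Nagelkerke's R2, here is a hedged sketch of computing it directly from the log-likelihoods of the fitted and the intercept-only model; 'fit' is a placeholder for one of the poster's fitted binomial glm objects.

    ## Nagelkerke R^2 from first principles; 'fit' is a placeholder glm.
    nagelkerke_r2 <- function(fit) {
      ll1  <- as.numeric(logLik(fit))
      ll0  <- as.numeric(logLik(update(fit, . ~ 1)))  # intercept-only model
      n    <- nobs(fit)
      r2cs <- 1 - exp((2 / n) * (ll0 - ll1))          # Cox & Snell R^2
      r2cs / (1 - exp(2 * ll0 / n))                   # Nagelkerke rescaling
    }

    nagelkerke_r2(fit)

As far as I recall, rms::lrm reports the same quantity as "R2" in its printed statistics, which can serve as a cross-check.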
2012 Feb 10
1
Trust in glm.nb model results when the iteration limit is reached
Hello everyone. I'm fitting a glm.nb model to count data, using about 8 predictive variables. When I run the script I do get a result, but it tells me that the iteration limit has been reached. So, can I trust the results given by the model? Could it be a multicollinearity problem? Thank you for taking the time to help me. Greetings, Lucas.
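A hedged first step in this situation is simply to raise the iteration limit through control = glm.control(maxit = ...), turn on the trace, and then check whether theta and the coefficients have stabilized before trusting the fit. The sketch below uses the built-in quine data from MASS rather than the poster's data.

    ## Sketch on the built-in 'quine' data from MASS, not the poster's data.
    library(MASS)

    fit <- glm.nb(Days ~ Sex + Age + Eth + Lrn, data = quine,
                  control = glm.control(maxit = 200, trace = TRUE))

    fit$theta     # estimated dispersion parameter
    fit$th.warn   # NULL if the theta estimation converged cleanly
    summary(fit)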