similar to: Changing design matrix in glm

Displaying 20 results from an estimated 70000 matches similar to: "Changing design matrix in glm"

2010 Jun 30
1
parameterization of glm nested design
Dear R community, I am new to R, a reforming SAS user :) I am running R 2.10.1 on a Windows XP machine. I would like to write linear functions of my coefficient parameter estimates from a glm, but am having a difficult time understanding the parameterization R uses. In the toy example below I am running a glm on binomial data, with clones and lines within clones as fixed effects, each with 6
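A minimal sketch of the nested coding in question, with invented data (the poster's data are not shown): two clones, three lines nested within each clone, a binomial response, and R's default treatment contrasts. Inspecting the model matrix is usually the quickest way to see how R parameterizes the nested terms.

    set.seed(1)
    d <- data.frame(clone = gl(2, 30, labels = c("c1", "c2")),
                    line  = gl(3, 10, 60, labels = c("l1", "l2", "l3")),
                    y     = rbinom(60, 1, 0.5))
    fit <- glm(y ~ clone / line, family = binomial, data = d)  # lines nested within clones
    model.matrix(fit)[1:3, ]  # shows exactly which dummy columns R builds
    coef(fit)                 # each estimate is an offset from the reference level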
2008 Mar 17
1
Std errors in glm models w/ and w/o intercept
I am doing a reanalysis of results that have previously been published. My hope was to demonstrate the value of adopting more modern regression methods in preference to the traditional approach of univariate stratification. I have encountered a puzzle regarding differences between what I thought would be two equivalent analyses. Using a single factor, I compare Poisson models with and without
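A hedged sketch of the comparison with made-up data: the same single-factor Poisson model fitted with and without an intercept. The two fits are equivalent (identical fitted values), but the reported standard errors differ because the coefficients mean different things: with an intercept they are contrasts against the reference level, without one they are per-level log rates.

    set.seed(2)
    d <- data.frame(f = gl(3, 40), y = rpois(120, lambda = 5))
    m1 <- glm(y ~ f,     family = poisson, data = d)  # intercept plus contrasts
    m2 <- glm(y ~ f - 1, family = poisson, data = d)  # one coefficient per level
    all.equal(fitted(m1), fitted(m2))                 # TRUE: same fitted values
    summary(m1)$coefficients
    summary(m2)$coefficients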
2009 Nov 02
1
Interaction contrasts or posthoc test for glm (MASS) with ANOVA design
Dear R experts I am running a negative-binomial GLM (glm.nb) to test the null hypotheses that species 1 and 2 are equally abundant between site 1 and site2, and between each other. So, I have a 2x2 factorial design with factors Site (1,2) and Taxon (1,2). Since the Site:Taxon interaction is significant, I need to do the equivalent to a "post-hoc test" for ANOVA, however, the same tests
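One way to get the simple-effect comparisons, shown as a sketch with invented counts (not the poster's data): refit after releveling so the taxon contrast is evaluated at each site in turn. Packages such as multcomp or emmeans can compute all pairwise contrasts with multiplicity adjustment; only the releveling idea is shown here.

    library(MASS)
    set.seed(3)
    d <- expand.grid(site = factor(1:2), taxon = factor(1:2), rep = 1:20)
    d$count <- rnbinom(nrow(d), mu = 10, size = 2)
    m1 <- glm.nb(count ~ site * taxon, data = d)
    summary(m1)$coefficients          # taxon contrast at the reference site
    d$site2 <- relevel(d$site, "2")
    m2 <- glm.nb(count ~ site2 * taxon, data = d)
    summary(m2)$coefficients          # taxon contrast at the other site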
2005 Jul 27
1
Question on glm for Poisson distribution.
Good afternoon, I have REALLY tried to answer my question on my own, searching the huge pile of papers on my desk and the Internet, but I can't find the solution. Would you mind giving me some help? Please. ######################################### I'm trying to use glm with factors: > Pyr.1.glm <- glm(Pyrale ~ Trait, DataRav, family = poisson) If I have correctly
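A minimal sketch mirroring the call in the post, with invented data (DataRav, Pyrale and Trait are the poster's names; the values are not theirs): a Poisson GLM with a single factor predictor. The coefficients are on the log scale, so exp() turns them into a baseline rate for the reference level of Trait and rate ratios for the other levels.

    set.seed(4)
    DataRav <- data.frame(Trait  = gl(3, 25, labels = c("A", "B", "C")),
                          Pyrale = rpois(75, lambda = 4))
    Pyr.1.glm <- glm(Pyrale ~ Trait, data = DataRav, family = poisson)
    summary(Pyr.1.glm)
    exp(coef(Pyr.1.glm))  # rate for level A, then rate ratios B/A and C/A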
2009 Mar 17
2
bigglm() results different from glm()
Dear all, I am using bigglm() from the biglm package to fit a few GLMs to a large dataset (3 million rows, 6 columns). While trying to fit a Poisson GLM I noticed that the coefficient estimates were very different from what I obtained when estimating the model on a smaller dataset using glm(), so I wrote a very basic toy example to compare the results of bigglm() against a glm() call. Consider the
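A hedged sketch of the kind of comparison described, on toy data: the same Poisson model fitted with glm() and with bigglm(). One common source of such discrepancies is that bigglm() performs only a few bounded-memory iterations by default, so raising maxit is worth trying; treat the exact argument names here as assumptions to be checked against ?bigglm.

    library(biglm)
    set.seed(5)
    d <- data.frame(x = rnorm(1e4))
    d$y <- rpois(1e4, exp(0.5 + 0.3 * d$x))
    m.glm <- glm(y ~ x, family = poisson, data = d)
    m.big <- bigglm(y ~ x, data = d, family = poisson(), chunksize = 2000, maxit = 20)
    coef(m.glm)
    coef(m.big)  # should agree closely once the iterations have converged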
2000 Dec 18
3
problems with glm (PR#771)
R1.2.0 with Linux RH5.2 I do not believe that the problems below are new to 1.2 but I only cover this sort of thing once a year in my course and some of that happened to be last Friday so too late to report for 1.2. I see that one or two things that I was going to report have been corrected. I like the fact that interactions now show : instead of . Here is some output with comments inserted. R
2008 Jul 07
1
GLM, LMER, GEE interpretation
Hi, my dependent variable is a proportion ("prob.bind"), and the independent variables are a factor for group membership ("group") and a covariate ("capacity"). I am interested in the effects of group, capacity, and their interaction. Each subject is observed on all (4) levels of capacity (I use capacity as a covariate because the effect of this variable is normatively
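Since each subject is observed at all four capacity levels, one option among the approaches named in the subject line is a binomial mixed model with a subject random effect. This is only a sketch with invented data and names (glmer() is from the lme4 package; the two-column response encodes the successes and failures behind the prob.bind proportion).

    library(lme4)
    set.seed(6)
    d <- expand.grid(subject = factor(1:20), capacity = 1:4)
    d$group  <- factor(ifelse(as.numeric(as.character(d$subject)) <= 10, "a", "b"))
    d$trials <- 50
    d$succ   <- rbinom(nrow(d), d$trials, 0.5)
    m <- glmer(cbind(succ, trials - succ) ~ group * capacity + (1 | subject),
               family = binomial, data = d)
    summary(m)  # group, capacity and their interaction, with a per-subject intercept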
2012 Sep 29
1
Unexpected behavior with weights in binomial glm()
Hi useRs, I'm experiencing something quite weird with glm() and weights, and maybe someone can explain what I'm doing wrong. I have a dataset where each row represents a single case, and I run glm(...,family="binomial") and get my coefficients. However, some of my cases have the exact same values for predictor variables, so I should be able to aggregate up my data frame and
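A small sketch of the equivalence being described, on toy data: one row per case versus data aggregated over identical predictor values, with the proportion of successes as the response and the group size as the prior weight. The coefficient estimates agree; quantities such as the residual deviance and AIC differ by construction.

    set.seed(7)
    cases <- data.frame(x = rep(c(0, 1, 2), each = 50))
    cases$y <- rbinom(nrow(cases), 1, plogis(-1 + 0.8 * cases$x))
    agg   <- aggregate(y ~ x, data = cases, FUN = mean)  # proportion of 1s per x
    agg$n <- as.vector(table(cases$x))                   # number of cases per x
    m1 <- glm(y ~ x, family = binomial, data = cases)
    m2 <- glm(y ~ x, family = binomial, data = agg, weights = n)
    coef(m1)
    coef(m2)  # identical coefficients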
2009 Apr 24
2
prediction intervals (alpha and beta) for model average estimates from binomial glm and model.avg (library=dRedging)
Hi all, I was wondering if there is a function out there, or whether someone has written code, for making confidence intervals around model-averaged predictions (y ~ α + βx). The model-averaged estimates are from the dRedging library. It seems a common thing, but I can't seem to find one via the search engines. Examples of the models are: fit1 <- glm(y ~ dbh, family = binomial, data = data) fit2 <-
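Not the dRedging workflow itself, just a hedged sketch of one building block on invented data: an approximate pointwise confidence band for a single binomial GLM, computed on the link scale with predict(..., se.fit = TRUE) and back-transformed. Model-averaged intervals would additionally combine such predictions across the candidate models using their weights (for example via the MuMIn package).

    set.seed(8)
    data <- data.frame(dbh = runif(100, 5, 50))
    data$y <- rbinom(100, 1, plogis(-3 + 0.1 * data$dbh))
    fit1 <- glm(y ~ dbh, family = binomial, data = data)
    new  <- data.frame(dbh = seq(5, 50, length.out = 20))
    p    <- predict(fit1, newdata = new, type = "link", se.fit = TRUE)
    ci   <- plogis(cbind(fit   = p$fit,
                         lower = p$fit - 1.96 * p$se.fit,
                         upper = p$fit + 1.96 * p$se.fit))
    head(ci)  # fitted probability with an approximate 95% band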
2018 Feb 16
2
SE for all levels (including reference) of a factor after a GLM
Dear R-ers, I am trying to get the standard error of the fitted parameters for factors with a glm, even for the reference level: a <- runif(100) b <- sample(x=c("0", "1", "2"), size=100, replace = TRUE) df <- data.frame(A=a, B=b, stringsAsFactors = FALSE) g <- glm(a ~ b, data=df) summary(g)$coefficients # I don't get an SE for the reference level, here 0:
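Two hedged ways to attach a standard error to every level, reusing the poster's toy variables (note that the models below use the data-frame columns A and B rather than the free-standing vectors): either drop the intercept so each coefficient is a level mean with its own SE, or predict at each level with se.fit = TRUE.

    a <- runif(100)
    b <- sample(x = c("0", "1", "2"), size = 100, replace = TRUE)
    df <- data.frame(A = a, B = b, stringsAsFactors = FALSE)
    g0 <- glm(A ~ B - 1, data = df)  # no intercept: one mean and one SE per level
    summary(g0)$coefficients
    g1 <- glm(A ~ B, data = df)
    predict(g1, newdata = data.frame(B = c("0", "1", "2")), se.fit = TRUE)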
2011 Nov 20
3
logistic regression by glm
Hi, I use glm in R to do logistic regression and treat both the response and the predictors as factors. In my first try: ******************************************************************************* Call: glm(formula = as.factor(diagnostic) ~ as.factor(7161521) + as.factor(2281517), family = binomial()) Deviance Residuals: Min 1Q Median 3Q Max -1.5370 -1.0431 -0.9416 1.3065 1.4331 Coefficients:
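A hedged side note with invented data: rather than wrapping columns in as.factor() inside the formula, it is usually clearer to convert them in the data frame first, so the coefficient labels in the summary stay readable (the SNP-style column names below are made up).

    set.seed(9)
    d <- data.frame(diagnostic = rbinom(200, 1, 0.5),
                    snp1 = factor(sample(0:2, 200, replace = TRUE)),
                    snp2 = factor(sample(0:2, 200, replace = TRUE)))
    m <- glm(factor(diagnostic) ~ snp1 + snp2, family = binomial(), data = d)
    summary(m)$coefficients  # log odds ratios against the level coded 0 of each SNP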
2008 Jan 05
2
Behavior of ordered factors in glm
I have a variable which is roughly age categories in decades. In the original data, it came in coded: > str(xxx) 'data.frame': 58271 obs. of 29 variables: $ issuecat : Factor w/ 5 levels "0 - 39","40 - 49",..: 1 1 1 1... snip I then defined issuecat as ordered: > xxx$issuecat<-as.ordered(xxx$issuecat) When I include issuecat in a glm model, the result
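A short sketch of what changes, with invented data (only the first two level labels come from the post): an ordered factor is coded with polynomial contrasts (.L, .Q, ...) by default, which is why the coefficients look so different after as.ordered(). Treatment coding can be restored for a single model via the contrasts argument.

    set.seed(10)
    d <- data.frame(issuecat = gl(5, 20, labels = c("0 - 39", "40 - 49", "50 - 59",
                                                    "60 - 69", "70+")),
                    y = rnorm(100))
    d$issuecat <- as.ordered(d$issuecat)
    coef(glm(y ~ issuecat, data = d))  # issuecat.L, .Q, .C, ^4: polynomial contrasts
    coef(glm(y ~ issuecat, data = d,
             contrasts = list(issuecat = "contr.treatment")))  # usual dummy coding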
2018 Feb 16
0
SE for all levels (including reference) of a factor after a GLM
This is really a statistical issue. What do you think the Intercept term represents? See ?contrasts. Cheers, Bert Bert Gunter "The trouble with having an open mind is that people keep coming along and sticking things into it." -- Opus (aka Berkeley Breathed in his "Bloom County" comic strip ) On Thu, Feb 15, 2018 at 5:27 PM, Marc Girondot via R-help < r-help at
2017 Sep 13
3
vcov and survival
Dear Terry, Even the behaviour of lm() and glm() isn't entirely consistent. In both cases, singularity results in NA coefficients by default, and these are reported in the model summary and coefficient vector, but not in the coefficient covariance matrix: ---------------- > mod.lm <- lm(Employed ~ GNP + Population + I(GNP + Population), + data=longley) >
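Completing the quoted example (the longley data ship with R): the aliased term gets an NA coefficient, which shows up in coef() and in the summary but is dropped from vcov(), so the two objects have different dimensions.

    mod.lm <- lm(Employed ~ GNP + Population + I(GNP + Population), data = longley)
    coef(mod.lm)       # four coefficients, the aliased one reported as NA
    dim(vcov(mod.lm))  # 3 x 3: the aliased coefficient is omitted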
2009 Jul 03
2
bigglm() results different from glm()
Hi Sir, Thanks for making the package available to us. I am facing a few problems; could you give some hints? Problem 1: The model summary and residual deviance matched (in the mail below), but I didn't understand why the AIC is still different. > AIC(m1) [1] 532965 > AIC(m1big_longer) [1] 101442.9 Problem 2: the chunksize argument is there in bigglm but not in biglm; consequently,
2012 Nov 27
1
glm convergence warning
Hello, When I run the following glm model: modelresult=glm(CID~WS+SS+DV+DS, data=kimu, family=binomial) I get the following warning messages: 1: glm.fit: algorithm did not converge 2: glm.fit: fitted probabilities numerically 0 or 1 occurred What I am trying to do is model my response variable (CID: correct bird identification) as a function of the predictor variables weather state (WS), sea
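A tiny sketch of what typically lies behind the second warning, with invented data: when a predictor separates the 0s and 1s perfectly, the fitted probabilities are pushed to 0 or 1 and the coefficients diverge. Cross-tabulating CID against each predictor in the real data is one way to check for this.

    d <- data.frame(y = c(0, 0, 0, 0, 1, 1, 1, 1),
                    x = c(1, 2, 3, 4, 5, 6, 7, 8))  # x separates y perfectly
    m <- glm(y ~ x, data = d, family = binomial)    # warns about probabilities 0 or 1 (and often non-convergence)
    summary(m)$coefficients                         # huge estimates and standard errors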
2006 Mar 16
2
DIfference between weights options in lm GLm and gls.
Dear R-list users, can anyone explain exactly the difference between the weights options in lm, glm, and gls? I tried the following code, but the results are different. > lm1 Call: lm(formula = y ~ x) Coefficients: (Intercept) x 0.1183 7.3075 > lm2 Call: lm(formula = y ~ x, weights = W) Coefficients: (Intercept) x 0.04193 7.30660 > lm3 Call:
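A hedged sketch with toy data: for a gaussian family, lm() and glm() interpret weights identically, so the weighted coefficients agree. gls() from nlme does not take a numeric weights vector at all; it takes a variance function, and var(e_i) = sigma^2 / W_i corresponds to a variance covariate of 1/W. Treat the varFixed() form below as an assumption to verify against ?varFixed.

    library(nlme)
    set.seed(11)
    x <- rnorm(50)
    W <- runif(50, 0.5, 2)
    y <- 0.1 + 7.3 * x + rnorm(50, sd = 1 / sqrt(W))
    d <- data.frame(x, y, W, invW = 1 / W)
    coef(lm(y ~ x, data = d, weights = W))
    coef(glm(y ~ x, data = d, weights = W, family = gaussian))  # same as weighted lm
    coef(gls(y ~ x, data = d, weights = varFixed(~ invW)))      # same coefficients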
2012 Feb 28
1
Interpreting the Results of GLM
Hi, I'm wondering if you can help me; this is a really simple query, but I keep getting confused. I have run a GLM to see how boldness varies over time following a particular treatment. The results are as follows... Call: glm(formula = boldtwentyfour ~ treatment + boldcontrol) Deviance Residuals: Min 1Q Median 3Q Max -1.7577 -0.5469 0.0456 0.5515 1.5327
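One small point worth checking before interpreting this output: no family argument appears in the call, so glm() used its default gaussian family and the fit is an ordinary linear model. A sketch with invented data (the poster's variable names are reused, the values are not theirs):

    set.seed(12)
    d <- data.frame(boldcontrol = rnorm(30),
                    treatment   = gl(2, 15, labels = c("ctrl", "treat")))
    d$boldtwentyfour <- 1 + 0.5 * d$boldcontrol + rnorm(30)
    g <- glm(boldtwentyfour ~ treatment + boldcontrol, data = d)  # gaussian by default
    l <- lm(boldtwentyfour ~ treatment + boldcontrol, data = d)
    all.equal(coef(g), coef(l))  # TRUE: the glm here is just a linear model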
2008 Sep 03
2
ANCOVA/glm missing/ignored interaction combinations
Hi, I am using R version 2.7.2 on a Windows XP OS and have a question concerning an analysis of covariance with count data that I am trying to do. I will give details of a scaled-down version of the analysis (as I have more covariates and need to take account of over-dispersion, etc.), but I am sure it is only a simple problem that I just can't see how to fix. I have a data set with count
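A guess at the situation, offered only as a hedged sketch with invented data: if some factor-by-factor combinations have no observations at all, the corresponding interaction coefficients come back as NA because they are inestimable, which can look like R is ignoring those combinations.

    set.seed(13)
    d <- expand.grid(f = factor(c("a", "b", "c")), g = factor(c("x", "y")), rep = 1:10)
    d$count <- rpois(nrow(d), 5)
    d <- subset(d, !(f == "c" & g == "y"))       # one combination entirely absent
    m <- glm(count ~ f * g, family = poisson, data = d)
    coef(m)                                      # the fc:gy interaction is NA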