similar to: weights in glm for binomial model

Displaying 20 results from an estimated 50000 matches similar to: "weights in glm for binomial model"

2012 Sep 29
1
Unexpected behavior with weights in binomial glm()
Hi useRs, I'm experiencing something quite weird with glm() and weights, and maybe someone can explain what I'm doing wrong. I have a dataset where each row represents a single case, and I run glm(...,family="binomial") and get my coefficients. However, some of my cases have the exact same values for predictor variables, so I should be able to aggregate up my data frame and
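A minimal R sketch (with invented data) of the equivalence the poster expects: fitting the per-case 0/1 data and fitting the aggregated data, with the proportion of successes as the response and the number of collapsed cases as prior weights, gives the same coefficients; only the deviance and residual df change, because the saturated model changes.

# Invented per-case data: y is 0/1, x is the only predictor, rows repeat
dat <- data.frame(x = rep(c(0, 1, 2), each = 4),
                  y = c(0, 0, 1, 1,  0, 1, 1, 1,  1, 1, 1, 0))

fit_case <- glm(y ~ x, family = binomial, data = dat)

# Collapse identical predictor rows: n cases and proportion of successes p
agg <- aggregate(y ~ x, data = dat, FUN = function(z) c(n = length(z), p = mean(z)))
agg <- data.frame(x = agg$x, n = agg$y[, "n"], p = agg$y[, "p"])

# Aggregated fit: proportion response, number of cases as prior weights
fit_agg <- glm(p ~ x, family = binomial, weights = n, data = agg)

cbind(per_case = coef(fit_case), aggregated = coef(fit_agg))   # identical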
2010 Apr 16
2
Weights in binomial glm
I have some questions about the use of weights in binomial glm as I am not getting the results I would expect. In my case the weights I have can be seen as 'replicate weights'; one respondent i in my dataset corresponds to w[i] persons in the population. From the documentation of the glm method, I understand that the weights can indeed be used for this: "For a binomial GLM prior
2005 Aug 08
1
Help with "non-integer #successes in a binomial glm"
Hi, I ran a logit regression, but I don't really know how to handle the "Warning message: non-integer #successes in a binomial glm! in: eval(expr, envir, enclos)" problem. I ran the same logit regression without weights and it worked without the warning, but I figured it makes more sense to add the weights. The weights sum up to one. Could anyone give me a hint? Thanks a lot!
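A minimal sketch of why weights that sum to one trigger this warning (illustrative data only): for family = binomial, glm() interprets 'weights' as numbers of trials, so weights * y should be (near-)integer counts of successes, and normalized weights also shrink the implied sample size to 1, which distorts the standard errors.

set.seed(1)
y <- rbinom(50, 1, 0.4)                  # binary response
x <- rnorm(50)
w <- runif(50); w <- w / sum(w)          # weights that sum to one

fit <- glm(y ~ x, family = binomial, weights = w)
# Warning: non-integer #successes in a binomial glm!
# The check is on weights * y, which should be whole numbers of successes;
# rescaling the weights to sum to n rather than 1 is usually closer to what is meant.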
2014 Nov 14
3
How to apply weights to observations in a binomial GLM
Hello, I hope my message is clear, since this is the first time I have turned to this kind of help; here is my question: I have a dataset with 4505 observations in which the dependent variable consists of presences (n=97, coded as 1) and absences (n=4408, coded as 0). My first step was to fit a GLM with a balanced sample of absences and presences for the dependent variable, which is
2011 Feb 16
1
Saturated model in binomial glm
Hi all, Could somebody be so kind to explain to me what is the saturated model on which deviance and degrees of freedom are calculated when fitting a binomial glm? Everything makes sense if I fit the model using as response a vector of proportions or a two-column matrix. But when the response is a factor and counts are specified via the "weights" argument, I am kind of lost as far as
2007 Aug 14
1
glm(family=binomial) and lmer
Dear R users, I've noticed that there are two ways to conduct a binomial GLM with binomial counts using R. The first way is outlined by Michael Crawley in his "Statistical Computing" book (p 520-521): >dose=c(1,3,10,30,100) >dead = c(2,10,40,96,98) >batch=c(100,90,98,100,100) >response = cbind(dead,batch-dead) >model1=glm(response~log(dose),binomial)
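For reference, a hedged sketch of the equivalence between the two usual glm() specifications of these binomial counts (this does not address the lmer part of the question); the data are copied from the excerpt above:

dose  <- c(1, 3, 10, 30, 100)
dead  <- c(2, 10, 40, 96, 98)
batch <- c(100, 90, 98, 100, 100)

# Way 1: two-column matrix of (successes, failures)
m_matrix <- glm(cbind(dead, batch - dead) ~ log(dose), family = binomial)

# Way 2: proportion of successes with the number of trials as prior weights
m_prop <- glm(dead / batch ~ log(dose), family = binomial, weights = batch)

# The two parameterizations give identical coefficients and deviance
all.equal(coef(m_matrix), coef(m_prop))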
2009 Feb 07
1
Performing a glm binomial model
Hello R-users, My first question is that I would like to examine the relationship between a dependent variable and four environmental factors using a binomial GLM. The problem is that I also want to test the interactions between factors, and I would like to learn a way to do this automatically. For now, my syntax is, for instance: E1<-glm(formula = (ENF/TOT)~VAR1+VAR2+VAR3+VAR4, family =
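A hedged sketch of the formula shorthand that adds all two-way interactions automatically; the data frame below is an invented stand-in that only reuses the column names from the excerpt, and the weights/family choices are assumptions.

# Invented stand-in data with the same column names as in the excerpt
set.seed(9)
dat <- data.frame(TOT  = rep(50, 40),
                  ENF  = rbinom(40, 50, 0.3),
                  VAR1 = rnorm(40), VAR2 = rnorm(40),
                  VAR3 = rnorm(40), VAR4 = rnorm(40))

# "^2" expands to all main effects plus every two-way interaction
E1 <- glm(ENF/TOT ~ (VAR1 + VAR2 + VAR3 + VAR4)^2,
          family = binomial, weights = TOT, data = dat)
summary(E1)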
2005 Apr 11
1
glm family=binomial logistic sigmoid curve problem
I'm trying to plot an extrapolated logistic sigmoid curve using glm(..., family=binomial) as follows, but neither the fitted() points nor the predict()ed curve plot correctly: > year <- c(2003+(6/12), 2004+(2/12), 2004+(10/12), 2005+(4/12)) > percent <- c(0.31, 0.43, 0.47, 0.50) > plot(year, percent, xlim=c(2003, 2007), ylim=c(0, 1)) > lm <- lm(percent ~ year)
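A minimal sketch (an assumption about what is wanted here) of one way to get the extrapolated sigmoid onto that plot: fit on the proportion scale, predict over a fine grid of years, and ask predict() for the response (probability) scale. quasibinomial is used only to avoid the non-integer-successes warning, since no trial counts are given.

year    <- c(2003 + 6/12, 2004 + 2/12, 2004 + 10/12, 2005 + 4/12)
percent <- c(0.31, 0.43, 0.47, 0.50)

fit <- glm(percent ~ year, family = quasibinomial)

# Predict over a grid covering the plot range, on the probability scale
newdat <- data.frame(year = seq(2003, 2007, by = 0.05))
plot(year, percent, xlim = c(2003, 2007), ylim = c(0, 1))
lines(newdat$year, predict(fit, newdata = newdat, type = "response"))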
2009 Jan 06
4
Apparant bug in binomial model in GLM (PR#13434)
Full_Name: Søren Faurby Version: 2.4.1 and 2.7.2 OS: Submission from: (NULL) (192.38.46.92) There appears to be a bug in the estimation of significance in the binomial model in GLM. This bug apparently appears when the correlation between two variables is too strong, as in this dummy example: c(0,0,0,0,0,1,1,1,1,1)->a a->b m1<-glm(a~b, binomial) summary(m1) It is sufficient that all
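For context, a sketch reproducing the example: a and b are identical, so the data are completely separated. The maximum-likelihood estimate of the slope is infinite, the Wald standard error blows up, and the reported p-value is close to 1 (the Hauck-Donner effect), which looks like a significance bug but is expected behaviour of glm() under separation.

a <- c(0, 0, 0, 0, 0, 1, 1, 1, 1, 1)
b <- a
m1 <- glm(a ~ b, family = binomial)
# glm() typically warns: fitted probabilities numerically 0 or 1 occurred
summary(m1)
# The slope is huge with an enormous standard error, so the Wald test is
# uninformative; this is complete separation, not a bug in glm().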
2006 Apr 19
1
Trouble with glm() .... non-integer #successes in a binomial glm
Hi R-people: When I use the command to fit a model with an intercept, only: glm ( formula=haspdata ~ 1, data=dat, family=binomial, weights= dat$hy.wgt.s, subset=(dat$haspdat0!=3) ) I get the message: Warning message: non-integer #successes in a binomial glm! in: eval(expr, envir, enclos) Does anyone know what this means?? The data for this command is listed below. Thanks, Phil Smith CDC
2012 Dec 10
3
Warning message: In eval(expr, envir, enclos) : non-integer #successes in a binomial glm!
Hi there I'm trying to fit a logistic regression model to data that looks very similar to the data in the sample below. I don't understand why I'm getting this warning; none of the data are proportions and the weights are numeric values. Should I be concerned about the warning about non-integer successes in my binomial glm? If I should be, how do I go about addressing it? I'm
2006 Apr 09
1
logistic regression model with non-integer weights
When fitting a logistic regression model using weights I get the following warning > data.model.w <- glm(ABN ~ TR, family=binomial(logit), weights=WEIGHT) Warning message: non-integer #successes in a binomial glm! in: eval(expr, envir, enclos) Details follow *** I have a binary dependent variable of abnormality ABN = T, F, T, T, F, F, F... and a continuous predictor TR = 1.962752
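A hedged sketch of the usual explanation and one common workaround (the data below are made up): with a 0/1 response, glm() multiplies the response by the prior weights and expects whole numbers of successes, so non-integer weights trigger the warning. Switching to family = quasibinomial fits the same point estimates without that check; whether the resulting standard errors are the ones you want is a separate question (survey::svyglm is the design-based alternative).

set.seed(42)
ABN    <- rbinom(100, 1, 0.4)            # binary abnormality indicator
TR     <- rnorm(100, mean = 2)           # continuous predictor
WEIGHT <- runif(100, 0.5, 2.5)           # non-integer case weights

# binomial warns because WEIGHT * ABN is not a whole number of successes
m_binom <- glm(ABN ~ TR, family = binomial(logit), weights = WEIGHT)

# quasibinomial gives identical coefficients without the integer-count check
m_quasi <- glm(ABN ~ TR, family = quasibinomial(logit), weights = WEIGHT)
all.equal(coef(m_binom), coef(m_quasi))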
2007 Mar 20
1
How does glm(family='binomial') deal with perfect sucess?
Hi all, Trying to understand the logistic regression performed by glm (i.e. when family='binomial'), and I'm curious to know how it treats perfect success. That is, let's say I have the following summary data x=c(1,2,3,4,5,6) y=c(0,.04,.26,.76,.94,1) w=c(100,100,100,100,100,100) where y is the probability of success at each value of x, calculated across w observations.
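A brief sketch of what happens with those groups (using the data from the excerpt): with w trials per value of x, the y = 0 and y = 1 rows simply contribute 0 and 100 successes out of 100 trials, so the likelihood stays well behaved and the fitted probabilities at the ends approach, but never reach, 0 and 1. Estimates only diverge when the data are completely separated.

x <- c(1, 2, 3, 4, 5, 6)
y <- c(0, 0.04, 0.26, 0.76, 0.94, 1)   # observed proportion of successes
w <- c(100, 100, 100, 100, 100, 100)   # observations per value of x

# Proportion response with the trial counts as prior weights
fit <- glm(y ~ x, family = binomial, weights = w)
summary(fit)
fitted(fit)   # close to, but not exactly, 0 and 1 at the extremes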
2010 Dec 11
2
Specifying Prior Weights in a GLM
Hello R folks, I have three questions. I am trying to run a logistic regression (binomial family) where the response variable is a proportion. According to the R documentation, "for a binomial GLM prior weights are used to give the number of trials when the response is the proportion of successes." However, when I run my code I get the following error message: Error in
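Since the error message is cut off above, here is only a hedged sketch of the documented usage being quoted, with made-up data: the response is the proportion of successes and the prior weights are the numbers of trials, which is equivalent to the two-column (successes, failures) form.

d <- data.frame(x         = c(0.5, 1.0, 1.5, 2.0),
                successes = c(3, 8, 15, 18),
                trials    = c(20, 20, 20, 20))
d$prop <- d$successes / d$trials

# Proportion response, number of trials as prior weights
fit  <- glm(prop ~ x, family = binomial, weights = trials, data = d)

# Equivalent two-column specification
fit2 <- glm(cbind(successes, trials - successes) ~ x, family = binomial, data = d)
all.equal(coef(fit), coef(fit2))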
2014 Nov 14
3
How to apply weights to observations in a binomial GLM
Thanks for the help, Jose Luis, but either I did not understand you correctly or my question is so simple that I did not explain myself well. If I have not misunderstood your explanation, my problem is how to obtain that "tus.pesos" to plug in, for example, to the function: library(survey) # sampling design object ddatos <- svydesign(id=~1, weights =~ tus.pesos, data = tus.datos) # in the case of a reg
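To make the thread's survey-package suggestion concrete, a hedged, self-contained sketch (the data frame contents, outcome and predictor names below are invented stand-ins; only tus.datos, tus.pesos and ddatos come from the thread): the weights column goes into svydesign(), and svyglm() then fits the design-weighted logistic regression.

library(survey)

# Invented stand-in for 'tus.datos': presence/absence, one predictor, weights
set.seed(14)
tus.datos <- data.frame(presencia = rbinom(200, 1, 0.3),
                        var1      = rnorm(200),
                        tus.pesos = runif(200, 0.5, 3))

# Sampling design object, as in the thread
ddatos <- svydesign(id = ~1, weights = ~tus.pesos, data = tus.datos)

# Design-weighted logistic regression; quasibinomial avoids the
# non-integer-successes warning that family = binomial would emit
mod <- svyglm(presencia ~ var1, design = ddatos, family = quasibinomial())
summary(mod)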
2023 Oct 31
1
weights vs. offset (negative binomial regression)
[Please keep r-help in the cc: list] I don't quite know how to interpret the difference between specifying effort as an offset vs. as weights; I would have to spend more time thinking about it/working through it than I have available at the moment. I don't know that specifying effort as weights is *wrong*, but I don't know that it's right or what it is doing: if I were
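To make the offset-versus-weights distinction concrete, a hedged sketch with invented data (glm.nb from MASS, log link assumed): an offset of log(effort) models the count as a rate per unit effort, while passing effort as prior weights leaves the mean structure alone and only re-weights each observation's contribution to the likelihood, so the two fits answer different questions and generally give different estimates.

library(MASS)   # glm.nb

set.seed(7)
effort <- runif(100, 1, 10)
x      <- rnorm(100)
counts <- rnbinom(100, mu = effort * exp(0.3 + 0.5 * x), size = 2)
d <- data.frame(counts, x, effort)

# Effort as an offset: expected count = effort * exp(linear predictor)
m_offset  <- glm.nb(counts ~ x + offset(log(effort)), data = d)

# Effort as prior weights: same mean model, re-weighted likelihood
m_weights <- glm.nb(counts ~ x, weights = effort, data = d)

rbind(offset = coef(m_offset), weights = coef(m_weights))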
2012 Jun 24
1
MuMIn for GLM Negative Binomial Model
Hello I am not able to use the MuMIn package (version 1.7.7) for multimodel inference with a negative binomial GLM (it does work when I use a Poisson GLM). The negative binomial GLM gives the following error: Error in get.models(NBModel, subset = delta < 4) : object has no 'calls' attribute Here is the unsuccessful negative binomial code. > > BirdNegBin
2010 Dec 30
1
Different results in glm() probit model using vector vs. two-column matrix response
Hi - I am fitting a probit model using glm(), and the deviance and residual degrees of freedom are different depending on whether I use a binary response vector of length 80 or a two-column matrix response (10 rows) with the number of successes and failures in each column. I would think that these would be just two different ways of specifying the same model, but this does not appear to be the case.
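A sketch of why this happens, with invented data of the same shape (80 binary outcomes grouped into 10 rows): the coefficient estimates match, but the deviance and residual df are computed against the saturated model, which has one parameter per row of data, and that is 80 parameters in one parameterization and 10 in the other.

set.seed(3)
x_row <- seq(-2, 2, length.out = 10)       # one predictor value per group
x     <- rep(x_row, each = 8)              # expanded to 80 Bernoulli trials
y     <- rbinom(80, 1, pnorm(0.5 * x))

# Binary response vector of length 80
fit_binary <- glm(y ~ x, family = binomial(link = "probit"))

# The same data as a 10-row (successes, failures) matrix
succ <- tapply(y, rep(1:10, each = 8), sum)
fit_grouped <- glm(cbind(succ, 8 - succ) ~ x_row, family = binomial(link = "probit"))

cbind(binary = coef(fit_binary), grouped = coef(fit_grouped))      # identical
c(dev_binary = deviance(fit_binary), dev_grouped = deviance(fit_grouped))
c(df_binary = df.residual(fit_binary), df_grouped = df.residual(fit_grouped))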
2002 Mar 01
1
glm with binomial errors in R and GLIM
Hi all, In my ongoing transition from GLIM to R I am trying to fit a glm with binomial errors. The data file has 3 vectors: h -> the factor to be fitted (3 levels) d -> number of animals alive (the response) n -> total number of animals To model the proportion alive, use d/n. In GLIM: $yvar d$ $error binomial n$ $fit +h$ scale deviance = 25.730 (change = -9.138) at cycle 4
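A hedged sketch of the usual R translation of that GLIM specification (invented numbers; only h, d and n mirror the post): the binomial denominator n from "$error binomial n$" becomes either a two-column (alive, dead) response or the proportion d/n with n as prior weights, and the change in deviance for adding h comes from anova().

set.seed(11)
h <- factor(rep(c("A", "B", "C"), each = 4))          # 3-level factor
n <- rep(20, 12)                                       # total animals
d <- rbinom(12, size = n, prob = c(0.2, 0.5, 0.8)[as.integer(h)])  # alive

# R equivalent of GLIM's "$yvar d$  $error binomial n$  $fit +h$"
m1 <- glm(cbind(d, n - d) ~ h, family = binomial)

# Equivalent proportion-plus-weights form
m2 <- glm(d / n ~ h, family = binomial, weights = n)

anova(m1, test = "Chisq")   # change in deviance for adding h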
2009 Apr 24
2
prediction intervals (alpha and beta) for model average estimates from binomial glm and model.avg (library=dRedging)
Hi all, I was wondering if there is a function out there, or whether someone has written code, for making confidence intervals around model-averaged predictions (y ~ α + βx). The model-averaged estimates are from the dRedging library. It seems like a common thing, but I can't seem to find one via the search engines. Examples of the models are: fit1 <- glm(y~ dbh, family = binomial, data = data) fit2 <-