search for: deviances

Displaying 20 results from an estimated 1096 matches for "deviances".

2006 Apr 23
1
lme: null deviance, deviance due to the random effects, residual deviance
A maybe trivial and stupid question: in the case of an lm or glm fit, it is quite informative (to me) to have a look at the null deviance and the residual deviance of a model. These are generally provided by the print method or the summary, e.g.: Null Deviance: 658.8, Residual Deviance: 507.3, and (a bit simple-mindedly) I like to think of the proportion of deviance 'explained' by the
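For readers who want to reproduce the quantity the poster describes, here is a minimal sketch on made-up data (the model and data below are purely illustrative, not from the original post):

set.seed(1)
d <- data.frame(x = rnorm(100))
d$y <- rpois(100, exp(0.5 + 0.8 * d$x))
fit <- glm(y ~ x, family = poisson, data = d)

null.dev  <- fit$null.deviance   # deviance of the intercept-only model
resid.dev <- deviance(fit)       # residual deviance of the fitted model

1 - resid.dev / null.dev         # the "proportion of deviance explained"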
2011 Nov 10
1
Sum of the deviance explained by each term in a gam model does not equal the deviance explained by the full model.
Dear R users, I have read your methods of extracting the variance explained by each predictor in different places. My question is: using the method you suggested, the sum of the deviance explained by all terms is not equal to the deviance explained by the full model. Could you tell me what causes this problem? > set.seed(0) > n <- 400 > x1 <- runif(n, 0, 1) > ## to see problem
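The excerpt is cut off, but one common reason the per-term figures need not add up is that, with correlated covariates, the part of the fit shared between terms is attributed to neither of them when terms are dropped one at a time. A hedged sketch of this (mgcv; the data and model are illustrative, not the poster's code):

library(mgcv)
set.seed(0)
n  <- 400
x1 <- runif(n)
x2 <- x1 + rnorm(n, sd = 0.3)          # deliberately correlated with x1
y  <- sin(2 * pi * x1) + 0.5 * x2 + rnorm(n, sd = 0.3)

full <- gam(y ~ s(x1) + s(x2))
no1  <- gam(y ~ s(x2))                  # drop s(x1)
no2  <- gam(y ~ s(x1))                  # drop s(x2)
null <- gam(y ~ 1)

d.x1   <- (deviance(no1) - deviance(full)) / deviance(null)
d.x2   <- (deviance(no2) - deviance(full)) / deviance(null)
d.full <- 1 - deviance(full) / deviance(null)   # also summary(full)$dev.expl

c(sum.of.terms = d.x1 + d.x2, full.model = d.full)   # generally not equal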
2010 Jun 02
1
Problems using gamlss to model zero-inflated and overdispersed count data: "the global deviance is increasing"
Dear all, I am using gamlss (package gamlss version 4.0-0, R version 2.10.1, Windows XP Service Pack 3 on an HP EliteBook) to relate bird counts to habitat variables. However, most models fail because "the global deviance is increasing" and I am not sure what causes this behaviour. The dataset consists of counts of birds (duck) and 5 habitat variables measured in the field (n = 182). The dependent
2010 Jun 04
0
GLM with Negative Binomial
Dear all, I am asking about the following problem. Response variable: number of insect feeding probes (picadas). Factors: 1) insect species (especie), 2) insect developmental stage (edad), and 3) fruit developmental stage (estado). Model: GLM with a negative binomial distribution as the random component. Model selection strategy: evaluation of the change
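A hedged sketch of the kind of model and selection strategy described; the data frame dat and the model formulas below are hypothetical stand-ins, not the poster's actual data or code:

library(MASS)   # glm.nb() for negative binomial GLMs

## dat is assumed to hold picadas (counts) plus especie, edad and estado (factors)
fit.full <- glm.nb(picadas ~ especie * edad * estado, data = dat)
fit.red  <- glm.nb(picadas ~ especie + edad + estado, data = dat)

## evaluate the change in deviance between the nested models
anova(fit.red, fit.full)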
2009 May 27
1
Deviance explained in GAMM, library mgcv
...-users, Obtaining the percentage of deviance explained when fitting a gam model using the mgcv library is straightforward: summary(object.gam)$dev.expl, or alternatively, using the deviance (deviance(object.gam)) of the null and the fitted models and then taking 1 minus the quotient of deviances. However, when a gamm (generalized additive mixed model) is fitted, the deviance is not displayed, and only the logLik of the underlying lme model can be derived (logLik(object.gamm$lme)), which is not enough to derive the percentage deviance explained because the logLik for the saturated mod...
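For the plain gam case, the two routes the poster mentions look like this (mgcv; the simulated data from gamSim() are used only for illustration):

library(mgcv)
set.seed(2)
dat <- gamSim(1, n = 200)                 # example data shipped with mgcv
b   <- gam(y ~ s(x0) + s(x1), data = dat)

summary(b)$dev.expl                       # reported directly
b0 <- gam(y ~ 1, data = dat)              # null (intercept-only) model
1 - deviance(b) / deviance(b0)            # 1 minus the quotient of deviances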
2007 Oct 08
2
variance explained by each term in a GAM
Hello fellow R's, I do apologize if this is a basic question. I'm doing some GAMs using the mgcv package, and I am wondering what is the most appropriate way to determine how much of the variability in the dependent variable is explained by each term in the model. The information provided by summary.gam() relates to the significance of each term (F, p-value) and to the
2005 Jul 08
1
explained deviance in multinom
...weights=total, data=mydata) null.model <- multinom(cbind(factor1, factor2 ,., factor5) ~ +1, weights=total, data=mydata) Then I calculated pseudoR^2 = 1 - full.model$deviance / null.model$deviance. I am obtaining very low values for pseudoR^2 (there is not much difference between the deviances of the two models). full.model fits the data very well (graphically), so I think the problem is in the null.model (maybe it is not well defined) or in the calculation of pseudoR^2. Can someone please give me some suggestions about this? Thanks in advance alex
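A self-contained sketch of the pseudo-R^2 calculation described, on toy data (the grouping variable and sample below are made up, not the poster's mydata):

library(nnet)
set.seed(3)
toy <- data.frame(g = factor(sample(letters[1:3], 300, replace = TRUE)),
                  x = rnorm(300))

full.model <- multinom(g ~ x, data = toy, trace = FALSE)
null.model <- multinom(g ~ 1, data = toy, trace = FALSE)

## McFadden-style pseudo R^2; with a predictor unrelated to the response,
## as in this toy example, the value is close to zero
1 - full.model$deviance / null.model$deviance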
2009 Feb 16
1
Overdispersion with binomial distribution
I am attempting to run a glm with a binomial model to analyze proportion data. I have been following Crawley's book closely and am wondering whether there is an accepted standard for how much overdispersion is too much (e.g. a change in AIC has an accepted standard of 2). In the example, he fits several models, binomial and quasibinomial, and then accepts the quasibinomial. The output for residual
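One commonly computed diagnostic in this situation (a rule of thumb, not an official standard) is the ratio of residual deviance to residual degrees of freedom; a hedged sketch on made-up proportion data:

set.seed(4)
n   <- 30
tot <- rep(20, n)
x   <- runif(n)
y   <- rbinom(n, tot, plogis(-1 + 2 * x))

fit.bin   <- glm(cbind(y, tot - y) ~ x, family = binomial)
fit.quasi <- glm(cbind(y, tot - y) ~ x, family = quasibinomial)

deviance(fit.bin) / df.residual(fit.bin)   # values well above 1 suggest overdispersion
summary(fit.quasi)$dispersion              # dispersion estimated by the quasibinomial fit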
2005 Dec 14
3
Fitting binomial lmer-model, high deviance and low logLik
...z value Pr(>|z|) (Intercept) -2.55690 0.98895 -2.58548 0.009724 ** roefoxratio 0.50968 0.59810 0.85216 0.394127 The tolerance value in this model represents 0.1051 on my machine. Does anyone have advice on how to handle such problems? I find the tolerance needed to achieve reasonable deviances rather high, which makes me not too confident about the estimates and the model. Using the other methods ("Laplace" or "AGQ") did not help. My system is Windows 2000, > version _ platform i386-pc-mingw32 arch i386 os mingw32 system i386, mingw32 statu...
2011 Apr 08
1
multinom() residual deviance
...are with the saturated model. For me as a beginner, this sounds like an important warning; however, I don't know what exactly the warning means and hence do not know what the difference is between the residual deviance of the former (binary) and the latter (multinomial) model. (I need the deviances to calculate some of the pseudo R-squares with the function pR2(), package "pscl".) Could you give some advice? Thanks *S* -- Sascha Vieweg, saschaview at gmail.com
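For reference, pR2() itself only needs the fitted object; a toy sketch, assuming (as the post implies) that pR2() accepts multinom fits, with data made up here for illustration:

library(nnet)
library(pscl)
set.seed(5)
toy <- data.frame(g = factor(sample(1:3, 200, replace = TRUE)), x = rnorm(200))
m   <- multinom(g ~ x, data = toy, trace = FALSE)

pR2(m)   # log-likelihoods plus McFadden, ML and Cragg-Uhler pseudo R-squares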
2011 Mar 11
0
variance explained by each term in a GAM
Picking up an ancient thread (from Oct 2007), I have a somewhat more complex problem than given in Simon Wood's example below. My full model has more than two smooths as well as factor variables as in this simplified example: b <- gam(y~fv1+s(x1)+s(x2)+s(x3)) Judging from Simon's example, my guess is to fit reduced models to get components of deviance as follows: b1 <-
2001 Feb 15
2
deviance vs entropy
Hello, The question looks simple. It is probably even stupid. But I have spent several hours searching the Internet and downloaded tons of papers where deviance is mentioned and... and have not found an answer. The use of entropy when I split some node of a classification tree is clear to me; the sense is clear, because entropy is a good old measure of how uniform a distribution is.
2007 Apr 03
1
Calculating DIC from MCMC output
Greetings all, I'm a newcomer to Bayesian stats, and I'm trying to calculate the Deviance Information Criterion "by hand" from some MCMC output. However, having consulted several sources, I am left confused as to the exact terms to use. The most common formula can be written as DIC = 2*Mean(Deviance over the whole sampled posterior distribution) - Deviance(Mean
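A hedged "by hand" sketch of that formula, assuming you already have posterior draws and a function for the deviance; the normal model below with known sd is purely illustrative:

set.seed(6)
y <- rnorm(50, mean = 2)

dev.fun <- function(mu) -2 * sum(dnorm(y, mean = mu, sd = 1, log = TRUE))

## stand-in for MCMC output: posterior draws of the single parameter mu
mu.draws <- rnorm(4000, mean = mean(y), sd = 1 / sqrt(length(y)))

D.bar <- mean(sapply(mu.draws, dev.fun))   # mean deviance over the sampled posterior
D.hat <- dev.fun(mean(mu.draws))           # deviance at the posterior mean
c(DIC = 2 * D.bar - D.hat,                 # the formula quoted in the post
  pD  = D.bar - D.hat)                     # effective number of parameters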
2006 Nov 13
3
Profile confidence intervals and LR chi-square test
System: R 2.3.1 on a Windows XP machine. I am building a logistic regression model for a sample of 100 cases in dataframe "d", in which there are 3 binary covariates: x1, x2 and x3.
----------------
> summary(d)
 y      x1     x2     x3
 0:54   0:50   0:64   0:78
 1:46   1:50   1:36   1:22
> fit <- glm(y ~ x1 + x2 + x3, data=d, family=binomial(link=logit))
>
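The two quantities in the subject line are usually obtained as below (a sketch; the toy data stand in for the poster's dataframe d, whose contents are not available here):

library(MASS)    # supplies the profile-likelihood confint() method for glm fits
set.seed(7)
d <- data.frame(x1 = factor(rbinom(100, 1, 0.5)),
                x2 = factor(rbinom(100, 1, 0.4)),
                x3 = factor(rbinom(100, 1, 0.2)))
d$y <- factor(rbinom(100, 1, plogis(-0.5 + 1.2 * (d$x1 == "1"))))

fit <- glm(y ~ x1 + x2 + x3, data = d, family = binomial(link = logit))

confint(fit)                 # profile-likelihood confidence intervals
drop1(fit, test = "Chisq")   # likelihood-ratio chi-square test for each term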
2011 Jun 13
1
glm with binomial errors - problem with overdispersion
Dear all, I am new to R and my question may be trivial to you... I am doing a GLM with binomial errors to compare proportions of species in different categories of seed sizes (4 categories) between 2 sites. In the model summary the residual deviance is much higher than the degrees of freedom (Residual deviance: 153.74 on 4 degrees of freedom), and even after correcting for overdispersion by
2008 Nov 12
1
Understanding glm family documentation: dev.resids
Hi all, Consider the family function, as used by glm. The help page says the value of the family object is a list, one element of which is the following: dev.resids: function giving the deviance residuals as a function of (y, mu, wt). But reading any of the family functions (e.g. poisson) shows that dev.resids is a function that computes the *square* of the deviance residuals (at least, by
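A quick way to check the point being made, using the binomial family as the example (toy data, chosen only for illustration):

fit <- glm(cbind(c(3, 5, 9), c(7, 5, 1)) ~ 1, family = binomial)

y  <- fit$y               # observed proportions
mu <- fitted(fit)
wt <- fit$prior.weights   # binomial totals

d2 <- binomial()$dev.resids(y, mu, wt)   # squared deviance-residual contributions
all.equal(sum(d2), deviance(fit))        # they sum to the reported deviance
all.equal(sign(y - mu) * sqrt(d2),
          residuals(fit, type = "deviance"))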
2012 Jan 16
1
GAM without intercept reports a huge deviance
Hi all, I constructed a GAM model with a linear term and two smooth terms, all of them statistically significant, but the intercept was not significant. The adjusted r2 of this model is 0.572 and the deviance 65.3. I decided to run the model again without an intercept, so I used the following instruction in R: regression= gam(dependent~ +linear_independent +s(smooth_independent_1)
2006 Jan 14
2
initialize expression in 'quasi' (PR#8486)
This is not so much a bug as an infelicity in the code that can easily be fixed. The initialize expression in the quasi family function is (uniformly for all links and all variance functions):

initialize <- expression({
    n <- rep.int(1, nobs)
    mustart <- y + 0.1 * (y == 0)
})

This is inappropriate (and often fails) for the variance function "mu(1-mu)".
2006 Mar 16
2
Difference between weights options in lm, glm and gls.
Dear R-List users, Can anyone explain exactly the difference between the weights options in lm, glm and gls? I tried the following code, but the results are different.
> lm1
Call: lm(formula = y ~ x)
Coefficients:
(Intercept)        x
     0.1183   7.3075
> lm2
Call: lm(formula = y ~ x, weights = W)
Coefficients:
(Intercept)        x
    0.04193  7.30660
> lm3
Call:
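One way to line the three functions up, under the assumption that the weights in lm()/glm() are precision (inverse-variance) weights while gls() takes a variance-function object; the data and the weight vector W below are made up for illustration:

library(nlme)
set.seed(8)
dat <- data.frame(x = runif(50), W = runif(50, 0.5, 2))
dat$y    <- 0.1 + 7.3 * dat$x + rnorm(50, sd = 0.3)
dat$invW <- 1 / dat$W                                 # variance taken proportional to 1/W

lm(y ~ x, data = dat, weights = W)                    # inverse-variance weights
glm(y ~ x, data = dat, weights = W)                   # same weighting, gaussian family
gls(y ~ x, data = dat, weights = varFixed(~ invW))    # should reproduce the lm coefficients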
2002 Jan 04
1
glm deviance question
I am comparing the Splus and R fits of a simple glm. In the following, foo is generated from rbinom with size = 20 and p = 0.5. The coefficients (and SEs) of the fitted models are the same, but the estimated deviances are quite different. Could someone please tell me why they are so different? I am using R version 1.3.1 and Splus 2000 release 3 on Windows 2000. ++++++++++++++++++++++ foo <- c(9, 4, 10, 7, 11, 13, 8, 6, 11, 14, 11, 10, 7, 9, 13, 7, 9, 6, 10, 10) foo.glm <- glm(cbind(fo...
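The call is truncated in the excerpt; a hypothetical reconstruction of the kind of fit being described (20 trials per observation, intercept-only model) might look like this:

foo <- c(9, 4, 10, 7, 11, 13, 8, 6, 11, 14, 11, 10, 7, 9, 13, 7, 9, 6, 10, 10)
## hypothetical: the actual right-hand side of the poster's call is not shown
foo.glm <- glm(cbind(foo, 20 - foo) ~ 1, family = binomial)
deviance(foo.glm)   # residual deviance as reported by R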