similar to: possible tweaking of family()$simulate?

Displaying 20 results from an estimated 10000 matches similar to: "possible tweaking of family()$simulate?"

2012 Feb 21
1
prior.weights and weights()
I'm wondering whether anyone has any insight into why the 'simulate' methods for the built-in glm() families (binomial, Poisson, Gamma ...) extract the prior weights using object$prior.weights rather than weights(object,"prior") ? At first I thought this was so that things work correctly when e.g. subset= and na.action=na.exclude are used. However, the current versions of
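A minimal sketch of the difference being asked about (toy data, not from the post): with na.action = na.exclude the accessor and the component can differ in length.

d <- data.frame(y = c(1, 0, 1, NA, 0), w = 1:5)
fit <- glm(y ~ 1, family = binomial, weights = w, data = d,
           na.action = na.exclude)
fit$prior.weights        ## weights for the rows actually used in the fit
weights(fit, "prior")    ## accessor; may pad the excluded row with NA (naresid)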
2009 Feb 12
3
proposed simulate.glm method
I have found the "simulate" method (incorporated in some packages) very handy. As far as I can tell the only class for which simulate is actually implemented in base R is lm ... this is actually a little dangerous for a naive user who might be tempted to try simulate(X) where X is a glm fit instead, because it defaults to simulate.lm (since glm inherits from the lm class), and the
2007 Aug 02
1
simulate() and glm fits
Dear All, I have been trying to simulate data from a fitted glm using the simulate() function (version details at the bottom). This works for lm() fits and even for lmer() fits (in lme4). However, for glm() fits its output does not make sense to me -- am I missing something or is this a bug? Consider the following count data, modelled as gaussian, poisson and binomial responses: counts
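A minimal sketch of that comparison (counts borrowed from the stats::glm help page, not the poster's data):

counts <- c(18, 17, 15, 20, 10, 20, 25, 13, 12)
f.gaus <- glm(counts ~ 1, family = gaussian)
f.pois <- glm(counts ~ 1, family = poisson)
head(simulate(f.gaus, nsim = 3))   ## continuous draws, can be negative
head(simulate(f.pois, nsim = 3))   ## non-negative integer counts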
2006 Mar 08
1
power and sample size for a GLM with Poisson response variable
Craig, Thanks for your follow-up note on using the asypow package. My problem was not only constructing the "constraints" vector but also that, for my particular situation (Poisson regression, two groups, sample sizes of (1081, 3180)), I get very different results using the asypow package compared to my other (home grown) approaches. library(asypow) pois.mean<-c(0.0065,0.0003) info.pois <-
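A minimal "home grown" sketch along those lines (simulation-based power for the two sample sizes in the post; the test used is an assumption, not the asypow calculation):

set.seed(1)
n <- c(1081, 3180); rate <- c(0.0065, 0.0003)
gr <- factor(rep(1:2, n))
pow <- mean(replicate(1000, {
  y <- c(rpois(n[1], rate[1]), rpois(n[2], rate[2]))
  fit <- glm(y ~ gr, family = poisson)
  ## groups with no events give huge Wald SEs and simply count as non-detections
  summary(fit)$coefficients["gr2", "Pr(>|z|)"] < 0.05
}))
pow   ## proportion of simulated data sets in which the group effect is detected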
2016 Jun 02
0
[RfC] Family dispersion
Hi, I'd like to hear your opinion about the following proposal to make the computation of dispersion in GLMs more flexible. Dispersion is used in summary.glm; the relevant code chunk with the dispersion calculation is listed below (from glm.R): summary.glm <- function(object, dispersion = NULL, correlation = FALSE, symbolic.cor = FALSE, ...) { est.disp <- FALSE df.r <-
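For reference, a minimal sketch of the moment estimate that summary.glm computes for families with estimated dispersion (paraphrased, not a verbatim copy of glm.R):

fit  <- glm(mpg ~ wt, data = mtcars, family = Gamma(link = "log"))
df.r <- fit$df.residual
sum((fit$weights * fit$residuals^2)[fit$weights > 0]) / df.r  ## working-residual form
summary(fit)$dispersion                                       ## should agree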
2018 Jun 17
1
aic() component in GLM-family objects
FWIW p. 206 of the White Book gives the following for names(binomial()): family, names, link, inverse, deriv, initialize, variance, deviance, weight. So $aic wasn't there In The Beginning. I haven't done any more archaeology to try to figure out when/by whom it was first introduced ... Section 6.3.3, on extending families, doesn't give any other relevant info. A patch for
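A quick check against a current R session (a sketch, not part of the original message):

names(binomial())   ## the current component list; "aic" is present today
args(binomial()$aic)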
2011 Feb 28
1
mixture models/latent class regression comparison
Dear list, I have been comparing the outputs of two packages for latent class regression, namely 'flexmix', and 'mmlcr'. What I have noticed is that the flexmix package appears to come up with a much better fit than the mmlcr package (based on logLik, AIC, BIC, and visual inspection). Has anyone else observed such behaviour? Has anyone else been successful in using the mmlcr
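A minimal flexmix sketch for comparing the criteria mentioned (simulated two-component data, an assumption, not the poster's data set):

library(flexmix)
set.seed(1)
n <- 200
d <- data.frame(x = runif(n))
cl <- rbinom(n, 1, 0.5)
d$y <- ifelse(cl == 1, 2 + 3 * d$x, -1 + 0.5 * d$x) + rnorm(n, sd = 0.3)
fit <- flexmix(y ~ x, data = d, k = 2)
c(logLik = logLik(fit), AIC = AIC(fit), BIC = BIC(fit))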
2019 Dec 27
0
"simulate" does not include variability in parameter estimation
On 26/12/2019 11:14 p.m., Spencer Graves wrote: > Hello, All: > > > The default "simulate" method for lm and glm seems to ignore the > sampling variance of the parameter estimates; see the trivial lm and > glm examples below. Both these examples estimate a mean with formula = > x~1. In both cases, the variance of the estimated mean is 1. That's how
2018 Jun 04
0
aic() component in GLM-family objects
>>>>> Ben Bolker >>>>> on Sun, 3 Jun 2018 17:33:18 -0400 writes: > Is it generally known/has it been previously discussed here that the > $aic() component in GLM-family objects (e.g. results of binomial(), > poisson(), etc.) does not as implemented actually return the AIC, but > rather -2*log-likelihood + 2*(model_has_scale_parameter)
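A minimal check of the quoted claim (the call mimics how glm.fit uses the family component; the argument order (y, n, mu, wt, dev) is taken from the family object itself):

y   <- c(18, 17, 15, 20, 10, 20, 25, 13, 12)
fit <- glm(y ~ 1, family = poisson)
mu  <- fitted(fit)
poisson()$aic(y, rep(1, 9), mu, rep(1, 9), deviance(fit))  ## -2 * log-likelihood
-2 * as.numeric(logLik(fit))
AIC(fit)   ## the full AIC adds the 2 * npar penalty on top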
2019 Dec 27
1
"simulate" does not include variability in parameter estimation
On 2019-12-27 04:34, Duncan Murdoch wrote: > On 26/12/2019 11:14 p.m., Spencer Graves wrote: >> Hello, All: >> >> >> The default "simulate" method for lm and glm seems to ignore the >> sampling variance of the parameter estimates; see the trivial lm and >> glm examples below. Both these examples estimate a mean with formula = >>
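One way to add the missing parameter uncertainty by hand, as a rough sketch (not code from the thread): redraw the coefficient from its estimated sampling distribution before simulating each response vector (sigma is still treated as known here).

x0  <- c(-1, 1)
fit <- lm(x0 ~ 1)
se  <- sqrt(vcov(fit)[1, 1]); s <- summary(fit)$sigma
set.seed(1)
sims <- replicate(10000, {
  b <- rnorm(1, coef(fit), se)       ## draw the mean parameter
  mean(rnorm(length(x0), b, s))      ## then simulate responses given that draw
})
var(sims)   ## roughly var(x0)/2 + se^2 = 2, i.e. twice the default simulate()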
2006 Apr 10
1
Generic code for simulating from a distribution.
Hello all, I have the code below to simulate samples of a certain size from a particular distribution (here, the beta distribution) and compute some statistics for the samples. betasim2<-function(nsim,n,alpha,beta) { sim<-matrix(rbeta(nsim*n,alpha,beta),ncol=n) xmean<-apply(sim,1,mean) xvar<-apply(sim,1,var) xmedian<-apply(sim,1,median)
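A sketch of how the truncated function above might be completed (the returned data frame is an assumption):

betasim2 <- function(nsim, n, alpha, beta) {
  sim <- matrix(rbeta(nsim * n, alpha, beta), ncol = n)
  data.frame(mean   = apply(sim, 1, mean),
             var    = apply(sim, 1, var),
             median = apply(sim, 1, median))
}
head(betasim2(nsim = 1000, n = 25, alpha = 2, beta = 5))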
2007 Oct 03
2
Speeding up simulation of mean nearest neighbor distances
I've written the function below to simulate the mean 1st through nth nearest neighbor distances for a random spatial pattern using the functions nndist() and runifpoint() from spatsat. It works, but runs relatively slowly - would appreciate suggestions on how to speed up this function. Thanks. --Dale library(spatstat) sim.nth.mdist <- function(nth,nsim) { D <- matrix(ncol=nth,
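One possible speed-up, sketched under the assumption that the truncated function loops over neighbour ranks (the number of points n is also an assumption): nndist() accepts a vector k, so each simulated pattern needs only one call.

library(spatstat)
sim.nth.mdist <- function(nth, nsim, n = 100) {
  sims <- replicate(nsim, colMeans(nndist(runifpoint(n), k = 1:nth)))
  rowMeans(sims)   ## mean 1st..nth nearest-neighbour distance across simulations
}
sim.nth.mdist(nth = 5, nsim = 50)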
2019 Dec 27
2
"simulate" does not include variability in parameter estimation
Hello, All: The default "simulate" method for lm and glm seems to ignore the sampling variance of the parameter estimates; see the trivial lm and glm examples below. Both these examples estimate a mean with formula = x~1. In both cases, the variance of the estimated mean is 1. * In the lm example with x0 = c(-1, 1), var(x0) = 2, and
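A minimal sketch of the behaviour being described, using the x0 from the post: simulate() draws new responses conditional on the fitted coefficients, so the simulated means vary only by var(x0)/n.

x0  <- c(-1, 1)
fit <- lm(x0 ~ 1)
set.seed(1)
sims <- simulate(fit, nsim = 10000)
var(colMeans(sims))   ## close to var(x0)/2 = 1; no extra term for estimating the mean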
2001 Dec 19
1
Pearson residuals in quasi family
Hi all, This is a very silly question or something escapes me: Let obj be a simple gam Poisson model. Let >obj<-gam(....,family=poisson) >obj1<-update(obj, family=quasi(link="log", var="mu")) From summary.glm(obj1) the dispersion parameter is estimated as 1.165; In fact it is: > (predict(obj1, se.fit=T)$se.fit[1:5]/predict(obj, se.fit=T)$se.fit[1:5])^2 4
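A minimal sketch of the same comparison with simulated data (quasipoisson standing in for quasi(link = "log", var = "mu"), and glm rather than gam, both assumptions):

set.seed(1)
d <- data.frame(x = rnorm(100)); d$y <- rpois(100, exp(1 + 0.5 * d$x))
obj  <- glm(y ~ x, family = poisson, data = d)
obj1 <- glm(y ~ x, family = quasipoisson, data = d)
summary(obj1)$dispersion
(predict(obj1, se.fit = TRUE)$se.fit[1:5] /
 predict(obj,  se.fit = TRUE)$se.fit[1:5])^2   ## each ratio equals the dispersion estimate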
2006 Oct 08
1
Simulate p-value in lme4
Dear r-helpers, Spencer Graves and Manual Morales proposed the following methods to simulate p-values in lme4: ************preliminary************ require(lme4) require(MASS) summary(glm(y ~ lbase*trt + lage + V4, family = poisson, data = epil), cor = FALSE) epil2 <- epil[epil$period == 1, ] epil2["period"] <- rep(0, 59); epil2["y"] <- epil2["base"]
2023 Oct 31
1
weights vs. offset (negative binomial regression)
[Please keep r-help in the cc: list] I don't quite know how to interpret the difference between specifying effort as an offset vs. as weights; I would have to spend more time thinking about it/working through it than I have available at the moment. I don't know that specifying effort as weights is *wrong*, but I don't know that it's right or what it is doing: if I were
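A minimal sketch of the two specifications being contrasted (simulated data; the variable names are assumptions, not the poster's):

library(MASS)
set.seed(1)
d <- data.frame(effort = runif(200, 1, 10), x = rnorm(200))
d$count <- rnbinom(200, mu = d$effort * exp(0.3 + 0.5 * d$x), size = 2)
f.off <- glm.nb(count ~ x + offset(log(effort)), data = d)  ## models a rate per unit effort
f.wt  <- glm.nb(count ~ x, weights = effort, data = d)      ## reweights the log-likelihood instead
cbind(offset = coef(f.off), weights = coef(f.wt))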
2012 Sep 25
1
appropriate test in glm when the family is Gamma
Dear R users, Which test is most appropriate in glm when the family is Gamma? In the help page of anova.glm, I found the following: "For models with known dispersion (e.g., binomial and Poisson fits) the chi-squared test is most appropriate, and for those with dispersion estimated by moments (e.g., gaussian, quasibinomial and quasipoisson fits) the F test is most appropriate." My questions:
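A minimal sketch of the help-page advice in practice (mtcars used purely as an illustration):

fit <- glm(mpg ~ wt + hp, data = mtcars, family = Gamma(link = "log"))
anova(fit, test = "F")      ## dispersion estimated by moments, so F is the advised test
anova(fit, test = "Chisq")  ## the chi-squared alternative, for comparison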
2006 Aug 17
1
Simulate p-value in lme4
Dear list, This is more of a stats question than an R question per se. First, I realize there has been a lot of discussion about the problems with estimating P-values from F-ratios for mixed-effects models in lme4. Using mcmcsamp() seems like a great alternative for evaluating the significance of individual coefficients, but not for groups of coefficients as might occur in an experimental design
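For groups of coefficients, one commonly suggested alternative is a parametric-bootstrap likelihood-ratio test; a rough sketch with the current lme4 interface (simulated data; not code from the thread):

library(lme4)
set.seed(1)
d <- data.frame(g = factor(rep(1:20, each = 10)), x = rnorm(200))
d$y <- rpois(200, exp(0.5 + 0.2 * d$x + rep(rnorm(20, sd = 0.3), each = 10)))
fit1 <- glmer(y ~ x + (1 | g), data = d, family = poisson)
fit0 <- update(fit1, . ~ . - x)
obs  <- 2 * as.numeric(logLik(fit1) - logLik(fit0))
stat <- sapply(simulate(fit0, nsim = 200), function(ynew)
  2 * as.numeric(logLik(refit(fit1, ynew)) - logLik(refit(fit0, ynew))))
mean(stat >= obs)   ## simulated p-value for dropping x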
2012 Nov 23
1
Spatstat: Mark correlation function
I normally use the following code to create a figure displaying the mark correlation function for the point pattern process "A": M<-markcorr(A) plot(M) I have now started to use the following code to perform 1000 Monte Carlo simulations of Complete Spatial Randomness (CSR). It is a Monte Carlo test based on envelopes of the Mark correlation function obtained from simulated point
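A minimal sketch with spatstat's envelope() (the built-in marked pattern spruces stands in for A, and random labelling is used as the null model; both are assumptions):

library(spatstat)
A <- spruces   ## a marked point pattern shipped with spatstat
E <- envelope(A, markcorr, nsim = 99, simulate = expression(rlabel(A)))
plot(E)        ## observed mark correlation with pointwise simulation envelopes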
2006 Jul 13
1
TR: Latent Class Analysis
_____ From: Pousset [mailto:maud.pousset@noos.fr] Sent: Tuesday, 4 July 2006 18:38 To: 'r-help@stat.math.ethz.ch' Subject: Latent Class Analysis Hello everybody, I am working on latent class analysis and have already used the R function "lca" (in the e1071 package). I've got interesting results but I can't simply find out the methodology used by this routine: 1) What