2005 Jul 14
0
Pearson dispersion statistic
Thank you for your reply.
I am aware of the good reasons not to use the deviance estimate in
binomial, Poisson, and gamma families.
However, for the inverse Gaussian, the choice seems to me less clear
cut. So I just wanted to compare two different options.
I have used the dispersion parameter to compute the standardized
deviance residuals:
summary(model.gamma)$deviance.resid
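A minimal sketch of the comparison, assuming model.gamma is a fitted glm; the hatvalues-based scaling is my own illustration, not taken from the original post:

## Pearson-based dispersion estimate: Pearson chi-square / residual df.
phi.pearson <- sum(residuals(model.gamma, type = "pearson")^2) /
  df.residual(model.gamma)
## Deviance-based estimate, for comparison.
phi.deviance <- deviance(model.gamma) / df.residual(model.gamma)
## Standardized deviance residuals: scale by sqrt(dispersion * (1 - leverage)).
h <- hatvalues(model.gamma)
r.std <- residuals(model.gamma, type = "deviance") / sqrt(phi.pearson * (1 - h))
## rstandard() applies the same scaling, using the dispersion from summary().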
2005 May 06
4
Change class factor to numeric
I am attempting to develop a multiple regression model using selected
model variables that should all be treated as numeric (mostly real)
values.
However, R automatically considers one specific variable, "mass", to be of
class "factor", probably because "mass" consists of integer values that
are repeated.
I now want to force R to treat "mass" as a numeric
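A sketch of the usual idiom, assuming the variable is stored as DATA$mass; note that as.numeric() applied directly to a factor returns the internal level codes, not the original values:

## Go via the character representation (or the levels) of the factor.
DATA$mass <- as.numeric(as.character(DATA$mass))
## equivalently, for a factor f: as.numeric(levels(f))[f]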
2005 Feb 24
2
Forward Stepwise regression based on partial F test
I am hoping to get some advice on the following:
I am looking for an automatic variable selection procedure to reduce the
number of potential predictor variables (~ 50) in a multiple regression
model.
I would be interested in using forward stepwise regression based on the
partial F test.
I have looked into possible R-functions but could not find this
particular approach.
There is a function
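A minimal sketch of one forward step with a partial F test; 'dat', 'y' and the candidate predictors are placeholders, not from the original post:

fit0 <- lm(y ~ 1, data = dat)
## Partial F test for each candidate term added on its own.
add1(fit0, scope = ~ x1 + x2 + x3, test = "F")
## step() automates the forward search, but it ranks terms by AIC, not by F:
## step(fit0, scope = ~ x1 + x2 + x3, direction = "forward")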
2009 Mar 02
2
Unrealistic dispersion parameter for quasibinomial
I am running a binomial glm with the response being the numbers of mites of
two species, y <- cbind(mitea, miteb), against two continuous variables
(temperature and predatory mites) - see below. My model shows overdispersion,
as the residual deviance is 48.81 on 5 degrees of freedom. If I use
quasibinomial to account for overdispersion, the dispersion parameter
estimate is 2501139, which seems
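A sketch of the refit and of checking the estimate by hand; 'temp' and 'pred' stand in for the two continuous predictors, whose actual names are not given in the post:

y <- cbind(mitea, miteb)
fit.qb <- glm(y ~ temp + pred, family = quasibinomial)
summary(fit.qb)$dispersion                     # Pearson chi-square / residual df
sum(residuals(fit.qb, type = "pearson")^2) / df.residual(fit.qb)
## Fitted proportions very close to 0 or 1 can blow up the Pearson statistic,
## and with it this dispersion estimate.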
2002 Aug 22
2
Calculating dispersion in glm
Hi all,
How is dispersion calculated within the glm function in R?
Cheers
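A short sketch, with 'fit' a placeholder for any fitted glm: for the binomial and poisson families summary.glm() fixes the dispersion at 1; for the quasi, Gamma and inverse.gaussian families it estimates it as the Pearson chi-square divided by the residual degrees of freedom.

sum(residuals(fit, type = "pearson")^2) / df.residual(fit)
summary(fit)$dispersion    # agrees for families with estimated dispersion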
2000 Apr 19
1
scale factors/overdispersion in GLM: possible bug?
I've been poking around with GLMs (on which I am *not* an expert) on
behalf of a student, particularly binomial (standard logit link) nested
models with overdispersion.
I have one possible bug to report (but I'm not confident enough to be
*sure* it's a bug); one comment on the general inconsistency that seems to
afflict the various functions for dealing with overdispersion in GLMs
2004 Mar 16
2
glm questions
Greetings, everybody. Can I ask some glm questions?
1. How do you find out -2*lnL(saturated model)?
In the output from glm, I find:
Null deviance: which I think is -2[lnL(null) - lnL(saturated)]
Residual deviance: -2[lnL(fitted) - lnL(saturated)]
The Null model is the one that includes the constant only (plus offset
if specified). Right?
I can use the Null and Residual deviance to
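A sketch of how the pieces fit together, with 'fit' a placeholder glm; this holds up to the family's conventions about constant terms and the dispersion:

## Residual deviance is -2*[lnL(fitted) - lnL(saturated)], so the saturated
## term can be backed out from logLik() and deviance().
ll.fitted <- as.numeric(logLik(fit))
minus2.ll.saturated <- -2 * ll.fitted - deviance(fit)
fit$null.deviance    # null deviance, constant-only model (plus any offset)
deviance(fit)        # residual deviance of the fitted model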
1998 Feb 03
2
glm(.) / summary.glm(.); [over]dispersion and returning AIC..
I have been implementing a proposal of Jim Lindsey for glm(.)
to return AIC values, and for
print.glm(.) and print.summary.glm(.) to print them.
however:
>>>>> "Jim" == Jim Lindsey <jlindsey@luc.ac.be> writes:
Jim> The problem still remains of getting the correct AIC when the user
Jim> wants the scale parameter to be fixed. (The calculation should
2005 Apr 04
1
Object item extraction
Hello
I am able to extract partial regression coefficients from a fitted model
object "model", i.e.
model <- lm(var.sel.gkm, weights = count.gkm, data = DATA)
summary(model)
write.table(model$coef, file = "C:/coef_CO_gkm.txt", row.names = TRUE,
col.names = TRUE)
I was wondering if anyone could advise me how to extract other object
items such as std. error, t-values
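A sketch of the usual route: the full coefficient table (estimate, std. error, t value, p value) is a matrix inside the summary object and can be written out just like the coefficients; the file path below simply reuses the one from the post.

ctab <- coef(summary(model))          # or summary(model)$coefficients
write.table(ctab, file = "C:/coef_CO_gkm.txt",
            row.names = TRUE, col.names = TRUE)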
2010 Aug 20
3
Deviance Residuals
Dear all,
I am running a logistic regression and this is the output:
glm(formula = educationUniv ~ brncntr, family = binomial)
Deviance Residuals:
Min 1Q Median 3Q Max
-0.8825 -0.7684 -0.7684 1.5044 1.6516
Coefficients:
Estimate Std. Error z value Pr(>|z|)
(Intercept) -1.06869 0.01155 -92.487 <2e-16 ***
brncntrNo
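A sketch showing that the printed block is just the five-number summary of the deviance residuals, using the model from the post:

fit <- glm(educationUniv ~ brncntr, family = binomial)
dres <- residuals(fit, type = "deviance")
quantile(dres)    # reproduces the Min / 1Q / Median / 3Q / Max line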
2005 Jun 16
1
mu^2(1-mu)^2 variance function for GLM
Dear list,
I'm trying to mimic the analysis of Wedderburn (1974) as cited by
McCullagh and Nelder (1989) on p.328-332. This is the leaf-blotch on
barley example, and the data is available in the `faraway' package.
Wedderburn suggested using the variance function mu^2(1-mu)^2. This
variance function isn't readily available in R's `quasi' family object,
but it seems to me
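One possible route, as a sketch: the 'gnm' package provides a wedderburn family implementing V(mu) = mu^2 * (1 - mu)^2 with a logit link; the data set and variable names below (leafblotch in 'faraway', with blotch, site, variety) are assumptions on my part, not taken from the post.

library(gnm)
data(leafblotch, package = "faraway")
fit <- glm(blotch ~ site + variety, family = wedderburn, data = leafblotch)
summary(fit)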
2008 Apr 02
2
Overdispersion in count data
Hi all,
I have count data (number of flowering individuals plus total number of
individuals) across 24 sites and 3 treatments (time since last burn).
Following recommendations in the R Book, I used a glm with the model y~
burn, with y being two columns (flowering, not flowering) and burn the time
(category) since burn. However, the residual deviance is roughly 10 times
the number of degrees of
2005 Mar 02
1
Leaps & regsubsets
Hello
I am trying to use all subsets regression on a test dataset consisting
of 11 trails and 46 potential predictor variables.
I would like to use Mallows' Cp as a selection criterion.
The leaps function would provide the required output but does not work
with this many variables (see below).
The alternative function regsubsets should be used, but I am not able to
define the function in
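A sketch; 'dat' (the response y plus the 46 candidate predictors) is a placeholder. regsubsets() refuses very large problems unless told otherwise, so raise nvmax from its default of 8 and, if it complains about the problem size, set really.big = TRUE:

library(leaps)
sel <- regsubsets(y ~ ., data = dat, nvmax = 15, really.big = TRUE)
summary(sel)$cp    # Mallows' Cp for the best model of each size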
2006 Feb 24
1
Extracting information from factanal()
Dear list members,
I apologize for putting this (probably) very basic question on the
mailing list. I have scanned through the R website (using search) but
did not find an answer.
(code included below)
The factor loading matrix is easily extracted with FACT$loadings[1:6,]
(and can then be exported using write.table).
I would also like to specifically extract and export
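A sketch; FACT is the factanal() object from the post, and the output file name is only an example:

L <- unclass(FACT$loadings)     # plain numeric matrix, without print formatting
write.table(L, file = "loadings.txt", row.names = TRUE, col.names = TRUE)
FACT$uniquenesses               # other components are extracted the same way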
2003 Feb 18
4
glm and overdispersion
Hi,
I am fitting a glm with the binomial family and my data show slight
overdispersion (HF < 1.5). Nevertheless, to take this heterogeneity into
account, weak though it is, I use an F-test rather than a Chi-square test
(Krackow & Tkadlec, 2001). Surprisingly, the outputs of these two tests
are exactly the same. What is the reason, and how can I scale the output
by the overdispersion?
Thank you,
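A sketch, with placeholder data and model: with family = binomial the dispersion is fixed at 1, so the "F" and "Chisq" tests are not scaled for overdispersion; refitting with quasibinomial makes anova() divide by the estimated dispersion in the F test.

fit  <- glm(cbind(success, failure) ~ treatment, family = binomial,      data = dat)
fitq <- glm(cbind(success, failure) ~ treatment, family = quasibinomial, data = dat)
anova(fit,  test = "Chisq")
anova(fitq, test = "F")    # scaled by the Pearson-based dispersion estimate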
2005 May 11
2
Regsubsets()
Dear List members
I am using the regsubsets function to select a few predictor variables
using Mallows' Cp:
> sel.proc.regsub.full <- regsubsets(CO2 ~ v + log(v) + v.max + sd.v +
tad + no.stops.km + av.stop.T + a + sd.a + a.max + d + sd.d + d.max +
RPA + P + perc.stop.T + perc.a.T + perc.d.T + RPS + RPSS + sd.P.acc +
P.dec + da.acc.1 + RMSACC + RDI + RPSI + P.acc + cov.v + cov.a +
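A sketch of extracting the Cp values from the fitted object above and reading off the Cp-best subset:

s <- summary(sel.proc.regsub.full)
s$cp                          # Mallows' Cp for the best model of each size
s$which[which.min(s$cp), ]    # TRUE/FALSE: variables in the Cp-best model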
2005 May 06
3
conversion factor into numeric
Thank you all for your (fast) comments.
Unfortunately I could not make the advice work:
> mass
[1] 800 800 800 800 800 800 800 800 800 800 800 800 800 800 800 800 800 800 910 910 910 910 910 910 910
[26] 910 910 910 910 910 910 910 910 910 910 910 910 910 910 1,020 1,020 1,020 1,020 1,020 1,020 1,020 1,020 1,020
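A sketch of a likely fix: values such as "1,020" contain thousands separators, which is why the column was read as a factor; strip the commas before converting.

mass.num <- as.numeric(gsub(",", "", as.character(mass)))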
2009 Feb 16
1
Overdispersion with binomial distribution
I am attempting to run a glm with a binomial model to analyze proportion
data.
I have been following Crawley's book closely and am wondering whether there
is an accepted standard for how much overdispersion is too much (e.g. a
change in AIC of 2 is a commonly accepted threshold).
In the example, he fits several models, binomial and quasibinomial, and then
accepts the quasibinomial.
The output for residual
2004 Jul 20
5
Precision in R
Greetings.
I'm trying to recreate in R some regression models I've done in SAS,
but I'm not getting the same results. My advisor suspects this may be
due to differences in precision between R and SAS. Does anyone know
where I can find specifications for R's type double? (It doesn't seem
to be in the R Language Definition.) Thanks in advance for any help
anyone can
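A short sketch: R's numeric type is the platform's C double (IEEE 754 on common platforms), and its characteristics are recorded in .Machine.

.Machine$double.eps       # machine epsilon, about 2.22e-16
.Machine$double.digits    # 53 significand bits
.Machine$double.xmax      # largest finite double, about 1.8e308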
2007 Mar 19
1
likelihoods in SAS GENMOD vs R glm
List: I'm helping a colleague with some Poisson regression modeling. He
uses SAS proc GENMOD and I'm using glm() in R. Note on the SAS and R
output below that our estimates, standard errors, and deviances are
identical but what we get for likelihoods differs considerably. I'm
assuming that these must differ just by some constant but it would be nice
to have some confirmation
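A sketch of checking the constant; 'fit' and 'y' are placeholders for the Poisson glm and its response. R's logLik() includes the -log(y!) term, which some packages report only up to this additive, model-free constant.

logLik(fit)
sum(dpois(y, fitted(fit), log = TRUE))                         # same value
sum(dpois(y, fitted(fit), log = TRUE)) + sum(lfactorial(y))    # kernel only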