similar to: Want to fit random intercept in logistic regression (testing lmer and glmmML)

Displaying 20 results from an estimated 100 matches similar to: "Want to fit random intercept in logistic regression (testing lmer and glmmML)"

2004 Jun 14
1
glmmML package
I'm trying to use the glmmML package on a Windows machine. When I try to install the package, I get the message: > {pkg <- select.list(sort(.packages(all.available = TRUE))) + if(nchar(pkg)) library(pkg, character.only=TRUE)} Error in dyn.load(x, as.logical(local), as.logical(now)) : unable to load shared library
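Not part of the thread: the usual first remedy when a CRAN package's shared library fails in dyn.load() is to reinstall a binary built for the running version of R. A minimal sketch:

    ## Reinstall and reload; a version mismatch between the binary and the
    ## running R is the most common cause of "unable to load shared library".
    install.packages("glmmML")
    library(glmmML)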
2006 Aug 21
1
New version of glmmML
A new version, 0.65-1, of glmmML is now on CRAN. It is a major rewrite of the inner structures, so frequent updates (bug fixes) may be expected for some time. News: * The Laplace and adaptive Gauss-Hermite approximations to the log likelihood function are fully implemented. The Laplace method is made the default. It should give results you can compare to the results from 'lmer' (for the
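As an illustration of the comparison the announcement mentions, here is a hedged sketch fitting the same random-intercept logistic model with glmmML's Laplace method and with lme4; the data frame 'dat' and its columns y, x and id are made up, and glmer() is the current lme4 entry point (lmer() took a family argument at the time of the post).

    library(glmmML)
    library(lme4)
    ## Both fits use a Laplace approximation by default, so the fixed-effect
    ## estimates should be directly comparable.
    fit.ml   <- glmmML(y ~ x, family = binomial, data = dat, cluster = id,
                       method = "Laplace")
    fit.lme4 <- glmer(y ~ x + (1 | id), family = binomial, data = dat)
    cbind(glmmML = coef(fit.ml), lme4 = fixef(fit.lme4))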
2010 Jan 23
1
(nlme, lme, glmmML, or glmmPQL)mixed effect models with large spatial data sets
Hi, I have a spatial data set with many observations (~50,000) and would like to keep as much data as possible. There is spatial dependence, so I am attempting a mixed model in R with a spherical variogram defining the correlation as a function of distance between points. I have tried nlme, lme, glmmML, and glmmPQL. In all cases the matrix needed (seems to be (N^2)/2 - N) is too large for my
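For orientation, a hedged sketch of the kind of call described above, with a spherical correlation structure on the coordinates inside an lme() fit; all names (response, elevation, x, y, region) are invented, and with ~50,000 points the within-group correlation matrix is exactly the object that becomes too large.

    library(nlme)
    ## Spherical spatial correlation, computed within each level of 'region';
    ## keeping groups small is one way to keep the correlation matrix tractable.
    fit <- lme(response ~ elevation, random = ~ 1 | region, data = dat,
               correlation = corSpher(form = ~ x + y | region, nugget = TRUE))
    summary(fit)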
2006 Jul 12
0
glmmML updated
I have uploaded a new version (0.30-2) of glmmML to CRAN today. This is a rather extensive upgrade, mostly internal. Adaptive Gauss-Hermite quadrature (GHQ) is now used for the evaluation of the integrals in the log likelihood function. The user can choose the number of points (the default is 16); I _think_ that choosing 1 point will result in a Laplace approximation. The integrals in the score and
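A hedged sketch of checking sensitivity to the number of quadrature points, using the 'method' and 'n.points' argument names documented in later glmmML releases; 'dat', y, x and id are made up.

    library(glmmML)
    ## Estimated random-effects standard deviation as the number of
    ## Gauss-Hermite points grows; 1 point should mimic a Laplace fit.
    sapply(c(1, 4, 8, 16), function(k)
        glmmML(y ~ x, family = binomial, data = dat, cluster = id,
               method = "ghq", n.points = k)$sigma)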
2007 Aug 07
0
help on glmmML
Hello! I am using glmmML for a logistic regression with a random effect. I use the posterior.mode as an estimate for the random effects. These can be very different from the estimates obtained using SAS NLMIXED in the RANDOM statement with the out= option (all the fixed effects and the standard error of the random effect are almost identical). Can someone explain to me why that is? The code I use: R:
2006 Jun 28
0
New version of glmmML (p-values!)
A new version of 'glmmML' (0.28-4) has been uploaded to CRAN. The most important new feature is the possibility to get a p-value for the test of the hypothesis that the variance of the random effects is zero, which has been on the wishlist of many R users these days! Note two things: (i) glmmML only treats random intercepts for binomial and poisson models, (ii) the p-value is calculated through bootstrapping
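A hedged sketch of requesting that bootstrap test, assuming the number of replicates is passed through glmmML's 'boot' argument as in the package documentation; the data are invented.

    library(glmmML)
    ## With boot > 0 the summary includes a bootstrap p-value for the
    ## hypothesis that the random-intercept variance is zero.
    fit <- glmmML(y ~ x, family = binomial, data = dat, cluster = id, boot = 500)
    summary(fit)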
2011 Jun 22
2
error using glmmML()
Dear all, This question is basic but I am stumped. After running the code below, I receive the message: "non-integer #successes in a binomial glm!" model1 <- glmmML(y~Brood.Size*Density+Date.Placed+Species+Placed.Emerging+Year+rate.of.parperplot, data = data, cluster= data$Patch, family=binomial(link="logit")) My response variable is sex ratio, and I have learned quickly not
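The warning arises because a binomial model expects 0/1 outcomes or counts, not a proportion on its own. A hedged sketch of the standard fix, using invented count columns (n.female, n.male) and assuming glmmML accepts the same two-column binomial response as glm():

    ## Model the underlying counts rather than the sex ratio itself; the
    ## count columns here are placeholders, not the poster's variables.
    model1 <- glmmML(cbind(n.female, n.male) ~ Brood.Size * Density + Date.Placed +
                       Species + Placed.Emerging + Year + rate.of.parperplot,
                     data = data, cluster = data$Patch,
                     family = binomial(link = "logit"))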
2007 Aug 12
0
question on glmmML compared to NLMIXED
Hello! Can anyone help me? I am using the posterior.mode from the result of glmmML. It appears to be different from the BLUE estimate from the RANDOM statement in PROC NLMIXED in SAS. Why is that? Thank you Ronen
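A hedged sketch of where those values live in a fitted glmmML object, assuming the returned component is named posterior.modes as in the package's documented value; NLMIXED's OUT= data set contains empirical Bayes predictions, which need not coincide with posterior modes.

    library(glmmML)
    fit <- glmmML(y ~ x, family = binomial, data = dat, cluster = id)
    fit$posterior.modes    # one value per cluster, on the scale of the random intercept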
2006 Aug 21
0
R-packages posting guide (was: Re: [R-pkgs] New version of glmmML)
Maybe an R-packages posting guide with an example and an automatic append of a one or two line summary at the end of each article posted - as already done on r-help. On 8/21/06, Martin Maechler <maechler at stat.math.ethz.ch> wrote: > Hi Göran, > > >>>>> "GB" == Göran Broström <goran.brostrom at gmail.com> > >>>>> on Mon, 21 Aug
2004 Nov 01
1
GLMM
Hello, I have a problem concerning estimation of GLMM. I used methods from 3 different packages (see program). I would expect similar results for glmm and glmmML. The results differ in the estimated standard errors, however. I compared the results to MASS, 4th ed., p. 297. The results from glmmML resemble the given result for 'Numerical integration', but the glmm output differs. For the
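A hedged sketch of the kind of side-by-side fit the post describes, comparing maximum likelihood (glmmML) with penalized quasi-likelihood (glmmPQL from MASS) on the same clustered binary data; all object names are illustrative.

    library(glmmML)
    library(MASS)
    fit.ml  <- glmmML(y ~ x, family = binomial, data = dat, cluster = id)
    fit.pql <- glmmPQL(y ~ x, random = ~ 1 | id, family = binomial, data = dat)
    ## PQL and ML rely on different approximations, so some disagreement in
    ## the standard errors (the point of the post) is expected.
    summary(fit.ml)
    summary(fit.pql)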
2008 Dec 25
0
Class and object problem
Odette Gaston <odette.gaston <at> gmail.com> writes: > > Dear all, > > I have a problem with accessing class attributes. > I was unable to solve this > yet, but someone may know how to solve it. My best guess at your immediate problem (doing things by hand) is that you're not using the whole vector. From your example: Delta <- c(m1 = 0, m2 = 1.8, m3 =
2004 Apr 27
3
se.fit in predict.glm
Hi Folks, I'm seeking confirmation of something which is probably true but which I have not managed to find in the documentation. I have a binary response y={0,1} and a variable x and have fitted a probit response to the data with f <- glm( y~x, family=binomial(link=probit) ) and then, with a specified set of x-values X I have used the predict.glm function as p <- predict( f, X,
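For reference, a hedged sketch of the pattern being asked about: request standard errors on the link scale and transform through pnorm(); X stands for the poster's new-data frame and must contain a column named x.

    f <- glm(y ~ x, family = binomial(link = probit), data = dat)
    p <- predict(f, newdata = X, type = "link", se.fit = TRUE)
    ## p$se.fit is on the linear predictor (probit) scale; transform the
    ## endpoints, not the standard error itself, to get a band on probabilities.
    fitted.prob <- pnorm(p$fit)
    band <- pnorm(cbind(lower = p$fit - 1.96 * p$se.fit,
                        upper = p$fit + 1.96 * p$se.fit))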
2004 Jan 30
0
GLMM (lme4) vs. glmmPQL output (summary with lme4 revised)
This is a summary and extension of the thread "GLMM (lme4) vs. glmmPQL output" http://maths.newcastle.edu.au/~rking/R/help/04/01/0180.html In the new revision (#Version: 0.4-7) of lme4 the standard errors are close to those of the 4 other methods. Thanks to Douglas Bates and Saikat DebRoy for the revision, and to Göran Broström, who ran a simulation. In response to my first posting, Prof.
2017 Feb 09
3
Ancient C /Fortran code linpack error
In my package 'glmmML' I'm using old C code and LINPACK in the optimizing procedure. Specifically, one part of the code looks like this: F77_CALL(dpoco)(*hessian, &bdim, &bdim, &rcond, work, info); if (*info == 0){ F77_CALL(dpodi)(*hessian, &bdim, &bdim, det, &job); ........ This usually works OK, but with an ill-conditioned data
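Not the package's C: an R-level sketch of what dpoco/dpodi are doing, estimating the reciprocal condition number of the Hessian before inverting it, with a made-up positive definite matrix standing in for the real Hessian.

    ## H is a stand-in for the Hessian the C code factors.
    H  <- crossprod(matrix(rnorm(25), 5, 5)) + diag(5)
    rc <- rcond(H)                      # reciprocal condition number (dpoco's rcond)
    if (rc > .Machine$double.eps) {
        H.inv <- chol2inv(chol(H))      # inverse via the Cholesky factor (dpodi)
    } else {
        warning("Hessian numerically singular; skipping the inverse")
    }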
2006 Jan 02
2
mixed effects models - negative binomial family?
Hello all, I would like to fit a mixed effects model, but my response is of the negative binomial (or overdispersed poisson) family. The only (?) package that looks like it can do this is glmm.ADMB (but it cannot run on Mac OS X - please correct me if I am wrong!) [1] I think that glmmML {glmmML}, lmer {Matrix}, and glmmPQL {MASS} do not provide this "family" (i.e. nbinom, or
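One workaround not spelled out in the post is penalized quasi-likelihood with MASS's negative.binomial() family at a fixed overdispersion parameter; a hedged sketch with invented names, and theta = 2 chosen arbitrarily (this is not full ML for the negative binomial).

    library(MASS)
    ## PQL fit with a fixed theta; all data and variable names are placeholders.
    fit <- glmmPQL(count ~ treatment, random = ~ 1 | site,
                   family = negative.binomial(theta = 2), data = dat)
    summary(fit)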
2007 May 08
3
ordered logistic regression with random effects. Howto?
I'd like to estimate an ordinal logistic regression with a random effect for a grouping variable. I do not find a pre-packaged algorithm for this. I've found methods glmmML (package: glmmML) and lmer (package: lme4) both work fine with dichotomous dependent variables. I'd like a model similar to polr (package: MASS) or lrm (package: Design) that allows random effects. I was
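For readers landing on this thread later: a hedged sketch with clmm() from the ordinal package, which is not mentioned in the thread and was not available when it was written; the wine data ship with that package.

    library(ordinal)
    data(wine)
    ## Cumulative-link (ordered logit) mixed model with a random intercept
    ## per judge; roughly the analogue of polr() plus a random effect.
    fit <- clmm(rating ~ temp + contact + (1 | judge), data = wine)
    summary(fit)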
2005 Dec 15
1
generalized linear mixed model by ML
Dear All, I wonder if there is a way to fit a generalized linear mixed model (for repeated binomial data) via a direct maximum likelihood approach. The "glmm" in the "repeated" package (Lindsey), the "glmmPQL" in the "MASS" package (Ripley) and "GLMMGibbs" (Myles and Clayton) do not use the full maximum likelihood, as I understand it. The
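A hedged sketch of a direct maximum-likelihood fit for repeated binomial data via adaptive Gauss-Hermite quadrature in current lme4; the data frame and its columns are invented, and nAGQ sets the number of quadrature points (valid here because there is a single scalar random effect).

    library(lme4)
    ## Full ML with 8 adaptive quadrature points, as opposed to the PQL and
    ## Gibbs approaches listed in the post.
    fit <- glmer(resp ~ time + treat + (1 | subject), family = binomial,
                 data = dat, nAGQ = 8)
    summary(fit)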