Displaying 20 results from an estimated 1000 matches similar to: "degrees of freedom (lme4 and nlme)"
2005 Jan 03
1
different DF in package nlme and lme4
Hi all
I tried to reproduce an example with lme and used the Orthodont
dataset.
library(nlme)
fm2a.1 <- lme(distance ~ age + Sex, data = Orthodont, random = ~ 1 | Subject)
anova(fm2a.1)
>             numDF denDF  F-value p-value
> (Intercept)     1    80 4123.156  <.0001
> age             1    80  114.838  <.0001
> Sex             1    25    9.292  0.0054
or alternatively
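The snippet breaks off before the alternative fit is shown. For reference, a minimal sketch of the lme4 side of the comparison, assuming a current lme4 (lmer() deliberately reports no denominator DF or p-values):
library(lme4)
data(Orthodont, package = "nlme")
fm2a.2 <- lmer(distance ~ age + Sex + (1 | Subject), data = Orthodont)
anova(fm2a.2)    # F statistics only; lme4 does not compute denominator DF
summary(fm2a.2)  # t values without p-values, for the same reason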
2007 Nov 09
1
Confidence Intervals for Random Effect BLUP's
I want to compute confidence intervals for the random effect estimates
for each subject. From checking on postings, this is what I cobbled
together using Orthodont data.frame as an example. There was some
discussion of how to properly access lmer slots and bVar, but I'm not
sure I understood. Is the approach shown below correct?
Rick B.
# Orthodont is from nlme (can't have both nlme and
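The old @bVar slot this approach relied on no longer exists. A rough sketch of one way to get interval estimates for the conditional modes with current lme4, assuming a normal approximation (the 1.96 multiplier is a choice, not part of any API):
library(lme4)
data(Orthodont, package = "nlme")
fm <- lmer(distance ~ age + (1 | Subject), data = Orthodont)
re <- as.data.frame(ranef(fm))          # columns condval and condsd per level
re$lower <- re$condval - 1.96 * re$condsd
re$upper <- re$condval + 1.96 * re$condsd
head(re)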
2005 Jun 08
0
bug in predict.lme?
Dear All,
I've come across a problem in predict.lme. Assigning a model formula to a variable and then using this variable in lme (instead of typing the formula into the formula part of lme) works as expected. However, when performing a predict on the fitted model I get an error message - predict.lme (but not predict.lm) seems to expect a 'properly' typed-in formula and cannot extract
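A sketch of the situation being described (whether predict() actually fails will depend on the nlme version; 'form' is just an illustrative name):
library(nlme)
form <- distance ~ age + Sex                                # formula kept in a variable
fit  <- lme(form, data = Orthodont, random = ~ 1 | Subject) # fitting works
predict(fit, newdata = Orthodont[1:4, ])                    # the step reported to fail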
2006 Mar 21
1
Scaling behavior of bVar from lmer models
Hi all,
To follow up on an older thread, it was suggested that the following
would produce confidence intervals for the estimated BLUPs from a linear
mixed effect model:
OrthoFem<-Orthodont[Orthodont$Sex=="Female",]
fm1OrthF. <- lmer(distance~age+(age|Subject), data=OrthoFem)
fm1.s <- coef(fm1OrthF.)$Subject
fm1.s.var <- fm1OrthF.@bVar$Subject
fm1.s0.s <-
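The snippet breaks off mid-assignment. For reference, a rough sketch of how the conditional (co)variances of the BLUPs can be extracted with current lme4 instead of the old @bVar slot (the attribute name may be "postVar" or "condVar" depending on the lme4 version):
library(lme4)
data(Orthodont, package = "nlme")
OrthoFem <- Orthodont[Orthodont$Sex == "Female", ]
fm1OrthF <- lmer(distance ~ age + (age | Subject), data = OrthoFem)
re <- ranef(fm1OrthF, condVar = TRUE)
pv <- attr(re$Subject, "postVar")   # 2 x 2 x n array: one covariance matrix per subject
sqrt(pv[1, 1, ])                    # conditional SDs of the subject-level intercepts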
2007 Jun 25
1
degrees of freedom in lme
Dear all,
I am starting to use the lme package (and plan to teach a course based on it
next semester...). To understand what lme is doing precisely, I used balanced
datasets described in Pinheiro and Bates and tried to compare the lme outputs
to that of aov. Here is what I obtained:
> data(Machines)
> summary(aov(score~Machine+Error(Worker/Machine),data=Machines))
Error: Worker
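The aov() output is cut off at the Worker stratum. A sketch of the lme fit this is typically compared against, following the Pinheiro and Bates example:
library(nlme)
data(Machines, package = "nlme")
fm <- lme(score ~ Machine, data = Machines, random = ~ 1 | Worker/Machine)
anova(fm)   # compare numDF/denDF with the error strata from aov() above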
2010 Oct 18
1
Question about lme (mixed effects regression)
Hello!
If I run this example:
library(nlme)
fm1 <- lme(distance ~ age+Sex, Orthodont, random = ~ age + Sex| Subject)
If I run:
summary(fm1)
then I can see the fixed effects for age and sex (17.7 for intercept,
0.66 for age, and -1.66 for SexFemale)
If I run:
ranef(fm1)
Then it looks like it's producing the random effects for each subgroup
(in this example - each subject). For example,
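Assuming the underlying question is how these pieces fit together: ranef() gives each subject's deviation from the fixed effects, and coef() gives the combined, subject-specific coefficients. A quick check:
fixef(fm1)         # population-level (fixed) effects
head(ranef(fm1))   # per-subject deviations from the fixed effects
head(coef(fm1))    # approximately fixef(fm1) + ranef(fm1), row by row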
2004 Apr 17
0
nlme - sum of squares - permutation test
Hi,
1/ I wonder why an anova.lme call on a single lme object does not print the sums of squares (as expected from the help: "a data frame with the sums of squares, numerator degrees of freedom, denominator
degrees of freedom, F-values, and P-values").
Example:
> fm2 <- lme(distance ~ age + Sex, data = Orthodont, random = ~ 1)
> anova(fm2)
numDF denDF F-value p-value
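A quick way to confirm which columns the single-model anova.lme() table actually contains (Orthodont is a groupedData object, so random = ~ 1 picks up Subject as the grouping factor):
fm2 <- lme(distance ~ age + Sex, data = Orthodont, random = ~ 1)
a <- anova(fm2)
names(a)    # "numDF" "denDF" "F-value" "p-value": no sums of squares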
2005 Nov 25
0
multiple imputation of anova tables
Dear list members,
how can multiple imputation be realized for anova tables in R? Concretely,
how can one combine
F-values and R^2 / adjusted R^2 from multiple imputations in R?
Of course, the point estimates can be averaged, but how does one get
standard errors for F-values/R^2 etc. in R?
For linear models, lm.mids() works well, but according to Rubin's rules,
standard errors have to be used together with
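A sketch of how current versions of the mice package handle this, using its built-in nhanes data (D1() and pool.r.squared() are the relevant helpers here, not the lm.mids() route from the original post):
library(mice)
imp  <- mice(nhanes, m = 5, printFlag = FALSE)
fit1 <- with(imp, lm(bmi ~ age + hyp))
fit0 <- with(imp, lm(bmi ~ 1))
summary(pool(fit1))     # coefficients combined by Rubin's rules
pool.r.squared(fit1)    # pooled R^2 via a Fisher z transformation
D1(fit1, fit0)          # pooled multi-parameter (Wald/F-type) comparison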
2019 Jan 17
3
long-standing documentation bug in ?anova.lme
tl;dr anova.lme() claims to provide sums of squares, but it doesn't. And
some names are misspelled in ?lme. I can submit all this stuff as a bug
report if that's preferred.
?anova.lme says:
When only one fitted model object is present, a data frame with
the sums of squares, numerator degrees of freedom, denominator
degrees of freedom, F-values, and P-values
The output of
fm1
2007 Jan 25
1
summary of the effects after logistic regression model
Dear all, my aim is to estimate the efficacy over time of a treatment for
headache prevention. Data consist of long sequences of repeated binary
outcomes (1 if the subject has at least 1 episode of headache, 0
otherwise) on subjects randomized to placebo or treatment.
I have fit a logistic regression model with Huber-White cluster sandwich
covariance estimator.
I have put in the model the
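A sketch of one common way to fit such a model in R, with hypothetical names (data frame dat, outcome headache, treatment indicator treat, visit number visit, subject identifier id); the sandwich/lmtest pair supplies the cluster-robust covariance:
library(sandwich)
library(lmtest)
fit <- glm(headache ~ treat * visit, family = binomial, data = dat)
coeftest(fit, vcov = vcovCL(fit, cluster = ~ id))   # Huber-White SEs clustered by subject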
2009 Jul 12
2
Heckman Selection Model Help in R
Hi Saurav!
On Sun, Jul 12, 2009 at 6:06 PM, Pathak, Saurav <s.pathak08 at imperial.ac.uk> wrote:
> I am new to R, I have to do a 2 step Heckman model, my selection equation is
> below which I was successful in running but I am unable to proceed further,
>
>
>
> I have so far used the following command
>
> glm(formula = s ~ age + gender + gemedu + gemhinc + es_gdppc +
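The quoted command is cut off. For the two-step Heckman estimator itself, a sketch with the sampleSelection package (the outcome equation and the data frame name dat are placeholders; the selection equation reuses the variables from the glm() call above):
library(sampleSelection)
heck <- selection(
  selection = s ~ age + gender + gemedu + gemhinc + es_gdppc,
  outcome   = y ~ age + gender + gemedu,   # placeholder outcome equation
  data = dat, method = "2step")
summary(heck)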
2005 Dec 22
2
bVar slot of lmer objects and standard errors
Hello,
I am looking for a way to obtain standard errors for empirical Bayes estimates of a model fitted with lmer (like the ones plotted on page 14 of the document available at http://www.eric.ed.gov/ERICDocs/data/ericdocs2/content_storage_01/0000000b/80/2b/b3/94.pdf). Harold Doran mentioned (http://tolstoy.newcastle.edu.au/~rking/R/help/05/08/10638.html) that the posterior modes' variances
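The bVar slot is gone from current lme4; a short sketch of one convenient route to these standard errors via the arm package (using Orthodont as an example again):
library(lme4)
library(arm)                     # provides se.ranef()
data(Orthodont, package = "nlme")
fm <- lmer(distance ~ age + (1 | Subject), data = Orthodont)
ranef(fm)$Subject                # empirical Bayes (conditional) modes
se.ranef(fm)$Subject             # their approximate standard errors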
2006 Jul 08
1
denominator degrees of freedom and F-values in nlme
Hello,
I am struggling to understand how denominator degrees of freedom and
subsequent significance testing based upon them works in nlme models.
I have a data set of 736 measurements (weight), taken within 3
different age groups, on 497 individuals who fall into two
morphological categories (horn types).
My model is: Y ~ weight + horn type / age group, random=~1|individual
I am modeling
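A sketch of the model as described, treating the poster's names as placeholders (data frame dat, response Y, and variables weight, horntype, agegroup, individual); anova() then shows where nlme's inner-outer (containment) rule puts the denominator DF:
library(nlme)
fit <- lme(Y ~ weight + horntype / agegroup, data = dat,
           random = ~ 1 | individual)
anova(fit)   # terms varying within individuals are tested against the
             # within-individual stratum; terms constant within individuals
             # against the between-individual stratum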
2019 Jan 21
0
long-standing documentation bug in ?anova.lme
>>>>> Ben Bolker
>>>>> on Thu, 17 Jan 2019 12:32:20 -0500 writes:
> tl;dr anova.lme() claims to provide sums of squares, but it doesn't. And
> some names are misspelled in ?lme. I can submit all this stuff as a bug
> report if that's preferred.
> ?anova.lme says:
> When only one fitted model object is present, a data
2003 Jun 26
3
degrees of freedom in a LME model
Dear All,
I am analysing some data for a colleague (not my data, gotta be published
so I cannot divulge).
My response variable is the number of matings observed per day for some
fruit flies.
My factors are:
Day: the observations were taken on 9 days
Regime: 3 selection regimes
Line: 3 replicates per selection regime.
I have 81 observations in total
The lines are coded A to I, so I do not need
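A sketch of one plausible specification for this design, with a hypothetical data frame flies holding matings, Day, Regime, and Line; the point is that Regime is tested against variation between lines, so its denominator DF is small:
library(nlme)
fit <- lme(matings ~ Regime + factor(Day), data = flies,
           random = ~ 1 | Line)
anova(fit)   # Regime: denDF based on the 9 lines, not the 81 observations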
2002 Apr 26
0
[Fwd: Re: degrees of freedom for t-tests in lme]
Sorry, by mistake I sent this to Professor Bates instead of r-help.
Han
-------- Original Message --------
Subject: Re: [R] degrees of freedom for t-tests in lme
Date: Thu, 25 Apr 2002 09:16:16 -0700
From: Han-Lin Lai <Han-Lin.Lai at noaa.gov>
To: Douglas Bates <bates at stat.wisc.edu>
References: <3CC6E87F.5400277D at noaa.gov>
<6rg01lottu.fsf at franz.stat.wisc.edu>
2008 Jul 18
2
column wise paste of data.frames
Hi everybody!
I'm sure that I overlook something and feel quite stupid to ask, but I
have not found an easy solution to the following problem: Take e.g. the
Orthodont data from the nlme package:
> head(Orthodont)
Grouped Data: distance ~ age | Subject
  distance age Subject  Sex
1     26.0   8     M01 Male
2     25.0  10     M01 Male
3     29.0  12     M01 Male
4     31.0  14     M01 Male
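The question is cut off here; assuming the goal is simply to combine data frames side by side, a minimal sketch using the same data:
boys  <- subset(Orthodont, Sex == "Male")[1:4, c("distance", "age")]
girls <- subset(Orthodont, Sex == "Female")[1:4, c("distance", "age")]
cbind(boys, girls)   # column-wise bind; rows are matched purely by position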
2010 Jun 22
2
xyplot: adding pooled regression lines to a paneled type="r" plot
Consider the following plot that shows separate regression lines ~ age
for each subject in the Potthoff-Roy Orthodont data,
with separate panels by Sex:
library(nlme)
#plot(Orthodont)
xyplot(distance ~ age | Sex, data = Orthodont, type = 'r', groups = Subject,
       col = gray(.50),
       main = "Individual linear regressions ~ age")
I'd like to also show in each panel the pooled OLS
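One way to do this is a custom panel function that draws the per-subject lines and then overlays a single OLS fit for the panel; a sketch (the gray level and line width are arbitrary choices):
library(nlme)      # Orthodont
library(lattice)
xyplot(distance ~ age | Sex, data = Orthodont, groups = Subject,
       panel = function(x, y, groups, subscripts, ...) {
         # individual regression lines, one per subject (as with type = 'r')
         panel.superpose(x, y, groups = groups, subscripts = subscripts,
                         type = "r", col = gray(0.5), ...)
         # pooled OLS line fitted to all points in this panel
         pooled <- coef(lm(y ~ x))
         panel.abline(a = pooled[1], b = pooled[2], lwd = 2)
       },
       main = "Individual regressions with a pooled OLS line per panel")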
1999 Jun 02
0
Sv: lme problem ?
Dear Douglas Bates. I just downloaded the compiled version (I'm a poor Windows devil, not yet having found the time to move to a more advanced platform...) from NT - the files are dated 30.5.1999, so they are not old - and the problem persisted. I wonder what I did wrong?
R : Copyright 1999, The R Development Core Team
Version 0.64.0 Patched (May 3, 1999)
R is free software and comes with
1999 Nov 27
0
lme
Doug,
I thought perhaps that you might be interested in the comparison of
lme to the results for the same models fitted by Richard Jones' carma
(I just wrote the R interface to his Fortran code). The code to run
the example from the lme help and for the equivalent with carma is in
the file below.
The two main differences in results are
1. the random coefficients covariance matrix is quite