Displaying 20 results from an estimated 1000 matches similar to: "degrees of freedom in lme"
2007 Nov 09
1
Confidence Intervals for Random Effect BLUP's
I want to compute confidence intervals for the random effect estimates
for each subject. From checking earlier postings, this is what I cobbled
together, using the Orthodont data frame as an example. There was some
discussion of how to properly access lmer slots and bVar, but I'm not
sure I understood. Is the approach shown below correct?
Rick B.
# Orthodont is from nlme (can't have both nlme and
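A minimal sketch of how one could get such intervals with a current lme4, where ranef() has a condVar argument and the old slot machinery is gone; the random-intercept model and the 1.96 cutoff are assumptions for illustration:

library(lme4)
data(Orthodont, package = "nlme")
fm <- lmer(distance ~ age + (1 | Subject), data = Orthodont)
re <- ranef(fm, condVar = TRUE)$Subject
se <- sqrt(attr(re, "postVar")[1, 1, ])   # conditional std. errors, one per subject
ci <- data.frame(est   = re[, 1],
                 lower = re[, 1] - 1.96 * se,
                 upper = re[, 1] + 1.96 * se)
head(ci)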
2006 Jun 05
1
Extracting Variance components
I can ask my question using an example from Chapter 1 of Pinheiro & Bates.
> # 1.4 An Analysis of Covariance Model
>
> OrthoFem <- Orthodont[ Orthodont$Sex == "Female", ]
> fm1OrthF <-
+ lme( distance ~ age, data = OrthoFem, random = ~ 1 | Subject )
> summary( fm1OrthF )
Linear mixed-effects model fit by REML
Data: OrthoFem
AIC BIC
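Assuming the goal is to pull the variance components out of that fit as numbers, VarCorr() is one way; for an lme object it returns a character matrix that can be coerced:

library(nlme)
OrthoFem <- Orthodont[Orthodont$Sex == "Female", ]
fm1OrthF <- lme(distance ~ age, data = OrthoFem, random = ~ 1 | Subject)
vc <- VarCorr(fm1OrthF)
vc                               # Variance and StdDev for (Intercept) and Residual
as.numeric(vc[, "Variance"])     # the variance components as plain numbers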
2005 Jul 12
1
nlme plot
Hello,
I am running this script from the Pinheiro & Bates book in R version 2.1.1 (WinXP),
but I can't plot Figure 2.3.
What's wrong?
TIA.
Rod.
---------------------------------------------------------
>library(nlme)
> names( Orthodont )
[1] "distance" "age" "Subject" "Sex"
> levels( Orthodont$Sex )
[1] "Male"
2006 Mar 21
1
Scaling behavior of bVar from lmer models
Hi all,
To follow up on an older thread, it was suggested that the following
would produce confidence intervals for the estimated BLUPs from a linear
mixed-effects model:
OrthoFem<-Orthodont[Orthodont$Sex=="Female",]
fm1OrthF. <- lmer(distance~age+(age|Subject), data=OrthoFem)
fm1.s <- coef(fm1OrthF.)$Subject
fm1.s.var <- fm1OrthF.@bVar$Subject
fm1.s0.s <-
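For what it is worth, in current lme4 the same quantities come from ranef(..., condVar = TRUE): for an intercept-and-slope model they arrive as a 2 x 2 x (number of subjects) array of conditional covariance matrices which, as far as I can tell, are already on the response scale, so no extra multiplication by the residual variance (the scaling question with the old bVar slot) is needed. A sketch:

library(lme4)
OrthoFem <- subset(nlme::Orthodont, Sex == "Female")
fm <- lmer(distance ~ age + (age | Subject), data = OrthoFem)
re <- ranef(fm, condVar = TRUE)$Subject
pv <- attr(re, "postVar")      # 2 x 2 x n array of conditional (co)variances
se.int   <- sqrt(pv[1, 1, ])   # std. errors for the intercept deviations
se.slope <- sqrt(pv[2, 2, ])   # std. errors for the age-slope deviations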
2006 Nov 28
2
Problem with pairs() in nlme
Dear r-helpers,
After successfully running
require(nlme)
vfr.lmL <- lmList(
estimate ~ (slant + respType + visField + hand)^2 | subject, vfr
)
pairs(vfr.lmL, id = 0.01, adj = -0.5) # Pinheiro & Bates (p. 141)
produces the following error:
Error in sprintf(gettext(fmt, domain = domain), ...) :
object "form" not found
Any guesses as to what I may have done wrong?
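For comparison, the corresponding call from the book's Orthodont example runs cleanly with a current nlme, which suggests the error is specific to the vfr fit or to an older nlme version; a sketch:

library(nlme)
fm2Orth.lis <- lmList(distance ~ I(age - 11) | Subject, data = Orthodont)
pairs(fm2Orth.lis, id = 0.01, adj = -0.5)   # scatter of per-subject coefficient estimates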
2005 Dec 22
2
bVar slot of lmer objects and standard errors
Hello,
I am looking for a way to obtain standard errors for empirical Bayes estimates of a model fitted with lmer (like the ones plotted on page 14 of the document available at http://www.eric.ed.gov/ERICDocs/data/ericdocs2/content_storage_01/0000000b/80/2b/b3/94.pdf). Harold Doran mentioned (http://tolstoy.newcastle.edu.au/~rking/R/help/05/08/10638.html) that the posterior modes' variances
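With a current lme4 the caterpillar plot of empirical Bayes estimates with conditional standard errors (like the one in that document) can be drawn without touching any slots; a hedged sketch using the sleepstudy data shipped with lme4:

library(lme4)
library(lattice)
fm <- lmer(Reaction ~ Days + (Days | Subject), data = sleepstudy)
dotplot(ranef(fm, condVar = TRUE))   # intervals based on the conditional variances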
2004 Aug 27
2
degrees of freedom (lme4 and nlme)
Hi, I'm having some problems with the packages
lme4 and nlme, more specifically with the denominator
degrees of freedom. I used the Orthodont data for the two
packages. The commands used are below.
require(nlme)
data(Orthodont)
fm1<-lme(distance~age+ Sex,
data=Orthodont,random=~1|Subject, method="REML")
anova(fm1)
numDF DenDF F-value p-value
(Intercept) 1
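For this fit, nlme's denominator df follow the containment rule of Pinheiro & Bates (Sec. 2.4.2): with 108 observations and 27 subjects, age (which varies within Subject) gets 108 - 27 - 1 = 80 df and Sex (constant within Subject) gets 27 - 1 - 1 = 25, while lme4 deliberately reports no denominator df at all. A sketch of the comparison (lmerTest is one hedged option for approximate df on the lme4 side):

library(nlme)
fm.nlme <- lme(distance ~ age + Sex, data = Orthodont,
               random = ~ 1 | Subject, method = "REML")
anova(fm.nlme)    # nlme reports DenDF (80 for age, 25 for Sex)

library(lme4)
fm.lme4 <- lmer(distance ~ age + Sex + (1 | Subject), data = Orthodont)
anova(fm.lme4)    # F values only; no denominator df reported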
2008 Mar 19
1
analyzing binomial data with spatially correlated errors
Dear R users,
I want to explain binomial data by a series of fixed effects. My problem is
that my binomial data are spatially correlated. Naively, I thought I could
find something similar to gls to analyze such data. After some reading, I
decided that lmer is probably the tool I need. The model I want to fit would
look like
lmer ( cbind(n.success,n.failure) ~ (x1 + x2 + ... + xn)^2 ,
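lmer has no argument for a residual spatial correlation structure, so one commonly suggested alternative (not necessarily the only one) is MASS::glmmPQL with one of nlme's spatial corStructs; a sketch on made-up data, where every column name below is illustrative rather than the poster's:

library(MASS)    # glmmPQL
library(nlme)    # corExp and the other spatial correlation structures

set.seed(1)
dat <- data.frame(Longitude = runif(60), Latitude = runif(60),
                  x1 = rnorm(60), x2 = rnorm(60),
                  grp = factor(rep(1:6, each = 10)))
dat$n.success <- rbinom(60, 10, plogis(dat$x1))
dat$n.failure <- 10 - dat$n.success

fit <- glmmPQL(cbind(n.success, n.failure) ~ x1 + x2,
               random = ~ 1 | grp, family = binomial,
               correlation = corExp(form = ~ Longitude + Latitude),
               data = dat)
summary(fit)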
2008 Oct 10
1
glmmPQL
Dear all,
I am experiencing problems with glmmPQL. I am trying to analyze
binomial data with some spatial autocorrelation. Here is my code and
some of the outputs
> colnames(d.glmm)
[1] "BV" "Longitude" "Latitude" "nb_pc_02" "nb_expr_02"
[6] "pc_02" "nb_pc_07" "nb_expr_07"
2011 Aug 27
1
Degrees of freedom in the Ljung-Box test
Dear list members,
I have 982 quotations of a given stock index and I want to run a Ljung-Box
test on these data to test for autocorrelation. Later on I will estimate 8
coefficients.
I do not know how many degrees of freedom I should assume in the formula for
the Ljung-Box test. Could anyone tell me, please?
Below is the formula:
Box.test(x, lag = ????, type = c("Ljung-Box"), fitdf = 0)
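A hedged reading of ?Box.test: fitdf is where the number of coefficients estimated from the data goes, and lag must exceed it, so the statistic is referred to a chi-square distribution on lag - fitdf degrees of freedom. With the 8 coefficients mentioned above:

set.seed(1)
x <- rnorm(982)    # stand-in for the 982 index quotations
Box.test(x, lag = 20, type = "Ljung-Box", fitdf = 8)   # chi-square on 20 - 8 = 12 df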
2010 Apr 03
0
Multilevel model with lme(): Weird degrees of freedom (group level df > # of groups)
Hello everyone,
I am trying to regress applicants' performance in an assessment center
(AC) on their gender (individual level) and the size of the AC (group
level) with a multi-level model:
model.0 <- lme(performance ~ ACsize + gender, random = ~1 | ACNumber,
method = "ML", control = list(opt = "optim"))
I have 1047 applicants in 118 ACs:
>
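A rough check, assuming lme uses its usual containment rule (a term constant within ACNumber is tested against group-level df, one varying within ACNumber against individual-level df):

n.applicants <- 1047; n.ACs <- 118
n.ACs - 1 - 1               # expected df for ACsize (group level):      116
n.applicants - n.ACs - 1    # expected df for gender (individual level): 928

If lme reports far more than 118 df for ACsize, a common cause is that ACsize is not exactly constant within ACNumber in the data, which pushes the term down to the individual level; tapply(ACsize, ACNumber, function(z) length(unique(z))) on the model data is a quick way to check.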
2007 Jun 14
0
How to set degrees of freedom in cor.test?
Hello,
I want to compute a correlation test, but I do not want to use the
degrees of freedom that are calculated by default; I want to set a
particular number of degrees of freedom.
I looked in the manual and in various other functions, but I could not find
how to do it.
Thanks in advance for your answers
Yours
Florence Dufour
PhD Student
AZTI Tecnalia - Spain
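cor.test() itself has no argument for overriding the degrees of freedom, so one workaround is to compute the t statistic and p-value by hand with whatever df you want to impose; a sketch with simulated data, where df.eff is the value chosen by the analyst:

set.seed(1)
x <- rnorm(30); y <- x + rnorm(30)
r <- cor(x, y)
df.eff <- 20                                # the df you want to use
t.stat <- r * sqrt(df.eff) / sqrt(1 - r^2)
p.val  <- 2 * pt(-abs(t.stat), df = df.eff)
c(r = r, t = t.stat, p = p.val)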
2009 Jan 07
1
Extracting degrees of freedom from a gnls object
Dear all,
How can I extract the total and residual d.f. from a gnls object?
I have tried str(summary(gnls.model)) and str(gnls.model) as well as gnls(), but couldn't find the
entry in the resulting lists.
Many thanks!
Best wishes
Christoph
--
Dr. rer.nat. Christoph Scherber
University of Goettingen
DNPW, Agroecology
Waldweg 26
D-37073 Goettingen
Germany
phone +49 (0)551 39 8807
fax +49
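One place the pieces seem to live is the dims component of the fitted object (an internal detail, so treat this as an assumption about current nlme); a sketch with the Loblolly data:

library(nlme)
fm <- gnls(height ~ SSasymp(age, Asym, R0, lrc), data = Loblolly)
fm$dims$N                 # total number of observations
fm$dims$p                 # number of fixed-effect coefficients
fm$dims$N - fm$dims$p     # residual df, as used for the t-tests in summary()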
2001 Jul 19
0
Correction of degrees of freedom in repeated measure aov
Hi there,
some statistical programs (e.g. SPSS) calculate a correction of the
degrees of freedom in a repeated-measures analysis of variance (see
Greenhouse & Geisser (1959) or Huynh & Feldt (1976)) by a factor epsilon.
This factor is used to correct the degrees of freedom to get a corrected
F-test. Is this also possible with R?
Thanks, Sven
P.S.: I read in the lm help page:
singular.ok logical,
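One way to get these corrections in R (assuming the car package) is to fit the repeated measures as a multivariate lm and ask Anova() for the univariate tests, whose summary includes the Greenhouse-Geisser and Huynh-Feldt epsilons; a sketch on simulated wide-format data:

library(car)
set.seed(1)
wide  <- data.frame(t1 = rnorm(20), t2 = rnorm(20), t3 = rnorm(20))
mlm   <- lm(cbind(t1, t2, t3) ~ 1, data = wide)
idata <- data.frame(time = factor(1:3))
summary(Anova(mlm, idata = idata, idesign = ~ time), multivariate = FALSE)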
2008 Mar 04
0
using Chi-square test with a certain number of degrees of freedom ?
Hi all,
Could someone please help me calculate the P-value using a chi-square test with a certain number of degrees of freedom?
I have a data set to be calculated here:
observed: 224, 64, 6
expected: 222.9, 66.2, 4.9
degrees of freedom: 1
I have been reading the documentation for three days and can't find the answer.
Please help. Thanks in advance.
Regards,
Frank
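With the numbers given above, the statistic and p-value take two lines; pchisq() with lower.tail = FALSE gives the upper-tail probability on the requested 1 degree of freedom:

obs <- c(224, 64, 6)
exp <- c(222.9, 66.2, 4.9)
X2  <- sum((obs - exp)^2 / exp)            # Pearson chi-square statistic
pchisq(X2, df = 1, lower.tail = FALSE)     # p-value on 1 df, as requested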
2002 Apr 24
0
degrees of freedom for t-tests in lme
Hi,
I have trouble figuring out how the df are derived in lme. Here is my
model,
lme(y~x+log(den)+sex+dep,data=lwd,random= list(group=~x))
Number of total samples (N) is 3237
number of groups (J) is 26
number of level-1 variables (Q1) is 3, i.e., x, log(den) and sex
number of level-2 variables (Q2) is 1, i.e., dep
x and den are continuous variables
sex is associated with individual samples
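A hedged back-of-the-envelope, assuming lme's usual containment rule and that x, log(den), and sex vary within group while dep is constant within group:

N <- 3237; J <- 26; Q1 <- 3; Q2 <- 1
N - J - Q1    # df for the within-group terms x, log(den), sex: 3208
J - Q2 - 1    # df for the group-level term dep:                24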
2006 Jan 26
0
degrees freedom in nlme
I'm having a hard time understanding the computation of degrees of freedom when running nlme() on the following model:
> formula(data.gd)
dLt ~ Lt | ID
TasavB <- function(Lt, Linf, K) (K * (Linf - Lt))
model.nlme <- nlme(dLt ~ TasavB(Lt, Linf, K),
data = data.gd,
fixed = list(Linf ~ 1, K ~ 1),
start = list(fixed = c(70, 0.4)),
na.action = na.include,
2006 Feb 26
1
changing degrees of freedom in summary.lm()
Hello all,
I'm trying to do a nested linear model with a dataset that incorporates
an observation for each of several classes within each of several plots.
I have 219 plots, and 17 classes within each plot.
data.frame has columns "plot","class","age","dep.var"
With lm(dep.var~class*age),
The summary(lm) function returns t-test and F-test values
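If the underlying worry is that summary(lm()) tests the plot-level covariate age against the residual df instead of the plot-to-plot variation, one common remedy (possibly not what was wanted here) is an aov() with an Error() stratum; a sketch on simulated data with the stated structure:

set.seed(1)
d <- expand.grid(plot = factor(1:219), class = factor(1:17))
d$age     <- rnorm(219)[d$plot]     # a plot-level covariate
d$dep.var <- rnorm(nrow(d))
fit <- aov(dep.var ~ age * class + Error(plot), data = d)
summary(fit)    # age tested in the plot stratum, class and age:class within plots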
2006 Mar 08
1
Degrees of freedom using Box.test()
After an RSiteSearch("Box.test") I found some discussion regarding the degrees
of freedom in the computation of the Ljung-Box test using Box.test(), but did
not find any posting about the proper degrees of freedom.
Box.test() uses "lag=number" as the degrees of freedom. However, I believe
the correct degrees of freedom should be "number-p-q" where p and q are
2006 Nov 01
1
gamm(): degrees of freedom of the fit
I wonder whether any of you know of an efficient way to calculate the approximate degrees of freedom of a gamm() fit.
Calculating the smoother/projection matrix S: y -> \hat y and then its trace by sum(eigen(S)$values) is what I've been doing so far, but I was hoping there might be a more efficient way than doing the spectral decomposition of an NxN matrix.
The degrees of freedom
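Two hedged notes: the trace itself can be taken as sum(diag(S)) without any eigendecomposition, and for a gamm() fit the effective df of the smooth terms are already stored in the $gam component, so the NxN smoother matrix need not be formed at all:

library(mgcv)
set.seed(1)
dat <- gamSim(1, n = 200)    # simulated example data shipped with mgcv
fm  <- gamm(y ~ s(x0) + s(x1) + s(x2) + s(x3), data = dat)
sum(fm$gam$edf)              # approximate effective degrees of freedom of the fit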