Hi:
On Wed, Mar 31, 2010 at 2:31 PM, array chip <arrayprofile@yahoo.com>
wrote:
> Hi, I have a very simple balanced randomized block design with a total of
> 48 observations of a weight measurement on a product. The product was
> manufactured at 4 sites, so each site has 12 observations. I want to use
> lme() from the nlme package to estimate the standard error of the product
> weight.
>
It's a balanced one-way design where site is assumed to be a random factor.
If you want to call it a block, fine, but if this were a genuine RCBD, there
would be treatments randomly assigned to 'units' within each site, and that's
not present here.
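For concreteness, here is a minimal sketch of that kind of balanced one-way
layout with site as a random factor; the numbers (grand mean, between-site and
residual SDs) are made up for illustration, not your estimates:

library(nlme)

set.seed(1)
site <- factor(rep(1:4, each = 12))                  # 4 sites, 12 obs each
MW   <- 55000 + rnorm(4, sd = 2000)[as.integer(site)] +  # between-site effect
        rnorm(48, sd = 1100)                             # residual error
dat  <- data.frame(MW = MW, site = site)

obj <- lme(MW ~ 1, random = ~ 1 | site, data = dat)
summary(obj)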
>
> So the data look like:
>
> MW site
> 1 54031 1
> 2 55286 1
> 3 54396 2
> 4 52327 2
> 5 55963 3
> 6 54893 3
> 7 57338 4
> 8 55597 4
> :
> :
> :
>
> The random-effects model is: Y = mu + b + e, where b is the random block
> effect and e is the model error.
>
> so I fitted a lme model as:
>
> obj<-lme(MW~1, random=~1|site, data=dat)
>
> summary(obj)
> Linear mixed-effects model fit by REML
> Random effects:
> Formula: ~1 | site
> (Intercept) Residual
> StdDev: 2064.006 1117.567
>
> Fixed effects: MW ~ 1
> Value Std.Error DF t-value p-value
> (Intercept) 55901.31 1044.534 44 53.51796 0
> :
> :
> Number of Observations: 48
> Number of Groups: 4
>
> I also did:
> anova(obj)
> numDF denDF F-value p-value
> (Intercept) 1 44 2864.173 <.0001
>
> I believe my standard error estimate is the "Residual" entry under the
> "Random effects" part of summary(), which is 1117.567.
>
Yes.
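If you'd rather extract that number from the fitted object than read it off
the printout, VarCorr() and the sigma component of the lme object should give
it to you, e.g. (using your obj):

VarCorr(obj)   # variance components: site (Intercept) and Residual
obj$sigma      # residual standard deviation, 1117.567 in your output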
>
> Now my question is regarding the t test under "Fixed effects". I think it's
> testing whether the overall mean weight is 0 or not, which is not interesting
> anyway. But how is the standard error of 1044.534 calculated? I thought it
> should be sqrt(MSE) = 1117.567 instead. Can anyone explain?
>
When the data are balanced, the population variance of \bar{y}.., the sample
grand mean, is E(MSA)/N, where MSA is the between-site mean square and N is the
total sample size (Searle, Casella and McCulloch, _Variance Components_, p. 54,
formula (37), derived for the balanced-data case; the corresponding ANOVA
table, with expected mean squares, is on p. 60). The plug-in estimate of E(MSA)
is

MSA = n * s^2(Intercept) + s^2(error) = 12 * (2064.006)^2 + 1117.567^2,

where n = 12 is the number of observations per site. The standard error for
\bar{y}.. is then sqrt(MSA/N). Doing these calculations in R,

xx <- 12 * (2064.006)^2 + (1117.567)^2   # plug-in estimate of E(MSA)
sqrt(xx / 48)                            # standard error of the grand mean
[1] 1044.533

which, within rounding error, is what lme() gives you in the test for the
fixed effects.
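You can also confirm the number directly from the fitted object, e.g. (again
assuming your obj):

sqrt(diag(vcov(obj)))   # standard error of the (Intercept) fixed effect
summary(obj)$tTable     # Value, Std.Error, DF, t-value, p-value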
> The same goes for the F test using anova(obj). The F statistic is equal to
> the square of the t statistic because the numerator has 1 df. But what are
> the numerator and denominator mean squares, and where do I find them? BTW, I
> think the denominator mean square (MSE) should be 1117.567^2, but this is
> not consistent with the standard error in the t test (1044.534).
>
lme() fits by ML or REML, so it doesn't produce a conventional ANOVA table as
part of its output. If you want to see the sums of squares and mean squares,
use aov(). In the balanced one-way model, the observed df, SS and MS are the
same in the fixed-effects and random-effects models, but the expected mean
square for treatments differs between the two.
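For example, something like this should show the classical table for your data
(note that site needs to be a factor for aov()):

dat$site <- factor(dat$site)            # site as a factor, not numeric
fit.aov  <- aov(MW ~ site, data = dat)
anova(fit.aov)                          # Df, Sum Sq, Mean Sq, F value, Pr(>F)

With balanced data, the Residuals mean square there should reproduce
1117.567^2 and the site mean square should reproduce the MSA plug-in above.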
HTH,
Dennis
>
> Thanks a lot for any help
>
> John
>