Displaying 20 results from an estimated 20000 matches similar to: "nlme questions (e.g., specifying group membership, changing options)"
2005 Aug 29
1
lme and ordering of terms
Dear R users,
When fitting a lme() object (from the nlme library), is it possible to
test interactions *before* main effects? As I understand, R
conventionally re-orders all terms such that highest-order interactions
come last - but I'd like to know if it's possible (and sensible) to
change this ordering of terms.
I've tried the terms() command (from aov) but I don't know if something
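A quick sketch (made-up data, not from this thread): terms(..., keep.order = TRUE) preserves the formula order for lm()/aov(), but for the interaction itself a nested-model comparison sidesteps the ordering question entirely; for lme() the two fits need method = "ML" before anova() compares them.

library(nlme)
set.seed(1)
d <- data.frame(subj = gl(10, 4), a = gl(2, 2, 40), b = gl(2, 1, 40),
                y = rnorm(40))
fit_full <- lme(y ~ a * b, random = ~ 1 | subj, data = d, method = "ML")
fit_main <- lme(y ~ a + b, random = ~ 1 | subj, data = d, method = "ML")
anova(fit_main, fit_full)   # likelihood-ratio test of the a:b interaction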
2001 Jun 15
1
contrasts in lm and lme
I am using RW 1.2.3. on an IBM PC 300GL.
I am using the data file bp.dat which accompanies
Helen Brown and Robin Prescott (1999), Applied Mixed Models in Medicine,
Statistics in Practice, John Wiley & Sons, Inc., New York, NY, USA,
and which is also available at www.med.ed.ac.uk/phs/mixed. The data file was
read in and initialized with
> dat <- read.table("bp.dat")
>
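A sketch of the two usual ways to control the coding, with made-up variable names since bp.dat's columns are not shown above:

library(nlme)
set.seed(2)
dat <- data.frame(subj  = gl(20, 2),
                  treat = gl(2, 1, 40, labels = c("A", "B")),
                  dbp   = rnorm(40, mean = 90))
## session-wide coding:
## options(contrasts = c("contr.helmert", "contr.poly"))
## or per fit, via the 'contrasts' argument that both lm() and lme() accept:
f_lm  <- lm(dbp ~ treat, data = dat, contrasts = list(treat = "contr.helmert"))
f_lme <- lme(dbp ~ treat, random = ~ 1 | subj, data = dat,
             contrasts = list(treat = "contr.helmert"))
coef(summary(f_lm))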
2005 Apr 13
2
multinom and contrasts
Hi,
I found that using different contrasts (e.g.
contr.helmert vs. contr.treatment) will generate
different fitted probabilities from multinomial
logistic regression using multinom(); while the fitted
probabilities from binary logistic regression seem to
be the same. Why is that? And for multinomial logistic
regression, which contrasts should be used? I guess it's
Helmert?
Here is an example
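A sketch of the check involved (made-up data): the two codings are full-rank reparametrisations of the same model, so the fitted probabilities should agree once the optimiser has fully converged; a visible difference usually means multinom() stopped at its default iteration limit, and the choice of contrasts then mainly affects how the coefficients read, not the fitted values.

library(nnet)
set.seed(4)
d <- data.frame(y = factor(sample(c("a", "b", "c"), 300, replace = TRUE)),
                g = gl(4, 75))
f_treat <- multinom(y ~ g, data = d, contrasts = list(g = "contr.treatment"),
                    maxit = 1000, trace = FALSE)
f_helm  <- multinom(y ~ g, data = d, contrasts = list(g = "contr.helmert"),
                    maxit = 1000, trace = FALSE)
max(abs(fitted(f_treat) - fitted(f_helm)))   # should be essentially zero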
2005 Jun 23
4
contrasts hardcoded in aov()?
On 6/23/05, RenE J.V. Bertin <rjvbertin at gmail.com> wrote:
> Hello,
>
> I was just having a look at the aov function source code, and I see that when the model used does have an Error term, Helmert contrasts are imposed:
>
> if (is.null(indError)) {
> ...
> }
> else {
> opcons <- options("contrasts")
>
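A quick way to see where (and in which branch) this happens, as a sketch:

aov_src <- deparse(stats::aov)
grep("contrasts", aov_src, value = TRUE)   # shows the options(contrasts = ...) line(s)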
2006 Aug 22
1
summary(lm ... contrasts=...)
Hi Folks,
I've encountered something I hadn't been consciously
aware of previously, and I'm wondering what the
explanation might be.
While using R (on another list) to demonstrate the difference
between different contrasts in 'lm', I set up an example
where Y is sampled from three different normal distributions
according to the levels ("A","B","C")
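A small sketch of that kind of demonstration (made-up means and data): the individual coefficients and their t statistics depend on the coding, while the fitted values and the overall F test do not.

set.seed(42)
g <- gl(3, 30, labels = c("A", "B", "C"))
y <- rnorm(90, mean = c(10, 12, 15)[as.integer(g)])
f_treat <- lm(y ~ g, contrasts = list(g = "contr.treatment"))
f_helm  <- lm(y ~ g, contrasts = list(g = "contr.helmert"))
coef(summary(f_treat))   # each coefficient compares a level with level "A"
coef(summary(f_helm))    # each compares a level with the mean of earlier levels
all.equal(fitted(f_treat), fitted(f_helm))   # TRUE: same model, different basis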
2004 Mar 03
1
Confusion about coxph and Helmert contrasts
Hi,
Perhaps this is a stupid question, but I need some help with
Helmert contrasts in the Cox model.
I have a survival data frame with an unordered factor 'group'
with levels 0 ... 5.
Calculating the Cox model with Helmert contrasts, I expected that
the first coefficient would be the same as if I had used treatment
contrasts, but this is not true.
Is this an error in reasoning, or is it
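It is less an error in reasoning than a scaling difference; a sketch with simulated data (not the original data frame): the first Helmert column codes levels 0 and 1 as -1/+1, so its coefficient is half the level-1-versus-level-0 log hazard ratio that treatment coding reports.

library(survival)
set.seed(6)
d <- data.frame(time   = rexp(600),
                status = rbinom(600, 1, 0.8),
                group  = factor(sample(0:5, 600, replace = TRUE)))
fit_treat <- coxph(Surv(time, status) ~ group, data = d)
fit_helm  <- coxph(Surv(time, status) ~ C(group, contr.helmert), data = d)
coef(fit_treat)[1]       # log hazard ratio, level 1 vs level 0
2 * coef(fit_helm)[1]    # same comparison, rescaled from the Helmert coefficient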
2005 Apr 23
2
ANOVA with both discrete and continuous variables
Hi all,
I have a dataset with 2 independent variables: one (x1)
is continuous, the other (x2) is a categorical
variable with 2 levels. The dependent variable (y) is
continuous. When I run the linear regression y~x1*x2, I
find that the p value for the continuous independent
variable x1 changes when different contrasts are used
(Helmert vs. treatment), while the p values for the
categorical x2 and
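A sketch of what is happening (simulated data): with the interaction in the model, the row labelled x1 is the slope of x1 at whatever "zero" of x2 the coding defines, so its test answers a different question under treatment and Helmert coding.

set.seed(7)
x1 <- rnorm(80)
x2 <- gl(2, 40)
y  <- 1 + 0.5 * x1 + (x2 == "2") * (1 + 0.8 * x1) + rnorm(80)
f_treat <- lm(y ~ x1 * x2, contrasts = list(x2 = "contr.treatment"))
f_helm  <- lm(y ~ x1 * x2, contrasts = list(x2 = "contr.helmert"))
coef(summary(f_treat))["x1", ]   # slope of x1 at the reference level of x2
coef(summary(f_helm))["x1", ]    # slope of x1 averaged over the two levels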
2007 Jun 28
2
aov and lme differ with interaction in oats example of MASS?
Dear R-Community!
The example "oats" in MASS (2nd edition, 10.3, p.309) is calculated for aov and lme without interaction term and the results are the same.
But I have problems to reproduce the example aov with interaction in MASS (10.2, p.301) with lme. Here the script:
library(MASS)
library(nlme)
options(contrasts = c("contr.treatment", "contr.poly"))
# aov: Y ~
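For reference, a sketch of the two fits as I read the MASS example (the oats data ship with MASS; checking the fits against the book's numbers is left to the reader):

library(MASS)    # provides the oats data (variables B, V, N, Y)
library(nlme)
options(contrasts = c("contr.treatment", "contr.poly"))
## split-plot ANOVA with the N:V interaction
oats.aov <- aov(Y ~ N * V + Error(B/V), data = oats)
summary(oats.aov)
## corresponding mixed-effects fit
oats.lme <- lme(Y ~ N * V, random = ~ 1 | B/V, data = oats)
anova(oats.lme)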
1999 Oct 22
1
factors in glm
Is there any logical reason why glm prints out the labels of factor
levels after variable names when baseline contrasts (contr.treatment)
are used but the codes for the levels when mean contrasts (contr.sum)
are used? Jim
2003 Feb 14
5
Translating lm.object to SQL, C, etc function
This is my first post to this list so I suppose a quick intro is in
order. I've been using SPLUS 2000 and R 1.6.2 for just a couple of days,
and love S already. I'm reading MASS and also John Fox's book - both have
been very useful. My background in stat software was mainly SPSS (which
I've never much liked - thank heavens I've found S!), and Perl is my
tool of choice for
2005 Aug 15
1
error in predict glm (new levels cause problems)
Dear R-helpers,
I am trying to fit GLMs to negative-binomial distributed data.
So I use the MASS library and the commands:
model_1 = glm.nb(response ~ y1 + y2 + ...+ yi, data = data.frame)
and
predict(model_1, newdata = data.frame)
So far, I think everything should be ok.
But when I want to perform a glm with a subset of the data,
I run into an error message as soon as I want to predict
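The usual culprit, shown as a sketch with made-up data (not the poster's): the subset drops every row for some factor level, the model never sees that level, and predict() then refuses newdata containing it.

library(MASS)
set.seed(3)
dat <- data.frame(response = rnbinom(200, mu = 5, size = 1),
                  y1 = rnorm(200),
                  y2 = factor(sample(letters[1:4], 200, replace = TRUE)))
sub <- subset(dat, y2 != "d")
sub$y2 <- droplevels(sub$y2)                     # forget the unused level "d"
model_1 <- glm.nb(response ~ y1 + y2, data = sub)
new_ok  <- subset(dat, y2 %in% levels(sub$y2))   # only levels the model has seen
head(predict(model_1, newdata = new_ok, type = "response"))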
2006 Apr 19
1
Can't run code from "Mixed Effects Models in S and S-plus"
Dear R-users:
I can't run the following code from "Mixed Effects Models in S and S-plus".
library( nlme )
options( width = 65, digits = 5 )
options( contrasts = c(unordered = "contr.helmert", ordered = "contr.poly") )
# Chapter 5 Extending the Basic Linear Mixed-Effects Models
# 5.1 General Formulation of the Extended Model
data( Orthodont )
vf1Fixed
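As a sanity check (a sketch, not the book's own code): the preamble runs once nlme is attached, and the Orthodont data ship with nlme, so a simple lme() fit on it should work before the chapter-5 code is attempted.

library(nlme)
options(contrasts = c(unordered = "contr.helmert", ordered = "contr.poly"))
data(Orthodont)
fm1 <- lme(distance ~ age, random = ~ 1 | Subject, data = Orthodont)
summary(fm1)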
2009 Nov 16
2
fitting a logistic regression with mixed type of variables
Hi,
I am trying to fit a logistic regression using glm, but my explanatory
variables are of mixed type: some are numeric, some are ordinal, some are
categorical. Say
x1 is numeric, x2 is ordinal, and x3 is categorical: is the following formula
OK?
model <- glm(y ~ x1 + x2 + x3, family = binomial(link = "logit"), na.action = na.pass)
Thanks,
-Jack
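A sketch of how glm() handles the mix (made-up data): numeric predictors enter as they are, ordered factors get polynomial contrasts, unordered factors get dummy coding, so the formula itself is fine; if there are missing values, na.action = na.pass will usually just make the fit fail, so the default na.omit is the safer choice.

set.seed(5)
d <- data.frame(
  y  = rbinom(150, 1, 0.5),
  x1 = rnorm(150),                                                  # numeric
  x2 = factor(sample(c("low", "mid", "high"), 150, replace = TRUE),
              levels = c("low", "mid", "high"), ordered = TRUE),    # ordinal
  x3 = factor(sample(c("a", "b", "c"), 150, replace = TRUE))        # nominal
)
model <- glm(y ~ x1 + x2 + x3, family = binomial(link = "logit"), data = d)
summary(model)   # x2 appears as x2.L / x2.Q, x3 as x3b / x3c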
2008 Sep 26
1
Type I and Type III SS in anova
Hi all,
I have been trying to calculate Type III SS in R for an unbalanced two-way
ANOVA. However, the Type III SS are lower for the first factor compared to
Type I but higher for the second factor (see below). I have the impression
that Type III SS are always lower than Type I - is that right?
I also need a clarification about how to fit Type III SS. Fitting model <- aov(y~a*b)
in the base package and
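There is no such ordering in general; Type I and Type III only have to agree for the last term in the sequence. One common recipe, sketched with made-up unbalanced data (note that "Type III" in the SAS sense also presumes sum-to-zero contrasts):

set.seed(9)
d <- data.frame(a = gl(2, 30), b = gl(3, 10, 60), y = rnorm(60))
d <- d[-(1:7), ]                                  # make the design unbalanced
op  <- options(contrasts = c("contr.sum", "contr.poly"))
fit <- aov(y ~ a * b, data = d)
summary(fit)                                      # sequential (Type I) SS
drop1(fit, scope = ~ ., test = "F")               # Type III-style marginal tests
options(op)                                       # restore the previous contrasts
## car::Anova(fit, type = 3) is the other common route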
2008 Nov 14
1
aov help
Please pardon an extremely naive question. I see related earlier
posts, but no responses which answer my particular question. In
general, I'm very confused about how to do variance decomposition with
random and mixed effects. Pointers to good tutorials or texts would
be greatly appreciated.
To give a specific example, page 193 of V&R, 3d Edition, illustrates
using raov assuming pure
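Not the V&R example itself, but a sketch of the two standard routes for a one-way random-effects layout (made-up data): classical EMS arithmetic from an aov table, and REML variance components from lme().

library(nlme)
set.seed(11)
d <- data.frame(batch = gl(6, 5))
d$y <- rnorm(30, mean = rep(rnorm(6, sd = 2), each = 5))
summary(aov(y ~ batch, data = d))        # MS(batch), MS(residual): EMS route
fit <- lme(y ~ 1, random = ~ 1 | batch, data = d)
VarCorr(fit)                             # REML estimates of the two variances
## with 5 observations per batch, sigma_batch^2 = (MS(batch) - MS(residual)) / 5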
2000 Aug 01
1
Testing for parallel slopes
I'm running a series of simple bivariate linear regressions on grouped
data. I want to test the slopes to see if they are parallel. I normally
use analysis of covariance to do so, looking at the interaction between the
covariate and the factor to make this determination.
VR3 pp. 149-154 has a very nice example of an ANCOVA, ending with a
discussion of this very operation.
My question has
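A sketch of that operation with hypothetical data: fit a common slope and separate slopes, and let the F test on the covariate-by-factor interaction decide.

set.seed(13)
d <- data.frame(g = gl(3, 40), x = rnorm(120))
d$y <- 2 + 0.8 * d$x + as.numeric(d$g) + rnorm(120)
fit_parallel <- lm(y ~ g + x, data = d)   # common slope
fit_separate <- lm(y ~ g * x, data = d)   # one slope per group
anova(fit_parallel, fit_separate)         # F test of the g:x interaction (2 df)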
2009 Nov 08
2
reference on contr.helmert and typo on its help page.
I'm wondering which textbook discusses the various contrast matrices
mentioned on the help page of 'contr.helmert'. Could somebody let me
know?
BTW, in R version 2.9.1, there is a typo on the help page of
'contr.helmert' ('cont.helmert' should be 'contr.helmert').
2012 Oct 27
1
contr.sum() and contrast names
Hi!
I would like to suggest making it possible, in one way or another, to
get meaningful contrast names when using contr.sum(). Currently, when
using contr.treatment(), one gets factor levels as contrast names; but
when using contr.sum(), contrasts are merely numbered, which is not
practical and can lead to mistakes (see code at the end of this
message).
This issue was discussed quickly in 2005
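Until something like that is adopted, one workaround (a sketch): give the contrast matrix column names yourself before fitting, and those names propagate into the coefficient labels.

f <- gl(3, 10, labels = c("low", "mid", "high"))
y <- rnorm(30)
coef(lm(y ~ f, contrasts = list(f = "contr.sum")))   # f1, f2: which level is which?
cm <- contr.sum(levels(f))
colnames(cm) <- levels(f)[-nlevels(f)]               # name columns after the levels
contrasts(f) <- cm
coef(lm(y ~ f))                                      # now flow, fmid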
2005 Feb 23
1
model.matrix for a factor effect with no intercept
I was surprised by this (in R 2.0.1):
> a <- ordered(-1:1)
> a
[1] -1 0 1
Levels: -1 < 0 < 1
> model.matrix(~ a)
  (Intercept)           a.L        a.Q
1           1 -7.071068e-01  0.4082483
2           1 -9.073800e-17 -0.8164966
3           1  7.071068e-01  0.4082483
attr(,"assign")
[1] 0 1 1
attr(,"contrasts")
attr(,"contrasts")$a
[1]
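Part of the surprise is simply that ordered() switches the coding: an ordered factor gets orthogonal polynomial contrasts by default, hence the a.L / a.Q columns above. A small sketch to compare, including the no-intercept case from the subject line:

a <- ordered(-1:1)
model.matrix(~ a)                           # polynomial contrasts a.L, a.Q
model.matrix(~ factor(-1:1))                # unordered: treatment dummies instead
model.matrix(~ a - 1)                       # the no-intercept coding, for comparison
model.matrix(~ a, contrasts.arg = list(a = "contr.treatment"))  # override per call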
2005 Nov 24
2
type III sums of squares in R
Hi everyone,
Can someone explain to me how to calculate SAS Type III sums of squares in
R? Not that I would like to use them - I know they are problematic. I
would like to know how to calculate them in order to demonstrate that
strange things happen when you use them (for a course, for example). I
know you can use drop1(lm(), test="F"), but for an lm(y~A+B+A:B), Type
III SSQs are only
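A sketch of the usual recipe (made-up unbalanced data): switch to sum-to-zero contrasts before fitting, then ask drop1() to test every term through its scope argument rather than only the terms that can be dropped last.

set.seed(10)
dd <- data.frame(A = gl(2, 24), B = gl(3, 8, 48), y = rnorm(48))
dd <- dd[-(1:5), ]                              # unbalanced on purpose
op  <- options(contrasts = c("contr.sum", "contr.poly"))
fit <- lm(y ~ A * B, data = dd)
drop1(fit, test = "F")                          # only the A:B term is eligible
drop1(fit, scope = ~ ., test = "F")             # all terms: Type III-style tests
options(op)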