Displaying 20 results from an estimated 1100 matches similar to: "lme and ordering of terms"
2005 Mar 10
1
contrast matrix for aov
How do we specify a contrast interaction matrix for an ANOVA model?
We have a two-factor, repeated measures design, with
Cue Direction (2) x Brain Hemisphere (2)
Each of these has 2 levels, 'left' and 'right', so it's a simple 2x2 design
matrix. We have 8 subjects in each cell (a balanced design) and we want to
specify the interaction contrast so that:
CueLeft > CueRight
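A minimal sketch (not from the thread; the names rt, cue, hemi and subj are placeholders) of one way to set this up: attach a 1/-1 contrast to each two-level factor, and the cue:hemi term of the repeated-measures aov() fit is then the 1-df interaction contrast.

# simulate the 2 x 2 repeated-measures layout: 8 subjects, each measured
# in every cue-direction x hemisphere cell
d <- expand.grid(subj = factor(1:8),
                 cue  = factor(c("left", "right")),
                 hemi = factor(c("left", "right")))
d$rt <- rnorm(nrow(d))                         # placeholder response

contrasts(d$cue)  <- matrix(c(1, -1), ncol = 1)   # CueLeft minus CueRight
contrasts(d$hemi) <- matrix(c(1, -1), ncol = 1)   # HemiLeft minus HemiRight

fit <- aov(rt ~ cue * hemi + Error(subj/(cue * hemi)), data = d)
summary(fit)                                   # the cue:hemi stratum carries the
                                               # interaction contrast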
2009 Jul 07
3
Error due to non-conformable arrays
Hello,
Consider this function for generalized ridge regression:
gre <- function (X,y,D){
n <- dim(X)[1]
p <- dim(X)[2]
intercept <- rep(1, n)
X <- cbind(intercept, X)
X2D <- crossprod(X,X)+ D
Xy <- crossprod(X,y)
bth <- qr.solve(X2D, Xy)
}
# suppose X is an (nxp) design matrix and y is an (nx1) response vector
p <- dim(X)[2]
D <- diag(rep(1.5, p))
bt
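A possible fix, sketched here rather than taken from the thread: after the intercept column is prepended, crossprod(X, X) is (p+1) x (p+1) while D stays p x p, which is what makes the addition non-conformable. Padding D with a zero row and column (so the intercept is left unpenalised) makes the dimensions agree, and returning the result makes the function usable:

gre <- function(X, y, D) {
  n <- dim(X)[1]
  p <- dim(X)[2]
  X <- cbind(intercept = rep(1, n), X)
  D <- rbind(0, cbind(0, D))   # (p+1) x (p+1): zero row/column for the intercept
  X2D <- crossprod(X, X) + D
  Xy  <- crossprod(X, y)
  qr.solve(X2D, Xy)
}

# usage, assuming X is an n x p design matrix and y an n x 1 response:
# D  <- diag(rep(1.5, ncol(X)))
# bt <- gre(X, y, D)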
2009 Nov 08
2
reference on contr.helmert and typo on its help page.
I'm wondering which textbook discussed the various contrast matrices
mentioned in the help page of 'contr.helmert'. Could somebody let me
know?
BTW, in R version 2.9.1, there is a typo on the help page of
'contr.helmert' ('cont.helmert' should be 'contr.helmert').
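For reference, a small sketch (independent of the thread) of what contr.helmert() returns for a four-level factor; each column contrasts one level with the mean of the levels before it:

contr.helmert(4)
#   [,1] [,2] [,3]
# 1   -1   -1   -1
# 2    1   -1   -1
# 3    0    2   -1
# 4    0    0    3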
2005 Apr 13
2
multinom and contrasts
Hi,
I found that using different contrasts (e.g.
contr.helmert vs. contr.treatment) will generate
different fitted probabilities from multinomial
logistic regression using multinom(); while the fitted
probabilities from binary logistic regression seem to
be the same. Why is that? And for multinomial logistic
regression, which contrasts should be used? I guess it's
Helmert?
Here is an example:
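The example itself is cut off in this excerpt; a minimal sketch of the kind of comparison described (simulated data, placeholder names). Both codings are full-rank parameterisations of the same model, so any difference in the fitted class probabilities should come down to the accuracy of multinom()'s iterative optimisation rather than to the choice of contrasts:

library(nnet)                        # for multinom()
set.seed(1)
d <- data.frame(y = factor(sample(c("a", "b", "c"), 100, replace = TRUE)),
                g = factor(sample(c("p", "q", "r"), 100, replace = TRUE)))

fit_trt <- multinom(y ~ g, data = d,
                    contrasts = list(g = "contr.treatment"), trace = FALSE)
fit_hel <- multinom(y ~ g, data = d,
                    contrasts = list(g = "contr.helmert"), trace = FALSE)

head(fitted(fit_trt))                # fitted class probabilities
head(fitted(fit_hel))                # should agree up to convergence tolerance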
2005 Jun 23
4
contrasts hardcoded in aov()?
On 6/23/05, RenE J.V. Bertin <rjvbertin at gmail.com> wrote:
> Hello,
>
> I was just having a look at the aov function source code, and see that when the model used has an Error term, Helmert contrasts are imposed:
>
> if (is.null(indError)) {
> ...
> }
> else {
> opcons <- options("contrasts")
>
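For reference, a small sketch (not from the thread) of the option the quoted code saves and restores, and of how to change it yourself around a fit:

getOption("contrasts")               # usually c("contr.treatment", "contr.poly")
old <- options(contrasts = c("contr.helmert", "contr.poly"))
# ... fit models that should use Helmert coding ...
options(old)                         # restore the previous setting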
2006 Aug 22
1
summary(lm ... contrasts=...)
Hi Folks,
I've encountered something I hadn't been consciously
aware of previously, and I'm wondering what the
explanation might be.
While using R (on another list) to demonstrate the difference
between different contrasts in 'lm', I set up an example
where Y is sampled from three different normal distributions
according to the levels ("A","B","C")
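A minimal sketch of the set-up described (the group means and sample sizes are invented): the coefficient tables differ because the two codings estimate different comparisons, while the fitted values and the overall F test are identical.

set.seed(42)
grp <- factor(rep(c("A", "B", "C"), each = 20))
Y   <- rnorm(60, mean = c(10, 12, 15)[grp])   # three different normal means

fit_trt <- lm(Y ~ grp, contrasts = list(grp = "contr.treatment"))
fit_hel <- lm(Y ~ grp, contrasts = list(grp = "contr.helmert"))

summary(fit_trt)    # coefficients: differences from level "A"
summary(fit_hel)    # coefficients: each level vs the mean of earlier levels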
2004 Mar 03
1
Confusion about coxph and Helmert contrasts
Hi,
perhaps this is a stupid question, but I need some help with
Helmert contrasts in the Cox model.
I have a survival data frame with an unordered factor `group'
with levels 0 ... 5.
Calculating the Cox model with Helmert contrasts, I expected that
the first coefficient would be the same as if I had used treatment
contrasts, but this is not true.
Is this an error in reasoning, or is it
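A sketch of the comparison described, using the veteran data from the survival package as a stand-in for the original six-level 'group' factor (which is not shown). With the factor as the only term, the first Helmert coefficient estimates half of the level 2 minus level 1 log-hazard difference, so it is not expected to equal the first treatment-contrast coefficient:

library(survival)
vet <- veteran

contrasts(vet$celltype) <- contr.treatment(nlevels(vet$celltype))
fit_trt <- coxph(Surv(time, status) ~ celltype, data = vet)

contrasts(vet$celltype) <- contr.helmert(nlevels(vet$celltype))
fit_hel <- coxph(Surv(time, status) ~ celltype, data = vet)

coef(fit_trt)    # first coefficient: level 2 vs level 1
coef(fit_hel)    # first coefficient: (level 2 - level 1) / 2 in this simple case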
2005 Apr 23
2
ANOVA with both discrete and continuous variables
Hi all,
I have a dataset with 2 independent variables, one (x1)
is continuous, the other (x2) is a categorical
variable with 2 levels. The dependent variable (y) is
continuous. When I run linear regression y~x1*x2, I
found that the p value for the continuous independent
variable x1 changes when different contrasts were used
(Helmert vs. treatment), while the p values for the
categorical x2 and
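A sketch of the comparison described (simulated data, placeholder coefficients). With the interaction in the model, the 'x1' row is not a single main effect: under treatment coding it is the x1 slope in the reference level of x2, under Helmert coding it is the slope averaged over the two levels, so its p value changes while the x2 and x1:x2 tests (and the fitted values) do not:

set.seed(7)
x1 <- rnorm(40)
x2 <- factor(rep(c("a", "b"), each = 20))
y  <- 1 + 0.5 * x1 + (x2 == "b") * (1 + 0.8 * x1) + rnorm(40)

summary(lm(y ~ x1 * x2, contrasts = list(x2 = "contr.treatment")))
summary(lm(y ~ x1 * x2, contrasts = list(x2 = "contr.helmert")))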
2001 Jun 15
1
contrasts in lm and lme
I am using RW 1.2.3 on an IBM PC 300GL, with the data set bp.dat which
accompanies Helen Brown and Robin Prescott (1999), Applied Mixed Models
in Medicine, Statistics in Practice, John Wiley & Sons, New York, and
which is also available at www.med.ed.ac.uk/phs/mixed. The data file was opened
and initialized with
> dat <- read.table("bp.dat")
>
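A sketch, with guessed column names (dbp, treatment and patient are not from the thread), of passing the same contrast specification to lm() and to nlme::lme(), both of which accept a 'contrasts' argument:

library(nlme)
dat <- read.table("bp.dat")
# fit_lm  <- lm(dbp ~ treatment, data = dat,
#               contrasts = list(treatment = "contr.helmert"))
# fit_lme <- lme(dbp ~ treatment, random = ~ 1 | patient, data = dat,
#                contrasts = list(treatment = "contr.helmert"))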
1997 May 06
1
R-beta: formula() and model formulae
Several bugs (no solutions, yet). These might be well known.
1) If one does, e.g., mymod <- lm(y ~ x); formula(mymod)
then one does not get back the formula (one gets "Error: invalid formula")
2) if x is of mode numeric, then the model formula
mymod <- lm(y ~ x + x^2)
is not processed as S would do it. The model is fit ignoring the x^2 term,
however mymod$call includes the x^2 term.
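On the second point, the ^ operator is interpreted symbolically in the formula language, so the quadratic term has to be protected; a small sketch of the usual ways to write it:

x <- rnorm(50)
y <- 1 + x + x^2 + rnorm(50)
lm(y ~ x + I(x^2))     # literal square, protected by I()
lm(y ~ poly(x, 2))     # or an orthogonal polynomial of degree 2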
1999 Oct 22
1
factors in glm
Is there any logical reason why glm prints out the labels of factor
levels after variable names when baseline contrasts (contr.treatment)
are used but the codes for the levels when mean contrasts (contr.sum)
are used? Jim
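A sketch of the behaviour described (simulated data): contr.treatment returns a matrix whose columns are named after the factor levels, so the labels show up in the coefficient names, while contr.sum returns unnamed columns and the coefficients are labelled with numeric codes instead.

set.seed(2)
f <- rep(factor(c("low", "mid", "high"), levels = c("low", "mid", "high")), each = 10)
y <- rbinom(30, 1, 0.5)

coef(glm(y ~ f, family = binomial, contrasts = list(f = "contr.treatment")))
# (Intercept)  fmid  fhigh
coef(glm(y ~ f, family = binomial, contrasts = list(f = "contr.sum")))
# (Intercept)  f1    f2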
2005 Aug 15
1
error in predict glm (new levels cause problems)
Dear R-helpers,
I am trying to fit GLMs with negative binomial distributed data,
so I use the MASS library and the commands:
model_1 = glm.nb(response ~ y1 + y2 + ...+ yi, data = data.frame)
and
predict(model_1, newdata = data.frame)
So far, I think everything should be ok.
But when I want to perform a glm with a subset of the data,
I run into an error message as soon as I want to predict
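The excerpt breaks off here; a sketch (simulated data with invented column names site, year and response) of the usual work-around when a subset drops factor levels that later turn up in newdata:

library(MASS)
set.seed(4)
dat <- data.frame(site = factor(sample(c("A", "B", "C"), 200, replace = TRUE)),
                  year = factor(sample(2001:2004, 200, replace = TRUE)))
dat$response <- rnbinom(200, mu = 5, size = 1.2)

sub <- droplevels(subset(dat, site != "C"))     # the subset lacks site "C"
m   <- glm.nb(response ~ site + year, data = sub)

# predict() refuses levels the model has never seen, so restrict newdata to
# the levels used in the fit (or refit the model including them)
new <- subset(dat, site %in% levels(sub$site))
new$site <- factor(new$site, levels = levels(sub$site))
p <- predict(m, newdata = new, type = "response")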
2009 Nov 16
2
fitting a logistic regression with mixed type of variables
Hi,
I am trying to fit a logistic regression using glm, but my explanatory
variables are of mixed type: some are numeric, some are ordinal, some are
categorical, say
If x1 is numeric, x2 is ordinal, x3 is categorical, is the following formula
OK?
model <- glm(y ~ x1 + x2 + x3, family = binomial(link = "logit"), na.action = na.pass)

Thanks,
-Jack
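A sketch of encoding the mixed types before such a call (simulated data; the original variables are not shown): glm() handles all three, provided the ordinal variable is stored as an ordered factor and the categorical one as a plain factor.

set.seed(5)
d <- data.frame(x1 = rnorm(100),
                x2 = ordered(sample(c("low", "mid", "high"), 100, replace = TRUE),
                             levels = c("low", "mid", "high")),
                x3 = factor(sample(c("u", "v", "w"), 100, replace = TRUE)),
                y  = rbinom(100, 1, 0.5))

model <- glm(y ~ x1 + x2 + x3, data = d, family = binomial(link = "logit"))
# the ordered factor gets polynomial contrasts (x2.L, x2.Q), the plain factor
# gets treatment dummies; note that na.action = na.pass will break the fit if
# missing values are actually present, so the default na.omit is usually safer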
2000 Aug 01
1
Testing for parallel slopes
I'm running a series of simple bivariate linear regressions on grouped
data. I want to test the slopes to see if they are parallel. I normally
use analysis of covariance to do so, looking at interaction between the
covariate and the factor to make this determination.
VR3, pp. 149-154, has a very nice example of an ANCOVA, ending with a
discussion of this very operation.
My question has
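A sketch of the ANCOVA comparison described (simulated data, placeholder names): the F test of the x:group interaction is exactly the test for parallel slopes.

set.seed(6)
grp <- factor(rep(c("g1", "g2", "g3"), each = 20))
x   <- rnorm(60)
y   <- 2 + 1.5 * x + 0.3 * as.integer(grp) * x + rnorm(60)

fit_common   <- lm(y ~ x + grp)     # parallel slopes
fit_separate <- lm(y ~ x * grp)     # one slope per group
anova(fit_common, fit_separate)     # significant F => slopes are not parallel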
2003 Feb 14
5
Translating lm.object to SQL, C, etc function
This is my first post to this list so I suppose a quick intro is in
order. I've been using SPLUS 2000 and R1.6.2 for just a couple of days,
and love S already. I'm reading MASS and also John Fox's book - both have
been very useful. My background in stat software was mainly SPSS (which
I've never much liked - thank heavens I've found S!), and Perl is my
tool of choice for
2006 May 11
2
greco-latin square
Hi,
I am analyzing a repeated-measures Greco-Latin Square with the aov command.
I am using aov to calculate the MSs and then picking by hand the appropriate
numerator and denominator terms for the F tests.
The data are the following:
mapping.code  Subject.n  response: index  middle  ring  little
---------------------------------------------------------------
1             1
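A sketch of reading mean squares off the aov() strata and forming an F ratio by hand (the simulated layout below is an ordinary mixed design, not the actual Greco-Latin square):

set.seed(3)
dat <- expand.grid(subject = factor(1:8),
                   finger  = factor(c("index", "middle", "ring", "little")))
dat$mapping <- factor(ifelse(as.integer(dat$subject) <= 4, "A", "B"))
dat$rt <- rnorm(nrow(dat))

fit <- aov(rt ~ mapping * finger + Error(subject/finger), data = dat)
summary(fit)    # mean squares per error stratum; a by-hand test is then
                # F = MS(effect) / MS(residuals of the chosen stratum), with
                # pf(F, df_num, df_den, lower.tail = FALSE) giving the p value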
2008 Sep 26
1
Type I and Type III SS in anova
Hi all,
I have been trying to calculate Type III SS in R for an unbalanced two-way
ANOVA. However, the Type III SS are lower for the first factor compared to
Type I but higher for the second factor (see below). I have the impression
that Type III are always lower than Type I - is that right?
And a clarification about how to fit Type III SS. Fitting model<-aov(y~a*b)
in the base package and
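There is no general ordering between Type I and Type III sums of squares for an unbalanced design; only the last term in the sequential fit (here the interaction) has the same SS under both. A sketch of the comparison, using simulated unbalanced data and assuming the car package is available for its Anova() function:

set.seed(8)
dat <- data.frame(a = factor(sample(c("a1", "a2"), 40, replace = TRUE)),
                  b = factor(sample(c("b1", "b2", "b3"), 40, replace = TRUE)))
dat$y <- rnorm(40)

library(car)
old <- options(contrasts = c("contr.sum", "contr.poly"))  # needed for Type III
fit <- lm(y ~ a * b, data = dat)
anova(fit)                   # sequential (Type I) sums of squares
Anova(fit, type = "III")     # SAS-style Type III
options(old)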
2006 May 30
1
when dimensionality is larger than the number of observations?
Hi there,
Can anyone here kindly point to some good references or links on this topic?
Especially solutions from Bioconductor or R for dealing with
microarray-like, "fat" data?
thanks,
--
Weiwei Shi, Ph.D
"Did you always know?"
"No, I did not. But I believed..."
---Matrix III
2005 Nov 24
2
type III sums of squares in R
Hi everyone,
Can someone explain to me how to calculate SAS Type III sums of squares in
R? Not that I would like to use them - I know they are problematic. I
would like to know how to calculate them in order to demonstrate that
strange things happen when you use them (for a course for example). I
know you can use drop1(lm(), test="F") but for an lm(y~A+B+A:B), type
III SSQs are only
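The excerpt breaks off here; a sketch of the drop1() route it refers to (simulated unbalanced data). By default drop1() respects marginality and only considers dropping the interaction, so the scope . ~ . is needed to test the main effects as well, and sum-to-zero contrasts are needed for the resulting SS to match the SAS Type III values:

set.seed(9)
d <- data.frame(A = factor(sample(c("a1", "a2"), 30, replace = TRUE)),
                B = factor(sample(c("b1", "b2", "b3"), 30, replace = TRUE)))
d$y <- rnorm(30)

old <- options(contrasts = c("contr.sum", "contr.poly"))
fit <- lm(y ~ A * B, data = d)
drop1(fit, scope = . ~ ., test = "F")    # Type III style tests for A, B and A:B
options(old)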