Displaying 20 results from an estimated 900 matches similar to: "Nonlinear regression question"
2008 May 25
1
marginality principle / selecting the right type of SS for an interaction hypothesis
Hello,
I have a problem with selecting the right type of sums of squares for
an ANCOVA for my specific experimental data and hypotheses. I do have
a basic understanding of the differences between Type-I, II, and III
SSs, have read about the principle of marginality, and read Venables'
"Exegeses on Linear Models"
(http://www.stats.ox.ac.uk/pub/MASS3/Exegeses.pdf). I am pretty new to
2009 Dec 15
1
Type III sum of square in ANOVA
Dear all,
Does anybody have an idea how to extract the Type III sum of squares from the "lm" or "aov" function in R? I could not figure it out.
If this is too minor or irrelevant to post to this list, I apologize.
Thanks.
Sincerely,
Ram Kumar Basnet
Wageningen University,
Netherlands
[[alternative HTML version deleted]]
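A minimal sketch of the two usual answers to this question, shown on the built-in warpbreaks data (which is balanced, so it only illustrates the mechanics; the distinction matters for unbalanced designs):

```r
# Hedged sketch: two ways to get Type III-style tests from an lm/aov fit.
# Type III SS are only meaningful with sum-to-zero contrasts.
options(contrasts = c("contr.sum", "contr.poly"))

fit <- lm(breaks ~ wool * tension, data = warpbreaks)

# Base R: test each term (including main effects) against the full model
drop1(fit, scope = . ~ ., test = "F")

# Or, with the car package installed:
# library(car)
# Anova(fit, type = 3)
```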
2008 Jun 03
1
Model simplification using anova()
Hello all,
I've become confused by the output produced by a call to
anova(model1,model2). First a brief background. My model used to predict
final tree height is summarised here:
Df Sum Sq Mean Sq F value Pr(>F)
Treatment 2 748.35 374.17 21.3096 7.123e-06 ***
HeightInitial 1 0.31 0.31 0.0178 0.89519
2007 May 15
3
aov problem
I am using R to make two-way ANOVA on a number of variables using
g <- aov(var ~ fact1*fact2)
where var is a matrix containing the variables.
However, the outcome seems to depend on the order of fact1 and fact2:
reversing the order (i.e. fact2*fact1) gives a slightly different result
(off by a factor of about 1.5). Any ideas why this is?
Thanks for any help
Anders
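This behaviour is expected when the design is unbalanced: aov() reports sequential (Type I) sums of squares, so each factor is adjusted only for the terms listed before it. A small illustrative sketch (the data are made up, not Anders's):

```r
# Type I (sequential) SS depend on term order when cell counts are unequal.
set.seed(1)
f1 <- factor(rep(c("a", "b"), times = c(7, 9)))
f2 <- factor(c(rep("x", 3), rep("y", 4), rep("x", 4), rep("y", 5)))
y  <- rnorm(16) + as.numeric(f1)

summary(aov(y ~ f1 * f2))  # SS for f1 fitted first
summary(aov(y ~ f2 * f1))  # SS for f1 now adjusted for f2 -- differs
```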
2007 Oct 11
2
Type III sum of squares and appropriate contrasts
I am running a two-way anova with Type III sums of squares and would
like to be able to understand what the different SS mean when I use
different contrasts, e.g. treatment contrasts vs helmert contrasts. I
have read John Fox's "An R and S-Plus Companion to Applied Regression"
(p. 140), which suggests that treatment contrasts do not usually
yield meaningful results with Type
2000 Aug 12
1
Nonlinear regression question
Dear R users
I recently migrated from Statistica/SigmaPlot (Windows) to R (Linux), so
please excuse if this may sound 'basic'.
When running a nonlinear regression (V = Vmax * conc / (Ks + conc), i.e.
Michaelis-Menten) on SigmaPlot, I get the output listed below:
>>>Begin SigmaPlot Output<<<
R = 0.94860969 Rsqr = 0.89986035 Adj Rsqr = 0.89458984
Standard Error of
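The SigmaPlot fit above maps directly onto nls() in R. A hedged sketch with illustrative data and start values (not the original poster's):

```r
# Michaelis-Menten fit: V = Vmax * conc / (Ks + conc)
conc <- c(0.02, 0.06, 0.11, 0.22, 0.56, 1.10)
V    <- c(76, 97, 123, 159, 191, 207)

fit <- nls(V ~ Vmax * conc / (Ks + conc),
           start = list(Vmax = 200, Ks = 0.1))
summary(fit)   # estimates, standard errors, residual standard error
```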
2004 Oct 04
3
(off topic) article on advantages/disadvantages of types of SS?
Hello. Please excuse this off-topic request, but I know that the
question has been debated in summary form on this list a number of
times. I would find a paper that lays out the advantages and
disadvantages of using different types of SS in the context of
unbalanced data in ANOVA, regression and ANCOVA, especially including
the use of different types of contrasts and the meaning of the
2005 Aug 19
1
Using lm coefficients in polyroot()
Dear useRs,
I need to compute the zeros of a polynomial function fitted by lm. For example,
if I fit a cubic by fit = lm(y ~ x + I(x^2) + I(x^3)), I can do it simply
with polyroot(fit$coefficients). But if I fit a polynomial of higher order
and optimize it with stepAIC, some coefficients are of course removed.
Then, if I have the model
y ~ I(x^2) + I(x^4)
I cannot call polyroot that way, because there is
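One common fix is to rebuild a full coefficient vector, inserting zeros for the powers stepAIC dropped, so polyroot() receives coefficients in order x^0, x^1, ..., x^n. A self-contained sketch with simulated data:

```r
# Simulated data whose true model has only x^0, x^2 and x^4 terms.
set.seed(1)
d <- data.frame(x = seq(-2, 2, by = 0.1))
d$y <- 1 - d$x^2 + 0.25 * d$x^4 + rnorm(nrow(d), sd = 0.05)

fit <- lm(y ~ I(x^2) + I(x^4), data = d)

# Fill a degree-4 coefficient vector, zeros for the missing powers.
coefs <- numeric(5)                       # positions: x^0 .. x^4
coefs[1] <- coef(fit)["(Intercept)"]
coefs[3] <- coef(fit)["I(x^2)"]
coefs[5] <- coef(fit)["I(x^4)"]
polyroot(coefs)                           # roots of the fitted polynomial
```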
2007 Jan 22
1
Compare effects between lm-models
Dear helpeRs,
I'm estimating a series of linear models (using lm) in which in every
new model variables are added. I want to test to what degree the new
variables can explain the effects of the variables already present in
the models. In order to do that, I simply observe wether these
effects decrease in strength and / or lose their significance.
My question is: does any of you know
2008 Nov 04
1
[OT] factorial design
Dear R Gurus:
I vaguely remember reading that if interaction was present in a
factorial design, then the main effect results were suspect.
However, I was reading a text which now uses the tests for main
effects even if interaction is present.
Which is correct, please?
Thanks,
Edna Bell
2011 May 21
2
unbalanced anova with subsampling (Type III SS)
Hello R-users,
I am trying to obtain Type III SS for an ANOVA with subsampling. My design
is slightly unbalanced with either 3 or 4 subsamples per replicate.
The basic aov model would be:
fit <- aov(y~x+Error(subsample))
But this gives Type I SS, not Type III.
When I instead try drop1():
drop1(fit, test="F")
I get an error message:
"Error in
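drop1() does not handle multistratum (Error()) aov fits, which is presumably the source of the error. One workaround often suggested on this list is to refit as a mixed model with nlme and request marginal tests. A hedged sketch on made-up data (the variable names y, x, subsample mirror the post; the values are invented):

```r
library(nlme)   # ships with R

# Toy unbalanced design: 3 treatment levels, 3-4 subsamples per replicate.
set.seed(1)
mydata <- data.frame(
  x         = factor(rep(c("A", "B", "C"), each = 7)),
  subsample = factor(rep(1:6, times = c(4, 3, 4, 3, 4, 3)))
)
mydata$y <- rnorm(21) + as.numeric(mydata$x)

fit <- lme(y ~ x, random = ~ 1 | subsample, data = mydata)
anova(fit, type = "marginal")   # marginal (Type III-style) F tests
```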
2008 May 27
1
lm() output with quantiative predictors not the same as SAS
I am trying to use R lm() with quantitative and qualitative predictors, but am
getting different results than those that I get in SAS.
In the R ANOVA table documentation I see that "Type-II tests corresponds to the
tests produced by SAS for analysis-of-variance models, where all of the
predictors are factors, but not more generally (i.e., when there are
quantitative predictors)." Is
2010 Aug 31
1
anova and lm results differ
Dear all
I have found that the two "equivalent" commands do not produce the same results.
1. (I wrote this command by hand, this is what I would do usually)
>summary(aov(eduyrs ~ cntry * edf, data=ESS1))
Df Sum Sq Mean Sq F value Pr(>F)
cntry 1 257 256.65 21.2251 4.243e-06 ***
edf 4 11010 2752.42 227.6296 <
2008 Jun 30
2
difference between MASS::polr() and Design::lrm()
Dear all,
It appears that MASS::polr() and Design::lrm() return the same point
estimates but different st.errs when fitting proportional odds models,
grade<-c(4,4,2,4,3,2,3,1,3,3,2,2,3,3,2,4,2,4,5,2,1,4,1,2,5,3,4,2,2,1)
score<-c(525,533,545,582,581,576,572,609,559,543,576,525,574,582,574,471,595,
557,557,584,599,517,649,584,463,591,488,563,553,549)
library(MASS)
library(Design)
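For reference, the MASS::polr() side of the comparison can be reproduced from the data in the post; note that the response must be an ordered factor and that Hess = TRUE is needed for summary() to report standard errors:

```r
library(MASS)

grade <- c(4,4,2,4,3,2,3,1,3,3,2,2,3,3,2,4,2,4,5,2,1,4,1,2,5,3,4,2,2,1)
score <- c(525,533,545,582,581,576,572,609,559,543,576,525,574,582,574,471,595,
           557,557,584,599,517,649,584,463,591,488,563,553,549)

# Proportional odds model; ordered() makes grade an ordinal response.
fit <- polr(ordered(grade) ~ score, Hess = TRUE)
summary(fit)   # point estimates and standard errors
```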
2002 Mar 01
4
Type III Sum of Squares
Hi,
When doing a two-ways anova in R and comparing my same results with an SPSS
output, I noticed that R calculated type I Sum of Squares. Is it possible to
use Type III Sum of Squares?
Thanks,
Sébastien Plante
Institut des Sciences de la Mer de Rimouski (ISMER)
2017 Nov 29
0
How to extract coefficients from sequential (type 1), ANOVAs using lmer and lme
(This time with the r-help in the recipients...)
Be careful when mixing lme4 and lmerTest together -- lmerTest extends
and changes the behavior of various lme4 functions.
From the help page for lme4-anova (?lme4::anova.merMod)
> ?anova?: returns the sequential decomposition of the contributions
> of fixed-effects terms or, for multiple arguments, model
>
2011 Apr 28
2
Re-downloading mails after migrate server
I think a good first step is updating the new server to the latest Dovecot version.
I migrated about 1.5M mailboxes from Dovecot 1.0.x to 2.0.x with no
problems (and we have a lot of users with the "leave copy on server" option
checked).
http://wiki2.dovecot.org/Migration
On Thu, Apr 28, 2011 at 9:41 AM, Pete Conkin <pete at reach.net> wrote:
> ----- Original Message ----- From:
2017 Dec 01
0
How to extract coefficients from sequential (type 1), ANOVAs using lmer and lme
Please reread my point #1: the tests of the (individual) coefficients in
the model summary are not the same as the ANOVA tests. There is a
certain correspondence between the two (i.e. between the coding of your
categorical variables and the type of sum of squares; and for a model
with a single predictor, F=t^2), but they are not the same in general.
The t-test in the model coefficients is simply
2011 Jan 21
0
Marginality rule between powers and interaction terms in lm()
Dear all,
I have a model with simple terms, quadratic effects, and interactions.
I am wondering what to do when a variable is involved in a significant
interaction and in a non-significant quadratic effect. Here is an
example
d = data.frame(a=runif(20), b=runif(20))
d$y = d$a + d$b^2
So I create both a simple effect of a and a quadratic effect of b.
m = lm(y ~ a + b + I(a^2) + I(b^2) +
2005 Apr 20
6
Anova - adjusted or sequential sums of squares?
Hi
I am performing an analysis of variance with two factors, each with two
levels. I have differing numbers of observations in each of the four
combinations, but all four combinations *are* present (2 of the factor
combinations have 3 observations, 1 has 4 and 1 has 5)
I have used both anova(aov(...)) and anova(lm(...)) in R and it gave the
same result - as expected. I then plugged this into