Displaying 20 results from an estimated 31 matches for "exegeses".
2008 Sep 14
3
Nonlinear regression question
I was unable to open the file of Bill Venables' excellent "Exegeses on
Linear Models" posted at
http://www.stats.ox.ac.uk/pub/MASS3/Exegeses.ps.gz
I'd be very interested in reading it.
Thanks
Esther Meenken
Biometrician
Crop & Food Research
Private Bag 4704
Christchurch
TEL: (03) 325 9639
FAX: (03) 325 2074
EMAIL:MeenkenE at crop.cri.nz
Vis...
2000 Aug 12
1
Nonlinear regression question
Dear R users
I recently migrated from Statistica/SigmaPlot (Windows) to R (Linux), so
please excuse me if this sounds 'basic'.
When running a nonlinear regression (V = Vmax * conc / (Ks + conc), i.e.
Michaelis-Menten) on SigmaPlot, I get the output listed below:
>>>Begin SigmaPlot Output<<<
R = 0.94860969 Rsqr = 0.89986035 Adj Rsqr = 0.89458984
Standard Error of
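A minimal sketch of how the same Michaelis-Menten fit could be run in R with nls()
and the self-starting model SSmicmen(); the data below are simulated and the names
conc, V, Vmax and Ks are illustrative, not the poster's:
set.seed(1)
conc <- rep(c(0.02, 0.06, 0.11, 0.22, 0.56, 1.10), each = 2)
V <- 200 * conc / (0.05 + conc) + rnorm(length(conc), sd = 5)
fit <- nls(V ~ SSmicmen(conc, Vmax, Ks))
summary(fit)                                # estimates, standard errors, t and p values
1 - deviance(fit) / sum((V - mean(V))^2)    # an R-squared analogue, if one is wanted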
2008 May 25
1
marginality principle / selecting the right type of SS for an interaction hypothesis
Hello,
I have a problem with selecting the right type of sums of squares for
an ANCOVA for my specific experimental data and hypotheses. I do have
a basic understanding of the differences between Type-I, II, and III
SSs, have read about the principle of marginality, and read Venables'
"Exegeses on Linear Models"
(http://www.stats.ox.ac.uk/pub/MASS3/Exegeses.pdf). I am pretty new to R
and a search of the R-help archive did not
answer my question (although I found some good pointers).
In brief, leaving my covariates aside, I hypothesize that women (a)
generally perform lower than me...
2009 Dec 15
1
Type III sum of square in ANOVA
Dear all,
Does somebody have an idea of how to extract the Type III sum of squares from the "lm" or "aov" function in R? I could not figure it out.
If this is too minor or irrelevant to post to this list, I am sorry for that.
Thanks.
Sincerely,
Ram Kumar Basnet
Wageningen University,
Netherlands
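A hedged sketch of two common routes to type-III tests from an lm()/aov() fit; the
data are simulated and the factor names A and B are illustrative:
library(car)                                         # provides Anova()
set.seed(1)
d <- data.frame(A = factor(rep(c("a1", "a2"), each = 30)),
                B = factor(rep(c("b1", "b2", "b3"), 20)))
d$y <- rnorm(60) + (d$A == "a2") * (d$B == "b3")
options(contrasts = c("contr.sum", "contr.poly"))    # type III needs sum-to-zero contrasts
fit <- lm(y ~ A * B, data = d)
Anova(fit, type = "III")                             # type-III sums of squares and F tests
drop1(fit, scope = . ~ ., test = "F")                # base-R marginal F tests for every term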
2008 Jun 03
1
Model simplification using anova()
Hello all,
I've become confused by the output produced by a call to
anova(model1,model2). First a brief background. My model used to predict
final tree height is summarised here:
              Df Sum Sq Mean Sq F value    Pr(>F)
Treatment      2 748.35  374.17 21.3096 7.123e-06 ***
HeightInitial  1   0.31    0.31  0.0178   0.89519
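A hedged sketch of what anova(model1, model2) is testing here: an F test comparing
the full fit against a nested fit with a term removed. The data are simulated; only
the variable names Treatment and HeightInitial are taken from the post.
set.seed(1)
d <- data.frame(Treatment = factor(rep(c("T1", "T2", "T3"), each = 20)),
                HeightInitial = runif(60, 10, 50))
d$HeightFinal <- 5 + 2 * as.numeric(d$Treatment) + 0.1 * d$HeightInitial + rnorm(60)
m_full    <- lm(HeightFinal ~ Treatment + HeightInitial, data = d)
m_reduced <- lm(HeightFinal ~ Treatment, data = d)
anova(m_reduced, m_full)   # F test of whether HeightInitial improves the fit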
2007 May 15
3
aov problem
I am using R to run a two-way ANOVA on a number of variables using
g <- aov(var ~ fact1*fact2)
where var is a matrix containing the variables.
However, the outcome seems to depend on the order of fact1 and fact2:
fact2*fact1 gives a slightly different result (by a factor of 1.5).
Any ideas why this is?
Thanks for any help
Anders
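A hedged illustration of the behaviour described above, with simulated unbalanced
data: aov() reports sequential (type I) sums of squares, so the term order matters,
while car::Anova() type-II tests are order-invariant.
library(car)
set.seed(1)
d <- data.frame(fact1 = factor(sample(c("a", "b"), 50, replace = TRUE)),
                fact2 = factor(sample(c("x", "y"), 50, replace = TRUE)))
d$var <- rnorm(50) + (d$fact1 == "a") + 0.5 * (d$fact2 == "y")
summary(aov(var ~ fact1 * fact2, data = d))             # type I SS, fact1 entered first
summary(aov(var ~ fact2 * fact1, data = d))             # type I SS, fact2 first: different table
Anova(lm(var ~ fact1 * fact2, data = d), type = "II")   # the same result for either order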
2007 Oct 11
2
Type III sum of squares and appropriate contrasts
I am running a two-way anova with Type III sums of squares and would
like to be able to understand what the different SS mean when I use
different contrasts, e.g. treatment contrasts vs helmert contrasts. I
have read the approach in John Fox's "An R and S-Plus Companion to Applied
Regression" (p. 140), suggesting that treatment contrasts do not usually
result in meaningful results with Type
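A hedged sketch of how the contrast coding changes type-III tests for the same
(simulated, unbalanced) data; the factor names are illustrative:
library(car)
set.seed(2)
d <- data.frame(A = factor(sample(c("a1", "a2"), 60, replace = TRUE)),
                B = factor(sample(c("b1", "b2", "b3"), 60, replace = TRUE)))
d$y <- rnorm(60) + (d$A == "a2") * (d$B == "b2")
fit_trt <- lm(y ~ A * B, data = d,
              contrasts = list(A = "contr.treatment", B = "contr.treatment"))
fit_hel <- lm(y ~ A * B, data = d,
              contrasts = list(A = "contr.helmert", B = "contr.helmert"))
Anova(fit_trt, type = "III")   # main-effect tests depend on the arbitrary baseline level
Anova(fit_hel, type = "III")   # helmert (or sum-to-zero) contrasts give the usual type-III tests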
2017 Nov 29
0
How to extract coefficients from sequential (type 1), ANOVAs using lmer and lme
...type = 1 (The definition comes from SAS theory)
So lmerTest-anova by default gives you Type III ('marginal', although
Type II is what actually gives you tests that respect the Principle of
Marginality; see John Fox's Applied Regression Analysis (book) or
Venables' "Exegeses on Linear Models"
(https://www.stats.ox.ac.uk/pub/MASS3/Exegeses.pdf) for more information
on that. Type I tests are the sequential tests, so with anova(model,
type=1), you will get the sequential tests you want. lmerTest will
approximate the denominator degrees of freedom for you (using
Satte...
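A minimal sketch of the calls described above, using lmerTest and lme4's built-in
cake data purely for illustration:
library(lmerTest)     # its anova() method for lmer fits accepts a 'type' argument
m <- lmer(angle ~ recipe * temp + (1 | recipe:replicate), data = cake)
anova(m, type = 1)    # sequential (type I) tests with Satterthwaite denominator df
anova(m, type = 3)    # the package default: marginal (type III) tests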
2024 Aug 07
1
Manually calculating values from aov() result
Dear Brian,
As Duncan mentioned, the terms type-I, II, and III sums of squares
originated in SAS. The type-II and III SSs computed by the Anova()
function in the car package take a different computational approach than
in SAS, but in almost all cases produce the same results. (I slightly
regret using the "type-*" terminology for car::Anova() because of the
lack of exact
2004 Oct 04
3
(off topic) article on advantages/disadvantages of types of SS?
Hello. Please excuse this off-topic request, but I know that the
question has been debated in summary form on this list a number of
times. I would find a paper that lays out the advantages and
disadvantages of using different types of SS in the context of
unbalanced data in ANOVA, regression and ANCOVA, especially including
the use of different types of contrasts and the meaning of the
2017 Dec 01
0
How to extract coefficients from sequential (type 1), ANOVAs using lmer and lme
...ition comes from SAS theory)
>
>
> So lmerTest-anova by default gives you Type III ('marginal', although
> Type II is what actually gives you tests that respect the Principle of
> Marginality; see John Fox's Applied Regression Analysis (book) or
> Venables' "Exegeses on Linear Models"
> (https://www.stats.ox.ac.uk/pub/MASS3/Exegeses.pdf) for more information
> on that. Type I tests are the sequential tests, so with anova(model,
> type=1), you will get the sequential tests you want. lmerTest will
> approximate the denominator degrees of freedom...
2008 Jun 30
2
difference between MASS::polr() and Design::lrm()
Dear all,
It appears that MASS::polr() and Design::lrm() return the same point
estimates but different standard errors when fitting proportional odds models:
grade<-c(4,4,2,4,3,2,3,1,3,3,2,2,3,3,2,4,2,4,5,2,1,4,1,2,5,3,4,2,2,1)
score<-c(525,533,545,582,581,576,572,609,559,543,576,525,574,582,574,471,595,
557,557,584,599,517,649,584,463,591,488,563,553,549)
library(MASS)
library(Design)
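A hedged sketch of the comparison with the poster's data (Design has since been
superseded by rms, which provides the same lrm()):
grade <- c(4,4,2,4,3,2,3,1,3,3,2,2,3,3,2,4,2,4,5,2,1,4,1,2,5,3,4,2,2,1)
score <- c(525,533,545,582,581,576,572,609,559,543,576,525,574,582,574,471,595,
           557,557,584,599,517,649,584,463,591,488,563,553,549)
library(MASS)
library(rms)                                   # successor to the Design package
fit_polr <- polr(factor(grade) ~ score, Hess = TRUE)
summary(fit_polr)                              # coefficients, intercepts and their std. errors
fit_lrm <- lrm(grade ~ score)
fit_lrm                                        # printed fit includes an S.E. column to compare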
2005 Aug 19
1
Using lm coefficients in polyroot()
Dear useRs,
I need to compute the zeros of a polynomial function fitted by lm. For example,
if I fit a cubic equation by fit <- lm(y ~ x + I(x^2) + I(x^3)), I can do it simply
by polyroot(fit$coefficients). But if I fit a polynomial of higher order
and optimize it by stepAIC, some coefficients of course get removed.
Then, if I have the model
y ~ I(x^2) + I(x^4)
I cannot call polyroot in that way, because there is
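A hedged sketch (simulated data, illustrative names) of one way around this: rebuild
the full coefficient vector by name, filling the powers that stepAIC dropped with
zeros, so polyroot() still receives the coefficients in increasing-degree order.
set.seed(1)
x <- runif(50, -2, 2)
y <- 1 - 3 * x^2 + 0.5 * x^4 + rnorm(50, sd = 0.1)
fit <- lm(y ~ I(x^2) + I(x^4))                 # a fit in which some powers are absent
cf <- coef(fit)
full <- setNames(numeric(5),
                 c("(Intercept)", "x", "I(x^2)", "I(x^3)", "I(x^4)"))
full[names(cf)] <- cf                          # degrees 0..4, zeros for the dropped powers
polyroot(full)                                 # roots of the fitted polynomial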
2007 Jan 22
1
Compare effects between lm-models
Dear helpeRs,
I'm estimating a series of linear models (using lm) in which in every
new model variables are added. I want to test to what degree the new
variables can explain the effects of the variables already present in
the models. In order to do that, I simply observe whether these
effects decrease in strength and/or lose their significance.
My question is: does any of you know
2008 Nov 04
1
[OT] factorial design
Dear R Gurus:
I vaguely remember reading that if interaction was present in a
factorial design, then the main effect results were suspect.
However, I was reading a text which now uses the tests for main
effects even if interaction is present.
Which is correct, please?
Thanks,
Edna Bell
2011 Jan 21
0
Marginality rule between powers and interaction terms in lm()
...action or the
quadratic terms:
drop1(m)
...
       Df Sum of Sq      RSS      AIC
<none>              0.000033 -254.306
I(a^2)  1  0.000000 0.000033 -256.306
I(b^2)  1  0.098611 0.098644  -96.239
a:b     1  0.000032 0.000065 -242.674
However, this: http://www.stats.ox.ac.uk/pub/MASS3/Exegeses.pdf
suggests that marginality rules between powers of variables might not
be implemented (although they might have been since 2000).
My question is: am I "allowed", according to marginality rules, to remove a^2?
I have found plenty of information on how the coefficients
corresponding to...
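A hedged sketch with simulated data of the behaviour underlying the drop1() output
above: drop1() enforces marginality only at the level of formula terms (a and b are
protected because they appear in a:b), while I(a^2) and I(b^2) are treated as
unrelated terms and so are offered for dropping.
set.seed(1)
a <- runif(60); b <- runif(60)
y <- 1 + a + 2 * b + 3 * b^2 + a * b + rnorm(60, sd = 0.05)
m <- lm(y ~ a * b + I(a^2) + I(b^2))
drop1(m)   # lists I(a^2), I(b^2) and a:b, but not the main effects a and b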
2017 Sep 30
0
Converting SAS Code
And appropriately:
> library(fortunes)
> fortune()
SAS seems to be to statistical computing what Microsoft is to personal
computing.
   -- Bill Venables
      'Exegeses on Linear Models' paper (May 2000)
On Saturday, September 30, 2017, 4:57:23 PM EDT, Rolf Turner <r.turner at auckland.ac.nz> wrote:
On 01/10/17 01:22, Robert Baer wrote:
>
>
> On 9/29/2017 3:37 PM, Rolf Turner wrote:
>> On 30/09/17 07:45, JLucke at ria.buffalo....
2011 May 21
2
unbalanced anova with subsampling (Type III SS)
Hello R-users,
I am trying to obtain Type III SS for an ANOVA with subsampling. My design
is slightly unbalanced with either 3 or 4 subsamples per replicate.
The basic aov model would be:
fit <- aov(y~x+Error(subsample))
But this gives Type I SS and not Type III.
So, using drop1():
drop1(fit, test="F")
I get an error message:
"Error in
2008 May 27
1
lm() output with quantiative predictors not the same as SAS
I am trying to use R lm() with quantitative and qualitative predictors, but am
getting different results than those that I get in SAS.
In the R ANOVA table documentation I see that "Type-II tests corresponds to the
tests produced by SAS for analysis-of-variance models, where all of the
predictors are factors, but not more generally (i.e., when there are
quantitative predictors)." Is
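A hedged illustration of that caveat with simulated data and illustrative names:
with a factor-by-covariate interaction, car's type-II tests follow car's own
definition rather than SAS's.
library(car)
set.seed(1)
d <- data.frame(g = factor(rep(c("a", "b", "c"), length.out = 60)),
                x = rnorm(60))
d$y <- as.numeric(d$g) + 0.5 * d$x + rnorm(60)
fit <- lm(y ~ g * x, data = d)
Anova(fit, type = "II")    # type-II tests with a quantitative predictor in the model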
2010 Aug 31
1
anova and lm results differ
Dear all
I have found that the two "equivalent" commands do not produce the same results.
1. (I wrote this command by hand, this is what I would do usually)
>summary(aov(eduyrs ~ cntry * edf, data=ESS1))
        Df Sum Sq Mean Sq  F value    Pr(>F)
cntry    1    257  256.65  21.2251 4.243e-06 ***
edf      4  11010 2752.42 227.6296 <