Displaying 20 results from an estimated 30000 matches similar to: "test for difference in variance"
2007 May 29
2
hierarchical cluster analysis of groups of vectors
I want to do hierarchical cluster analysis to compare 10 groups of
vectors, with five vectors in each group (i.e. I want to make a dendrogram
showing the clustering of the different groups). I've looked into using
dist and hclust, but cannot see how to compare the different groups
instead of the individual vectors. I would be thankful for any help.
Anders
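One way to cluster the groups rather than the individual vectors is to summarise each group (e.g. by its mean vector) and cluster those summaries. A minimal sketch, assuming the 50 vectors are the rows of a matrix m and grp is a factor with 10 levels (both names hypothetical):
cent <- aggregate(m, by = list(group = grp), FUN = mean)   # one mean vector per group
d <- dist(cent[, -1])                                      # distances between group means
plot(hclust(d, method = "average"), labels = cent$group)   # dendrogram of the 10 groups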
2007 May 29
1
Fw: hierarchical cluster analysis of groups of vectors
Hi Rafael,
What about multivariate logistic regression?
----- Forwarded Message ----
From: Rafael Duarte <rduarte@ipimar.pt>
To: Anders Malmendal <anders@chem.au.dk>
Cc: r-help@stat.math.ethz.ch
Sent: Tuesday, May 29, 2007 3:21:11 PM
Subject: Re: [R] hierarchical cluster analysis of groups of vectors
It seems that you already have groups defined.
Discriminant analysis would probably
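For reference, a hedged sketch of the discriminant-analysis route, reusing the hypothetical matrix m of vectors and group factor grp from above:
library(MASS)
fit <- lda(m, grouping = grp)     # linear discriminant analysis on the known groups
table(grp, predict(fit)$class)    # how cleanly the groups separate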
2012 Mar 09
0
pdMat class in LME to mimic SAS proc mixed group option? Group-specific random slopes
I would like to be able to use lme to fit random-effects models in which some but not all of the random effects are constrained to be independent. It seems as though the pdMat options in lme are a promising avenue. However, none of the existing pdMat classes seems to allow what I want.
As a specific example, I would like to fit a random intercept/slope mixed model to longitudinal observations in
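One avenue often suggested for this (a sketch only; y, time, grp, id and dat are hypothetical names) is to combine pdMat blocks with pdBlocked, so that the effects within one block keep a general covariance while those in another are forced to be independent; this only illustrates the mechanics, not the exact SAS group= behaviour:
library(nlme)
fit <- lme(y ~ time, data = dat,
           random = list(id = pdBlocked(list(
             pdSymm(~ time),          # correlated random intercept and slope
             pdDiag(~ grp - 1)))))    # extra effects constrained to be independent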
2007 May 15
3
aov problem
I am using R to run two-way ANOVAs on a number of variables using
g <- aov(var ~ fact1*fact2)
where var is a matrix containing the variables.
However, the outcome seems to depend on the order of fact1 and fact2:
fact2*fact1 gives a slightly different result (differing by a factor of about 1.5).
Any ideas why this is?
Thanks for any help
Anders
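The usual explanation is that summary.aov reports sequential (Type I) sums of squares, which depend on term order whenever the design is unbalanced. A sketch, shown for a single response column y, comparing the two orders and one order-independent alternative:
summary(aov(y ~ fact1 * fact2))              # sequential (Type I) SS
summary(aov(y ~ fact2 * fact1))              # differs when the design is unbalanced
library(car)
Anova(lm(y ~ fact1 * fact2), type = "II")    # order-independent Type II tests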
2012 Jun 03
0
multiple variance structure in lmer giving zero variances
Hi all,
I’m hoping someone might be able to help me out. Forgive me if my mistake
is something simple. I am new to mixed models, new to R, and new to lme4
and am struggling to figure everything out. I have two questions that I am
hoping someone can answer.
1) Am I using the correct random structure for my model?
2) Can someone help me figure out what is wrong with my syntax to code for
random
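For reference, a minimal sketch of the two common ways to specify random slopes in lme4 (y, x, subject and dat are hypothetical names; the double-bar form needs a reasonably recent lme4):
library(lme4)
m1 <- lmer(y ~ x + (x | subject), data = dat)    # correlated random intercept and slope
m2 <- lmer(y ~ x + (x || subject), data = dat)   # intercept and slope forced independent
VarCorr(m1)                                      # inspect the estimated variance components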
2006 Nov 09
2
Meta-regression with lmer()? If so, how?
Dear List,
I am (again) looking at meta-regression as a way to refine meta-analytic
results. What I want to do is to assess the impact of some fixed factors
on the results of a meta-analysis. Some of them may be crossed with the
main factor of the meta-analysis (e.g. clinical presentation of a
disease, defining subgroups in each of the studies under analysis), some
of them may be a grouping
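A rough sketch of the kind of lmer call sometimes discussed for this (yi = effect size, vi = known sampling variance, mod = moderator, study = study identifier; all names hypothetical). Note that lmer still estimates a residual variance, so this is only an approximation to a classical meta-regression; dedicated tools such as metafor's rma() are an alternative:
library(lme4)
fit <- lmer(yi ~ mod + (1 | study), weights = 1 / vi, data = dat)
summary(fit)   # fixed effect of the moderator plus between-study variance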
2004 Dec 19
1
Homogeneity of variance tests between more than 2 samples (long)
Dear all
a couple of months ago I found threads regarding tests that verify the ANOVA
assumption of homogeneity of variances. Prof. Ripley advised LDA/QDA
procedures; many books (and many proprietary programs) advise Hartley's F_max,
Cochran's minimum/maximum variance ratio (balanced experiments only),
Bartlett's K^2 test, and Levene's test.
Morton B. Brown and Alan B. Forsythe in a
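Several of the tests mentioned are one-liners in R (a sketch, assuming a response y and a grouping factor g):
bartlett.test(y ~ g)     # Bartlett's K^2, sensitive to non-normality
fligner.test(y ~ g)      # Fligner-Killeen, robust to non-normality
library(car)
leveneTest(y ~ g)        # Levene / Brown-Forsythe (median-centred by default)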
2010 Oct 04
1
Fixed variance structure for lme
I have a data set with 50 different x values and 5 values for the sampling
variance; each of the 5 sampling variances corresponds to 10 particular x
values. I am trying to fit a linear mixed-effects model and I'm not sure
about the syntax for specifying the fixed variance structure. In Pinheiro's
book my situation appears to be similar to the example used for varIdent,
where there is a
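A sketch along the lines of the varIdent example (hypothetical names: svar is a 5-level factor marking which sampling-variance stratum each x belongs to, grp is the grouping factor); if the variances are known exactly, varFixed is the structure to look at instead:
library(nlme)
fit <- lme(y ~ x, random = ~ 1 | grp, data = dat,
           weights = varIdent(form = ~ 1 | svar))   # a separate variance per stratum
# or, with a column vi of known sampling variances:
# fit <- lme(y ~ x, random = ~ 1 | grp, data = dat, weights = varFixed(~ vi))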
2003 Sep 23
1
what does the sum of squares of Gaussian RVs with different variances obey?
From basic statistical principles we know that, given several i.i.d. Gaussian RVs with zero or nonzero mean, the sum of their squares is a central or noncentral chi-squared RV. However, if these Gaussian RVs have different variances, what distribution does the sum of their squares obey?
Thanks in advance.
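With unequal variances the sum of squares is a weighted sum of independent chi-squared(1) variables (a "generalized chi-squared"), which has no simple closed form in general; its mean and variance are easy to check by simulation, e.g.:
sig <- c(1, 2, 3)                                        # example standard deviations
Q <- replicate(1e5, sum(rnorm(length(sig), 0, sig)^2))   # zero-mean case
c(mean(Q), sum(sig^2))        # E[Q]   = sum of the variances
c(var(Q), 2 * sum(sig^4))     # Var[Q] = 2 * sum of sigma_i^4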
2010 Sep 18
1
modeling variance heterogeneity in lme4
Hi all,
I have major heterogeneity in variances across labs (100-fold). There is no
apparent variance heterogeneity across y-hat. By using lme4 in the following
way, am I accounting for the variance differences across labs?
lmer(y ~ fixed1 + covariates + (fixed1|labs))
I'm not sure that it does; I think it only allows the means (slopes
[conditional means] and intercepts) to differ
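That random-slope formula lets the lab-specific intercepts and slopes differ, but lme4 still fits a single residual variance. To let the residual variance itself differ by lab, one option (a sketch reusing the same hypothetical names) is nlme with a varIdent weight:
library(nlme)
fit <- lme(y ~ fixed1 + covariates, data = dat,
           random = ~ fixed1 | labs,
           weights = varIdent(form = ~ 1 | labs))   # a separate residual variance per lab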
2004 Sep 21
1
lme RE variance computation
As I understand it lme (in R v1.9.x) estimates random effect variances
on a log scale, constraining them to be positive. Whilst this seems
sensible, it does lead to apparently biased estimates if the variance is
actually zero - which makes our simulation results look strange. Whilst
we need to think a bit deeper about it - I still haven't got my head
around what a negative variance could
2007 Aug 06
1
variance
Hello,
I want to calculate some variances for the Bartlett test and I used the var() function, but I don't think it gives a good estimate of the variance, and I don't understand why.
In fact, when I calculate the variances by hand I don't get the same results.
Do you know any details about this?
Thanks.
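A likely explanation is the denominator: var() divides by n - 1 (the unbiased estimator), while a by-hand calculation dividing by n gives a value smaller by the factor (n - 1)/n:
x <- c(2, 4, 4, 4, 5, 5, 7, 9)
var(x)                               # sum of squared deviations / (n - 1)
sum((x - mean(x))^2) / length(x)     # ... / n, the "population" version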
2012 Nov 27
0
Variance component estimation in glmmPQL
Hi all,
I've been attempting to fit a logistic glmm using glmmPQL in order to
estimate variance components for a score test, where the model is of the
form logit(mu) = X*a+ Z1*b1 + Z2*b2. Z1 and Z2 are actually reduced rank
square root matrices of the assumed covariance structure (up to a constant)
of random effects c1 and c2, respectively, such that b1 ~ N(0,sig.1^2*I) and
c1 ~
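For reference, the basic shape of a logistic glmmPQL call (a generic sketch only, with hypothetical names; it does not reproduce the reduced-rank Z construction described above):
library(MASS)
library(nlme)                                   # glmmPQL builds on nlme
fit <- glmmPQL(y ~ x, random = ~ 1 | cluster,
               family = binomial, data = dat)
VarCorr(fit)                                    # variance components from the PQL fit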
2006 Mar 28
2
Welch test for equality of variance
Hello
Using R 2.2.1 on a Windows machine.
Has anyone programmed the Welch test for equality of variances?
I tried RSiteSearch, but this gave references to t test and
oneway.test, which are not quite what I need... I need the Welch test
itself, for use in a meta-analysis (to determine if variances are
equal).
TIA
Peter
Peter L. Flom, PhD
Assistant Director, Statistics and Data Analysis
2012 Jun 04
0
Negative variance with lavaan in a multigroup analysis.
Hi list members,
I saw a couple of lavaan posts here, so I think I'm sending this to the
correct list.
I am trying to run a multigroup analysis with lavaan in order to
compare behavioural correlations across two populations. I'm following
the method suggested in the paper by Dingemanse et al. (2010) in
Behavioural Ecology.
In one of the groups, lavaan returns negative variance for one path
and I'm
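For context, the general shape of a two-group lavaan fit (model string and variable names are hypothetical); the negative variance (a Heywood case) shows up in the per-group estimates of such a fit:
library(lavaan)
model <- ' boldness =~ b1 + b2 + b3 '               # hypothetical measurement model
fit <- cfa(model, data = dat, group = "population")
summary(fit, standardized = TRUE)                   # per-group estimates, incl. variances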
2009 Jan 11
3
summary with variance / sd
Hi,
I have a data frame and would like to have summary statistics for
grouped data.
With summary() I get the central tendencies for the overall data.
How can I get descriptive statistics with variances and standard
deviations?
for example my data.frame:
group x y
exp 2 4
exp 3 5
exp 2 4
control 1 2
control 2 3
control 1 2
now I want tables with summary statistics (variances
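One way (a sketch, with df as the hypothetical name of the data frame) is aggregate() with a formula, which applies the chosen statistic per group:
aggregate(cbind(x, y) ~ group, data = df, FUN = var)   # per-group variances
aggregate(cbind(x, y) ~ group, data = df, FUN = sd)    # per-group standard deviations
# or, for a single column: tapply(df$x, df$group, var)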
2010 Jul 02
2
K-means result - variance between cluster
Hi,
I would like to present the results of the k-means clustering method in
terms of variances within and between clusters. The kmeans object
gives only the within-cluster sum of squares by cluster, so the between-cluster
variance is missing for calculating the following table, which I am
trying to get.
Number of | Variance within | Var between | Var total | F-value
Cluster k | cluster | cluster
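In recent R versions the kmeans object already carries the pieces needed for such a table (a sketch, assuming the data are in a numeric matrix dat):
km <- kmeans(dat, centers = 4)
km$withinss       # within-cluster sum of squares, per cluster
km$tot.withinss   # their total
km$betweenss      # between-cluster sum of squares
km$totss          # total SS; totss = tot.withinss + betweenss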
2007 Jan 03
1
mcmcsamp and variance ratios
Hi folks,
I have assumed that ratios of variance components (Fst and Qst in
population genetics) could be estimated using the output of mcmcsamp
(the series of MCMC sample estimates of the variance components).
What I have started to do is to use the matrix output that included
the log(variances), exponentiate, calculate the relevant ratio, and
apply either quantile or HPDinterval to get
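A sketch of that workflow (column names are hypothetical), assuming samp is a matrix of posterior draws containing the log-variances of the two components of interest:
va <- exp(samp[, "log.var.among"])       # among-population variance draws
vw <- exp(samp[, "log.var.within"])      # within-population variance draws
qst <- va / (va + 2 * vw)                # the ratio of interest, e.g. Qst
quantile(qst, c(0.025, 0.975))           # equal-tail interval
library(coda)
HPDinterval(as.mcmc(qst))                # highest posterior density interval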
2002 Jul 01
1
Defining own variance function / quasi-likelihood in a GLM
Hello,
I've been looking in the on-line manuals and searching past posts but
can't find an answer to this question.
I'd like to define my own variance function in a GLM.
The function glm(formula, family=quasi(var="var function"))
lets me choose from a selection of built-in variances, but I want to
define my own function for the variance.
Is there an S-plus
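quasi() only accepts a fixed set of variance strings; a hedged sketch of one possible route to a truly custom variance is to take the family object quasi() returns and replace its components, keeping $variance and $dev.resids mutually consistent (that consistency is the delicate part):
fam <- quasi(link = "log", variance = "mu^2")   # start from a built-in quasi family
fam$variance     # function(mu) mu^2 -- the component to replace with a custom V(mu)
fam$dev.resids   # must then be replaced by the matching quasi-deviance contribution
# fit <- glm(y ~ x, family = fam, data = dat)   # pass the modified family to glm()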
2005 Jan 04
0
boot and variances of the bootstrap replicates of the variable of interest?
I want to use boot.ci to generate confidence intervals over the
bootstrapped mean(s) of a group of observations (i.e. I have 10
observations and I want to know how confident I can be in the value for
the mean).
I don't know (or want to know) the details of bootstrapping - I just have
the simplistic idea of taking samples, measuring a statistic on the
sample, and getting some confidence in the
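A minimal sketch of that idea with the boot package (assuming the 10 observations are in a vector obs):
library(boot)
bmean <- function(d, i) mean(d[i])             # statistic: mean of the resampled values
b <- boot(obs, statistic = bmean, R = 2000)    # 2000 bootstrap replicates
boot.ci(b, type = c("perc", "bca"))            # percentile and BCa intervals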