Displaying 20 results from an estimated 24 matches for "e_y".
2002 Mar 29
1
help with lme function
Hi all,
I am having some difficulties with the lme function, so here is my problem.
Suppose I have the following model:
y_(ijk) = beta_j + e_i + epsilon_(ijk)
where the beta_j are fixed effects, e_i is a random effect and
epsilon_(ijk) is the error.
If I want to estimate such a model, I run
> lme(y ~ vec.J, random = ~ 1 | vec.I)
where y is the vector of my data, vec.J is a factor object
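A minimal sketch of that fit, assuming a data frame dat holding y, the fixed-effect factor vec.J and the grouping factor vec.I (names taken from the snippet):
library(nlme)
fit <- lme(y ~ vec.J,             # fixed effects beta_j
           random = ~ 1 | vec.I,  # random intercept e_i per level of vec.I
           data = dat)
summary(fit)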
2003 Oct 23
1
Variance-covariance matrix for beta hat and b hat from lme
Dear all,
Given an LME model (following the notation of Pinheiro and Bates 2000) y_i
= X_i*beta + Z_i*b_i + e_i, is it possible to extract the
variance-covariance matrix of the estimated beta hat and b_i hat from the
lme fitted object?
The reason for needing this is that I want prediction intervals for
the predicted values (at level = 0:1). The "predict.lme" seems to
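For an nlme::lme fit, part of this is available through standard extractors; a hedged sketch (the joint covariance of beta hat and the b_i hat is not exposed directly):
library(nlme)
vcov(fit)                                # approximate var-cov of the fixed effects beta hat
getVarCov(fit, type = "random.effects")  # estimated var-cov of the random effects b_i
getVarCov(fit, type = "conditional", individuals = 1)  # residual var-cov for one group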
2010 Feb 05
3
metafor package: effect sizes are not fully independent
In a classical meta-analysis model y_i = X_i * beta + e_i, the data
{y_i} are assumed to be independent effect sizes. However, I'm
encountering the following two scenarios:
(1) Each source has multiple effect sizes, so the {y_i} are not fully
independent of each other.
(2) Each source has multiple effect sizes, and each effect size
from a source can be categorized into one of several factor levels
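A hedged sketch of how such dependence is often handled in metafor with rma.mv(); the data frame dat and the columns yi, vi, source, es_id and level are illustrative names, not from the original post:
library(metafor)
res1 <- rma.mv(yi, vi, random = ~ 1 | source/es_id, data = dat)   # scenario (1)
res2 <- rma.mv(yi, vi, mods = ~ level,
               random = ~ 1 | source/es_id, data = dat)           # scenario (2)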
2006 Nov 21
3
Fitting mixed-effects models with lme with fixed error term variances
Dear R users,
I am writing to you because I have a few questions on how to fix
the error term variances in lme, in the hope that you could help me. To
my knowledge, the closest possibility is to fix the var-cov structure,
but not the whole var-cov matrix. I found an old thread (a few years
ago) about this, and it seems that the only alternative is to write the
likelihood down and use optim or a
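A hedged sketch of the var-cov-structure route mentioned above: varFixed() fixes the residual variances only up to an estimated scale sigma^2; dat, v and the model terms are illustrative:
library(nlme)
fit <- lme(y ~ x, random = ~ 1 | grp, data = dat,
           weights = varFixed(~ v))   # Var(e) proportional to the known v
## more recent nlme versions can also fix the residual scale itself:
## fit2 <- update(fit, control = lmeControl(sigma = 1))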
2003 Mar 29
1
Goodness of fit tests
I have a dataset which I want to model using a Poisson distribution with a given parameter. I would like to know the proper way to do a "goodness of fit" test using R.
I know the steps I'd take if I were to do it "manually": group the numbers into classes, calculate the expected frequencies using 'ppois', then
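A minimal sketch of those steps for a given lambda; the classes and the rpois() stand-in data are illustrative only:
lambda <- 2.5
x <- rpois(200, lambda)                                # stand-in for the real data
obs <- table(cut(x, breaks = c(-0.5, 0.5:6.5, Inf)))   # classes 0, 1, ..., 6 and ">= 7"
p <- c(dpois(0:6, lambda), ppois(6, lambda, lower.tail = FALSE))
chisq.test(as.vector(obs), p = p)   # df need no adjustment because lambda is given, not estimated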
2006 Feb 10
1
Lmer with weights
Hello!
I would like to use lmer() to fit data which are some estimates and
their standard errors, i.e. a kind of "meta" analysis. I wonder whether the
weights argument is the right one to use to include the uncertainty (standard
errors) of the "data" in the model. I would like to use lmer(), since I
would like to have some "freedom" in modeling, if this is at all possible.
For
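A hedged sketch of that idea; note that lmer's weights are prior weights that scale a residual variance which is still estimated, so 1/se^2 fixes the variances only up to a multiplier. The columns est, se and study in dat are illustrative:
library(lme4)
fit <- lmer(est ~ 1 + (1 | study), data = dat, weights = 1 / se^2)
summary(fit)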
2004 Apr 09
1
loess' robustness weights in loess
Hi!
I want to change the "robustness weights" used by loess. These
are described on page 316 of Chambers and Hastie's "Statistical Models in S"
book as
r_i = B(e_i, 6m)
where B is Tukey's biweight function, the e_i are the residuals, and m is the
median absolute distance of the residuals from 0. I want to
change 6m to, say, 3m.
Is there a way to do this? I can't
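loess() itself offers no argument for changing the 6m constant, but one manual robustness iteration with a 3m cutoff can be approximated through its weights argument; a hedged sketch, with dat illustrative:
biweight <- function(u, k) ifelse(abs(u) < k, (1 - (u / k)^2)^2, 0)  # Tukey's biweight
fit0 <- loess(y ~ x, data = dat, degree = 2)
e <- residuals(fit0)
m <- median(abs(e))       # median absolute distance of the residuals from 0
w <- biweight(e, 3 * m)   # 3m instead of the usual 6m
fit1 <- loess(y ~ x, data = dat, degree = 2, weights = w)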
2012 Jun 28
0
How to calculate Confidence Interval for a prediction using Partial Regression?
...highly correlated variables (y and x), and both of them depend
on a third variable (A, for Area). Multiple regression (y = a + b*x + c*A)
would have collinearity problems, so I decided to do a partial regression
to predict y. I did it this way:
- I regressed y on A and calculated the residuals (e_y) (reg1)
- I regressed x on A and calculated the residuals (e_x) (reg2)
- I regressed e_y on e_x (reg5)
It looks like this:
y = a_0 + a_1 A (reg1)
x = b_0 + b_1 A (reg2)
e_y = y - (a_0 + a_1 A) (3)
e_x = x - (b_0 + b_1 A) (4)
e_y = beta_0 + beta_1 e_x (reg5)
Then, to predict a y_0 from a new...
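A minimal sketch of the three regressions above in R, assuming a data frame dat with columns y, x and A:
reg1 <- lm(y ~ A, data = dat)   # y on A
reg2 <- lm(x ~ A, data = dat)   # x on A
e_y <- residuals(reg1)
e_x <- residuals(reg2)
reg5 <- lm(e_y ~ e_x)           # regression of the residuals
summary(reg5)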
2007 Jun 14
0
random effects in logistic regression (lmer)-- identification question
Hello R users!
I've been experimenting with lmer to estimate a mixed model with a
dichotomous dependent variable. The goal is to fit a hierarchical
model in which we compare the effect of individual and city-level
variables. I've run up against a conceptual problem that I expect one
of you can clear up for me.
The question is about random effects in the context of a model fit
with a
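A hedged sketch of such a fit as it would be written in current lme4 (glmer() rather than lmer() for a binomial response); the variable names vote, income, city_size and city are illustrative:
library(lme4)
fit <- glmer(vote ~ income + city_size + (1 | city),
             data = dat, family = binomial)
summary(fit)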
2008 Jul 31
1
clustering and data-mining...
Hi all,
I am doing some experimental studies...
It seems to me that with different combinations of 5 parameters, the end
results ultimately converge to two scalars. That is to say, some
combinations of the 5 parameters lead to one end result and some other
combinations of the 5 parameters lead to the other end result (scalar).
I am thinking this is sort of something like clustering or
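One hedged sketch of that idea, treating the end result as a two-level class and asking which regions of the 5-parameter space lead to which class, e.g. with a classification tree; dat with parameter columns p1..p5 and a factor result is illustrative:
library(rpart)
fit <- rpart(result ~ p1 + p2 + p3 + p4 + p5, data = dat, method = "class")
print(fit)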
2005 Mar 28
1
mixed model question
I am trying to fit a linear mixed model of the form
y_ij = X_ij \beta + delta_i + e_ij
where e_ij ~ N(0, s^2_ij) with the s_ij known
and delta_i ~ N(0, tau^2).
I looked at the ecme routine in package:pan, but this routine
does not allow for a different Vi (the variance-covariance matrix of
the e_i vector) for each cluster.
Is there an easy way to fit this model in R or should I bite the
bullet and
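A hedged sketch of one alternative, where the s^2_ij enter as known sampling variances and tau^2 is estimated; the data frame dat and the columns y, s2, x1, x2 and cluster are illustrative:
library(metafor)
fit <- rma.mv(y, V = s2, mods = ~ x1 + x2,
              random = ~ 1 | cluster, data = dat)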
2009 Sep 24
0
basic cubic spline smoothing (resending because not sure about pending)
Hello, I come from a non-statistics background, but R is available to me,
and I needed to test an implementation of a smoothing spline that I have
written in C++, so I would like to match the results with R (for my unit
tests).
I am following Smoothing Splines, D.G. Pollock (available online)
where we have a list of points (xi, yi), the yi points are random such that:
y_i = f(x_i) + e_i
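A minimal sketch of the corresponding R reference fit with smooth.spline(); the simulated (x_i, y_i) are illustrative:
set.seed(1)
x <- seq(0, 1, length.out = 100)
y <- sin(2 * pi * x) + rnorm(100, sd = 0.2)   # y_i = f(x_i) + e_i
fit <- smooth.spline(x, y, spar = 0.6)        # or omit spar and let GCV choose it
yhat <- predict(fit, x)$y
fit$lambda                                    # the penalty actually used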
2007 Apr 15
1
Use estimated non-parametric model for sensitivity analysis
Dear all,
I fitted a non-parametric model using the gam() function in R, i.e.,
gam(y ~ s(x1) + s(x2))  # where s() is the smooth function
Then I obtained the coefficients (a and b) for the non-parametric terms, i.e.,
y = a*s(x1) + b*s(x2)
Now, if I want to use this estimated model to do optimization or sensitivity analysis, I am not sure how to incorporate the smooth function, since s() may not
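A hedged sketch of one way to use the fitted smooths for sensitivity analysis: mgcv represents each s() by a spline basis rather than a single coefficient, but the fitted terms can be evaluated at any new inputs with predict(); dat is illustrative:
library(mgcv)
fit <- gam(y ~ s(x1) + s(x2), data = dat)
newd <- data.frame(x1 = seq(min(dat$x1), max(dat$x1), length.out = 50),
                   x2 = median(dat$x2))
predict(fit, newdata = newd, type = "terms")     # contribution of each smooth
predict(fit, newdata = newd, type = "response")  # fitted response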
2009 Sep 24
1
basic cubic spline smoothing
Hello,
I come from a non-statistics background, but R is available to me,
and I needed to test an implementation of a smoothing spline that I have
written in C++, so I would like to match the results with R (for my unit
tests).
I am following
http://www.nabble.com/file/p25569553/SPLINES.PDF
where we have a list of points (xi, yi), the yi points are random such that:
y_i = f(x_i) + e_i
2013 Apr 22
3
Scatterplot and Causality
Dear All,
I hope this is not too off topic.
I am given a set of scatterplots (nothing too fancy; think of a
normal x-y 2D plot).
I do not deal with two time series (indeed I have no info about time).
If I call A = (A1, A2, ...) and B = (B1, B2, ...) the two variables (two
vectors of numbers in most cases, but sometimes they can be
categorical variables), I can plot one against the other and I
2011 Jun 07
2
gam() (in mgcv) with multiple interactions
Hi! I'm learning mgcv and reading Simon Wood's book on GAMs, as recommended to me earlier by some folks on this list. I've run into a question whose answer I can't find in his book, so I'm hoping somebody here knows.
My outcome variable is binary, so I'm doing a binomial fit with gam(). I have five independent variables, all continuous, all uniformly
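A hedged sketch of how smooth interactions between continuous covariates are usually written in mgcv, via tensor-product smooths; x1..x5 are illustrative placeholders for the five predictors:
library(mgcv)
fit <- gam(y ~ te(x1, x2) + s(x3) + s(x4) + s(x5),
           data = dat, family = binomial)
summary(fit)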
2004 Apr 05
3
2 lme questions
Greetings,
1) Is there a nice way of extracting the variance estimates from an lme fit? They don't seem to be part of the lme object.
2) In a series of simulations, I am finding that with ML fitting one of my random-effect variances is sometimes estimated as essentially zero with a massive CI, instead of the finite value it should have, whereas with REML I get the expected value. I guess
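For question 1, a hedged sketch of the standard extractors for an nlme::lme object fit:
library(nlme)
VarCorr(fit)                        # random-effect variances/SDs and residual SD
intervals(fit, which = "var-cov")   # approximate CIs for the variance parameters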
2007 Mar 17
1
Correlated random effects in lme
Hello,
I am interested in estimating this type of random effects panel:
y_it = x'_it * beta + u_it + e_it
u_it = rho * u_i,t-1 + d_it,   with rho in (-1, 1)
where:
u and e are independent, zero-mean normally distributed, and
d is also independent, zero-mean normally distributed.
So I want the random effects for group i to be correlated over t, following an
AR(1) process.
Any idea of how
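A hedged sketch of a common approximation in nlme: the AR(1) process is placed on the within-group errors via corAR1() rather than on a separate random effect u_it; dat with columns y, x, group i and time t is illustrative:
library(nlme)
fit <- lme(y ~ x, random = ~ 1 | i, data = dat,
           correlation = corAR1(form = ~ t | i))
summary(fit)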
2012 Oct 28
1
Best fitted curve
Hi, I am having trouble making a best-fit curve for an xy-plot. My data consist
of two groups with four repetitions for each x-value.
plot(weight ~ gdd, data = weight, pch = as.numeric(species))
<http://r.789695.n4.nabble.com/file/n4647692/Weight.jpeg>
Can you help?
Cecilie
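One hedged sketch of a simple option, adding a separate loess curve per species to the plot from the snippet:
plot(weight ~ gdd, data = weight, pch = as.numeric(weight$species))
for (sp in levels(weight$species)) {
  d <- weight[weight$species == sp, ]
  d <- d[order(d$gdd), ]
  lines(d$gdd, fitted(loess(weight ~ gdd, data = d)))
}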
2006 May 20
1
(PR#8877) predict.lm does not have a weights argument for newdata
Dear R developers,
I am a little disappointed that my bug report only made it to the
wishlist, with the argument:
"Well, it does not say it has.
Only relevant to prediction intervals."
predict.lm does calculate prediction intervals for linear models from
weighted regression, so they should be correct, right?
As far as I can see they are bound to be wrong in almost all cases, if
no weights
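For reference, a hedged sketch under current versions of R, whose predict.lm does accept weights (or a prediction variance) for the new observations; w and w_new are illustrative:
fit <- lm(y ~ x, data = dat, weights = w)
predict(fit, newdata = new_dat, interval = "prediction", weights = w_new)
## equivalently, pred.var can be supplied directly for the new observations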