Displaying 20 results from an estimated 20000 matches similar to: "Simple question regarding to specifying model"
2005 Dec 09
1
Residuals from GLMMs in the lme4 package
Hello there
This is the first time I have used the r-help mailing list, so I hope I have got the right address.
I am trying to check the residuals of a GLMM (run using the package lme4). I have been able to check the residuals of REML fits in lme4 using the following:
m1<-lmer(vTotal~Week+fCollar+ (1|fCat), collars)
res<-resid(m1)
plot(res)
qqnorm(res)
library(MASS)
par(mfrow=c(2,3))
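For a GLMM fitted with glmer() in current lme4, the same kind of checks carry over. Here is a minimal sketch on simulated data (the poster's collars data and the names vTotal, Week, fCollar and fCat are not available, so the variables and the Poisson family below are assumptions):
## Minimal sketch (simulated data, not the poster's collars data):
## a Poisson GLMM fitted with glmer(), then the same residual checks.
library(lme4)
set.seed(1)
d <- data.frame(fCat = factor(rep(1:10, each = 8)),   # hypothetical grouping factor
                Week = rep(1:8, times = 10))
d$vTotal <- rpois(nrow(d), exp(0.5 + 0.1 * d$Week + rnorm(10)[as.integer(d$fCat)]))
m1 <- glmer(vTotal ~ Week + (1 | fCat), data = d, family = poisson)
res <- residuals(m1, type = "deviance")   # or type = "pearson"
par(mfrow = c(1, 2))
plot(fitted(m1), res, xlab = "Fitted values", ylab = "Deviance residuals")
qqnorm(res); qqline(res)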
2006 Jan 27
2
lme4_0.995-2/Matrix_0.995-4 upgrade introduces error messages (change management)
I'll address two issues: the first is today's error message, and the second is change management for contributed packages on CRAN.
TODAY'S ERROR MESSAGE
I switched from the 0.995-1 versions of lme4 and Matrix to those referenced in the subject line this afternoon. Prior to using these packages on anything else, I applied them to code that 'worked' (provided numerical results
2006 Sep 19
0
How to interpret these results from a simple gamma-frailty model
Dear R users,
I'm trying to fit a gamma-frailty model on a simulated dataset, with 6 covariates, and I'm running into some results I do not understand. I constructed an example from my simulation code, where I fit a coxph model without frailty (M1) and with frailty (M2) on a number of data samples with a varying degree of heterogeneity (I'm running R 2.3.1, running takes ~1 min).
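The poster's simulation code is not included in the excerpt. A minimal, self-contained sketch of the same kind of comparison with the survival package looks like this (the data, the single covariate x1 and the cluster variable id are made up):
## Sketch (made-up data): a Cox model without (M1) and with (M2) a gamma frailty.
library(survival)
set.seed(1)
dat <- data.frame(id = rep(1:50, each = 4),           # subject / cluster id
                  x1 = rnorm(200),
                  time = rexp(200),
                  status = rbinom(200, 1, 0.7))
M1 <- coxph(Surv(time, status) ~ x1, data = dat)
M2 <- coxph(Surv(time, status) ~ x1 + frailty(id, distribution = "gamma"), data = dat)
summary(M2)   # the reported frailty variance measures the heterogeneity across id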
2007 Jun 22
1
two basic question regarding model selection in GAM
Question #1
*********
Model selection in GAM can be done by using:
1. step.gam {gam} : A directional stepwise search
2. gam {mgcv} : Smoothness estimation using GCV or UBRE/AIC criterion
Suppose my model starts with an additive model (linear part + spline part).
Using gam() {mgcv} I got estimated degrees of freedom (edf) for the smoothing
splines. Now I want to use the functional form of my model
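A minimal mgcv sketch of the second approach above (simulated data; the poster's model is not shown, so the formula below is only illustrative):
## Sketch (simulated data): smoothness chosen by GCV, then the edf and the fitted smooth.
library(mgcv)
set.seed(1)
dat <- data.frame(x1 = runif(200), x2 = runif(200))
dat$y <- 2 * dat$x2 + sin(2 * pi * dat$x1) + rnorm(200, sd = 0.3)
fit <- gam(y ~ s(x1) + x2, data = dat, method = "GCV.Cp")
summary(fit)$edf   # estimated degrees of freedom of s(x1)
plot(fit)          # the estimated smooth, i.e. the "functional form" of the spline part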
2004 Oct 06
1
odd behavior of summary()$r.squared
I may be missing something obvious here, but consider the following simple
dataset simulating repeated measures on 5 individuals with pretty strong
between-individual variance.
set.seed(1003)
n<-5
v<-rep(1:n,each=2)
d<-data.frame(factor(v),v+rnorm(2*n))
names(d)<-c("id","y")
Now consider the following two linear models that provide identical fitted
values,
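The two models are cut off in the excerpt; a likely pair (this is a guess, not the poster's code) is the default parameterisation versus the no-intercept, cell-means one. They fit identically but report very different R-squared values, because summary.lm() measures the total sum of squares about zero when there is no intercept:
## Guessing at the two models: with and without an intercept (identical fits).
m1 <- lm(y ~ id, data = d)
m2 <- lm(y ~ id - 1, data = d)
all.equal(fitted(m1), fitted(m2))   # TRUE: the fitted values agree
summary(m1)$r.squared               # R^2 relative to the grand mean
summary(m2)$r.squared               # R^2 relative to zero, hence much larger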
2008 Jan 31
1
difficulties computing a simple anova
My grasp of both R and statistics is seriously lacking, so if this question
is completely naive, I apologize in advance. I've hunted for a couple of hours
on the internet and none of the methods I've found has produced the result
I'm looking for.
I'm currently a student in a Statistics class and we are learning the ANOVA.
We had to do one by hand and then reproduce our work in SAS.
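The poster's data and SAS code are not shown; a minimal one-way ANOVA in R on made-up data looks like this:
## Minimal one-way ANOVA sketch (made-up data, not the poster's homework set).
set.seed(1)
dat <- data.frame(group = factor(rep(c("A", "B", "C"), each = 6)),
                  response = c(rnorm(6, 10), rnorm(6, 12), rnorm(6, 11)))
fit <- aov(response ~ group, data = dat)
summary(fit)   # the classical ANOVA table: Df, Sum Sq, Mean Sq, F value, Pr(>F)
## anova(lm(response ~ group, data = dat)) produces the same table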
2011 May 24
3
Friedman test, with a simple planned comparison (the first against the rest...)
Hello.
Is there a function that runs a Friedman test (say, 4 related/dependent
treatments or time points) and then compares one treatment against the rest,
say the first, as a simple contrast or a Dunnett test?
Or, simply, how do I do a Dunnett test for related treatments?
--
Antonio M
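In base R the Friedman test itself is friedman.test(); there is no built-in Dunnett procedure for related samples, but a stand-in (each treatment compared against the first via paired Wilcoxon tests with a multiplicity correction) can be sketched as below. Packages such as PMCMRplus also offer post-hoc tests after a Friedman test. All data below are made up:
## Sketch (made-up data): Friedman test on 4 related treatments, then each
## treatment compared against the first -- a Dunnett-style comparison stand-in.
set.seed(1)
y <- matrix(rnorm(40), nrow = 10,
            dimnames = list(NULL, paste0("T", 1:4)))   # rows = subjects/blocks
friedman.test(y)
p <- sapply(2:4, function(j) wilcox.test(y[, j], y[, 1], paired = TRUE)$p.value)
p.adjust(p, method = "holm")   # adjusted p-values for T2 vs T1, T3 vs T1, T4 vs T1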
2009 Mar 01
1
Understanding Anova (car) output
Dear Professor Fox and R helpers,
I have a quick question about the Anova function in the car package.
When using the default "type II" SS I get results that I don't
understand (see below).
library(car)
Data <- data.frame(y=rnorm(10), x1=factor(c(rep("a",4), rep("b",6))),
x2 = factor(c(rep("j", 2), rep("k", 3), rep("j", 2),
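The data frame definition is cut off above; a self-contained sketch of the same kind of comparison, with a completed, made-up version of that data frame, is:
## Sketch: sequential (type I) versus car's type II sums of squares.
## The x2 pattern and the model formula below are invented to complete the truncated example.
library(car)
set.seed(1)
Data <- data.frame(y = rnorm(10),
                   x1 = factor(c(rep("a", 4), rep("b", 6))),
                   x2 = factor(c(rep("j", 2), rep("k", 3), rep("j", 3), rep("k", 2))))
fit <- lm(y ~ x1 * x2, data = Data)
anova(fit)                # type I: each term adjusted only for the terms before it
Anova(fit, type = "II")   # type II: each term adjusted for the others, ignoring its higher-order relatives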
2007 Apr 04
2
Newbie: Simple loops: complex troubles
I am used to Java (well, I don't remember it really well, but anyway).
I am having a really difficult time making simple loops work. I got the
following to work:
##
## Creates objects Ux1, Ux2, Ux3, etc. that all contain n numbers drawn from a
## random distribution
##
m<-c(m1,m2,m3,m4,m5,m6,m7,m8,m9,m10)#these are defined as numbers (means)
v<-c(v1,v2,v3,v4,v5,v6,v7,v8,v9,v10)#these
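The loop itself is cut off above; a minimal, self-contained sketch of one way to do it follows (the values of m and v are stand-ins for the post's m1..m10 and v1..v10, and v is assumed to hold standard deviations of a normal distribution):
## Sketch: create Ux1 ... Ux10, each holding n random numbers.
n <- 100
m <- 1:10         # stand-ins for m1 ... m10 (the post's means)
v <- rep(1, 10)   # stand-ins for v1 ... v10 (assumed: standard deviations)
for (i in seq_along(m)) {
  assign(paste0("Ux", i), rnorm(n, mean = m[i], sd = v[i]))
}
## A list avoids ten separate objects and is usually easier to work with:
Ux <- lapply(seq_along(m), function(i) rnorm(n, mean = m[i], sd = v[i]))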
2017 Jan 10
0
[bug report] drm/nouveau/devinit: move simple pll setting routines to devinit
Hello Ben Skeggs,
The patch 88524bc06926: "drm/nouveau/devinit: move simple pll setting
routines to devinit" from Mar 5, 2013, leads to the following static
checker warning:
drivers/gpu/drm/nouveau/nvkm/subdev/devinit/nv50.c:53 nv50_devinit_pll_set()
info: return a literal instead of 'ret'
drivers/gpu/drm/nouveau/nvkm/subdev/devinit/nv50.c
34 int
35
2012 Jul 13
4
R-squared with Intercept set to 0 (zero) for linear regression in R is incorrect
Hi,
I have been using lm in R to do a linear regression and find the slope
coefficients and the value of R-squared. The R-squared value reported by R
(R^2 = 0.9558) is very different from the R-squared value when I use the
same equation in Excel (R^2 = 0.328). I manually computed R-squared, and the
Excel value is correct. I show my code for the determination of R^2 in R.
When I do not set 0 as the
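When the intercept is forced to zero, R's summary.lm() measures the total sum of squares about zero, whereas an R-squared computed about the mean of y (the spreadsheet-style figure) can be much smaller or even negative. A small sketch on made-up data:
## Sketch (made-up data): why the two R^2 figures disagree without an intercept.
set.seed(1)
x <- 1:20
y <- 5 + 2 * x + rnorm(20, sd = 3)
fit0 <- lm(y ~ 0 + x)                          # regression through the origin
summary(fit0)$r.squared                        # R's value
1 - sum(resid(fit0)^2) / sum(y^2)              # same thing: total SS taken about zero
1 - sum(resid(fit0)^2) / sum((y - mean(y))^2)  # "about the mean" version: much smaller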
2012 Apr 20
1
predictOMatic for regression. Please try and advise me
I'm pasting below a working R file featuring a function I'd like to polish up.
I'm teaching regression this semester, and every time I come to
something that is very difficult to explain in class, I try to
simplify it by writing an R function (which eventually ends up in my package
"rockchalk"). Students have a difficult time with predict and newdata
objects, so right now I'm
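For reference, the predict()/newdata pattern the post refers to looks like this (made-up data and variable names; predictOMatic in rockchalk automates constructing the newdata frame):
## Sketch of the predict()/newdata pattern (made-up data and variable names).
set.seed(1)
dat <- data.frame(x = rnorm(100), g = factor(sample(c("a", "b"), 100, replace = TRUE)))
dat$y <- 1 + 2 * dat$x + (dat$g == "b") + rnorm(100)
fit <- lm(y ~ x + g, data = dat)
## newdata must supply every predictor in the model, with matching names and types
nd <- expand.grid(x = c(-1, 0, 1), g = levels(dat$g))
cbind(nd, yhat = predict(fit, newdata = nd))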
2008 Mar 07
1
Finding Interaction and main effects contrasts for two-way ANOVA
I've tried without success to calculate interaction and main-effects
contrasts using R. I've found the functions C() and contrasts() in base R,
se.contrast() in stats, and fit.contrast() in package gmodels. Given the URL
for a small dataset and the two-way ANOVA model below, I'd like to
reproduce the results from the appended SAS code. Thanks. --Dale.
## the dataset (from Montgomery)
twoway <-
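The dataset and the SAS code are not included in the excerpt; a made-up two-factor sketch of a main-effect contrast with gmodels::fit.contrast looks like this:
## Sketch (made-up data, not the Montgomery dataset): a main-effect contrast for A.
library(gmodels)
set.seed(1)
twoway <- expand.grid(A = factor(1:3), B = factor(1:2))
twoway <- twoway[rep(1:nrow(twoway), each = 4), ]
twoway$y <- rnorm(nrow(twoway), mean = as.integer(twoway$A))
fit <- aov(y ~ A + B, data = twoway)
## level 1 of A versus the average of levels 2 and 3 (coefficients sum to zero)
fit.contrast(fit, "A", c(1, -0.5, -0.5))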
2006 Dec 06
3
intercept value in lme
Dear all,
I've got a problem fitting a multilevel model in lme. I don't know too
much about this, but I suspect that something is wrong with my model.
I'm trying to fit:
m1<-lme(X~Y,~1|group,data=data,na.action=na.exclude,method="ML")
m2<-lme(X~Y+Z,~1|group,data=data,na.action=na.exclude,method="ML")
where:
X - dependent var. measured on a scale ranging from
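A self-contained sketch of how two such ML fits are usually compared, and how the intercept is read off (made-up data standing in for X, Y, Z and group):
## Sketch (made-up data): compare the two ML fits and inspect the intercept.
library(nlme)
set.seed(1)
d <- data.frame(group = factor(rep(1:20, each = 5)), Y = rnorm(100), Z = rnorm(100))
d$X <- 2 + 0.5 * d$Y + rnorm(20)[as.integer(d$group)] + rnorm(100)
m1 <- lme(X ~ Y, random = ~ 1 | group, data = d, method = "ML")
m2 <- lme(X ~ Y + Z, random = ~ 1 | group, data = d, method = "ML")
anova(m1, m2)   # likelihood-ratio test for adding Z
fixef(m2)       # "(Intercept)" is the estimated mean of X when Y = Z = 0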
2008 May 11
1
Fundamental formula and dataframe question.
There is a very useful and apparently fundamental feature of R (or of
the package pls) which I don't understand.
For datasets with many independent (X) variables such as chemometric
datasets there is a convenient formula and dataframe construction that
allows one to access the entire X matrix with a single term.
Consider the gasoline dataset available in the pls package. For the
model
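The construction referred to is a matrix stored as a single column of the data frame, so one term in the formula covers the whole X block; a short sketch with the gasoline data:
## gasoline$NIR is a many-column matrix held as one data-frame column,
## so a single term in the formula stands for the entire X matrix.
library(pls)
data(gasoline)
str(gasoline, max.level = 1)   # octane (numeric) and NIR (a matrix)
fit <- plsr(octane ~ NIR, ncomp = 5, data = gasoline, validation = "LOO")
summary(fit)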
2006 Apr 23
2
distribution of the product of two correlated normal
Hi,
Does anyone know the distribution of the product of two correlated
normals? Say I have X ~ N(a, sigma1^2) and Y ~ N(b, sigma2^2), and
rho(X, Y) is not equal to 0; I want to know the pdf or cdf of XY. Thanks
a lot in advance.
yu
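The exact density is awkward (in the zero-mean case it involves Bessel-type functions), so one practical route is to examine it by simulation. A sketch with MASS::mvrnorm (all parameter values below are arbitrary):
## Simulation sketch: empirical density of XY for correlated normals.
library(MASS)
set.seed(1)
a <- 1; b <- 2; sigma1 <- 1; sigma2 <- 1.5; rho <- 0.6
Sigma <- matrix(c(sigma1^2, rho * sigma1 * sigma2,
                  rho * sigma1 * sigma2, sigma2^2), 2, 2)
xy <- mvrnorm(1e5, mu = c(a, b), Sigma = Sigma)
xyprod <- xy[, 1] * xy[, 2]
plot(density(xyprod), main = "Empirical density of XY")
c(mean(xyprod), a * b + rho * sigma1 * sigma2)   # E[XY] = ab + rho*sigma1*sigma2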
2006 Mar 13
3
hfsc and dropped packets
Hi,
I'm trying to get a handle on hfsc. Here is my configuration:
root@jmnrouter:/jmn# tc class show dev vlan1
class hfsc 1: root
class hfsc 1:1 parent 1: ls m1 0bit d 0us m2 225000bit ul m1 0bit d 0us m2 225000bit
class hfsc 1:10 parent 1:1 rt m1 191000bit d 25.0ms m2 135000bit ls m1 0bit d 0us m2 135000bit ul m1 0bit d 0us m2 225000bit
class hfsc 1:20 parent 1:1 rt m1 22008bit d
2005 Aug 02
1
simplifying a lmer model
I have been using lmer() in lme4 to analyse the effect of dropping a term from
the model, as below:
anova.name<-anova(m1,m2)
where m1 is the original model and m2 has one term removed. I can then create
my own type III tables for each variable in the model. However, with many
variables this becomes exceedingly time consuming!
anova.lme() in nlme and Anova() in car don't seem to work for lme
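A sketch of automating those one-term-deleted comparisons with update() and anova() follows (the data, the variables a and b, and the grouping factor g are made up; in current lme4, drop1(m1, test = "Chisq") does much the same):
## Sketch (made-up data): loop over terms, refit with each one dropped, and test.
library(lme4)
set.seed(1)
d <- data.frame(g = factor(rep(1:15, each = 6)), a = rnorm(90), b = rnorm(90))
d$y <- 1 + 0.5 * d$a + rnorm(15)[as.integer(d$g)] + rnorm(90)
m1 <- lmer(y ~ a + b + (1 | g), data = d, REML = FALSE)
terms_to_drop <- c("a", "b")
tests <- lapply(terms_to_drop, function(tm) {
  m2 <- update(m1, as.formula(paste(". ~ . -", tm)))
  anova(m1, m2)   # likelihood-ratio test for dropping tm
})
names(tests) <- terms_to_drop
tests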
2010 Jun 16
2
data frame
Dear list,
I have the following problem. I have a data frame like this
CLUSTER  YEAR  variable  Delta  R_pivot
M1       2005  EC01      NA     NA
M1       2006  EC01      2      NA
M1       2007  EC01      4      5
M2       2005
2010 Jul 20
4
Correct statistical inference for linear regression models without intercept in R
Dear R community,
Is there a way to get correct t- and p-values and R-squared for linear
regression models specified without an intercept?
Example model:
summary(lm(y ~ 0 + x))
This gives p-values that are too low and an R-squared that is too high. Is there a way to
correct this? Or should I fit the model with an intercept to get the correct values?
Thank you in advance!
Wojtek Musial