similar to: Speed up computing: looping through data?

Displaying 20 results from an estimated 6000 matches similar to: "Speed up computing: looping through data?"

2012 Jan 13
2
Help needed in interpreting linear models
Dear members of the R-help list, I have sent the email below to the R-SIG-ME list to ask for help in interpreting some R output of fitted linear models. Unfortunately, I haven't yet received any answers. As I am not sure if my email was sent successfully to the mailing list I am asking for help here: Dear members of the R-SIG-ME list, I am new to linear models and struggling with
2014 Jun 13
3
p values with LMER
Hi Manuel, I have tried to do it but I get Error: unexpected string constant in: "anova(a,as,test=Chisq") and I have no idea why... I find it astonishing that pvals.fnc is no longer available. Will it be impossible to get hold of it? Regards, Miguel -------------------------------------------- On Fri, 13/6/14, Manuel Azcárate <mazcarategarcia en gmail.com> wrote:
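The quoted error comes from a misplaced quote: "Chisq" has to be a quoted string, and for lmer fits no test argument is needed at all. A minimal sketch on lme4's built-in sleepstudy data; the models a and as below are only stand-ins for the poster's models:

library(lme4)

# illustrative stand-ins for the poster's models, fitted to lme4's sleepstudy data
a  <- lmer(Reaction ~ Days + (1 | Subject), data = sleepstudy)  # full model
as <- lmer(Reaction ~ 1 + (1 | Subject), data = sleepstudy)     # reduced model

# anova() on lmer fits refits with ML and reports a likelihood-ratio chi-square
# p-value by default, so no test argument is required
anova(a, as)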
2008 Nov 24
3
Is this correct?
I have to answer the following question for a homework assignment. A researcher was interested in whether people taking part in sports at university made more money after graduating, taking into account the students' GPA. They sampled 200 alumni from a large university. The variables are: income (income 10 years after graduating), sports (1 if they did sports, 0 if they did not), and GPA (the
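A minimal sketch of how such an ANCOVA-style model might be set up; the data frame and the effect sizes below are simulated stand-ins, not the homework data:

# simulated stand-in for the alumni sample described above
set.seed(1)
alumni <- data.frame(sports = rbinom(200, 1, 0.4), GPA = rnorm(200, mean = 3, sd = 0.4))
alumni$income <- 40000 + 5000 * alumni$sports + 8000 * alumni$GPA + rnorm(200, sd = 5000)

fit <- lm(income ~ sports + GPA, data = alumni)
summary(fit)        # the sports coefficient is the income difference adjusted for GPA

fit2 <- lm(income ~ sports * GPA, data = alumni)
anova(fit, fit2)    # does letting the sports effect depend on GPA improve the fit?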
2003 Oct 28
1
error message in simulation
Dear R-users, I am a dentist (so forgive me if my question looks stupid) and came across a problem when I did simulations to compare a few single-level and two-level regressions. The simulations were interrupted and an error message came out like 'Error in MEestimate(lmeSt, grps) : Singularity in backsolve at level 0, block 1'. My colleague suggested that this might be due to my code
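One way to keep a simulation running past occasional singularity or convergence errors is to wrap each fit in tryCatch() so a failed replicate is recorded instead of stopping the loop. A sketch under the assumption that each replicate simulates a small grouped data set and fits a two-level model with lme; fit_one below is a made-up stand-in for the poster's simulation step:

library(nlme)

# hypothetical one-replicate function: simulate grouped data and return the slope
fit_one <- function(i) {
  d <- data.frame(g = factor(rep(1:5, each = 4)), x = rnorm(20))
  d$y <- 1 + 0.5 * d$x + rnorm(20)
  fixef(lme(y ~ x, random = ~ 1 | g, data = d))["x"]
}

results <- rep(NA_real_, 200)
for (i in seq_along(results)) {
  results[i] <- tryCatch(fit_one(i), error = function(e) NA_real_)  # keep looping on errors
}
sum(is.na(results))   # number of replicates that failed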
2012 Jul 06
2
Anova Type II and Contrasts
The study design of the data I have to analyse is simple. There is 1 control group (CTRL) and 2 different treatment groups (TREAT_1 and TREAT_2). The data also include 2 covariates, COV1 and COV2. I have been asked to check whether there is a linear or quadratic treatment effect in the data. I created a dummy data set to explain my situation: df1 <- data.frame( Observation =
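One common way to test for linear and quadratic trends across CTRL, TREAT_1 and TREAT_2 while adjusting for the covariates is to make the group an ordered factor, so polynomial contrasts are used. The data frame below is a made-up stand-in for the poster's df1 (the column names Outcome and Group are assumptions):

# hypothetical stand-in for the dummy data set
set.seed(1)
df1 <- data.frame(Group = rep(c("CTRL", "TREAT_1", "TREAT_2"), each = 20),
                  COV1 = rnorm(60), COV2 = rnorm(60))
df1$Outcome <- 5 + 0.5 * df1$COV1 + 0.3 * df1$COV2 +
  c(CTRL = 0, TREAT_1 = 1, TREAT_2 = 2.5)[df1$Group] + rnorm(60)

# an ordered factor uses contr.poly, so the .L and .Q coefficients are the
# linear and quadratic contrasts across the three groups
df1$Group <- factor(df1$Group, levels = c("CTRL", "TREAT_1", "TREAT_2"), ordered = TRUE)
fit <- lm(Outcome ~ COV1 + COV2 + Group, data = df1)
summary(fit)
# car::Anova(fit, type = "II")   # Type II test of the whole factor (needs the car package)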
2014 Jun 13
3
p values with LMER
There is some debate about the use of p-values in mixed models. As has been said before, in my view the most appropriate approach is to compare models with the anova function. A good book by Douglas Bates can be found on the Internet, and in Spanish look for "Modelos mixtos con R" by Luis Cayuela, which is aimed at ecology but is very good. On 13/06/2014 14:00, "Jorge I Velez"
2003 Jul 30
2
Comparing two regression slopes
Hello, I've written a simple (although probably overly roundabout) function to test whether two regression slope coefficients from two linear models on independent data sets are significantly different. I'm a bit concerned, because when I test it on simulated data with different sample sizes and variances, the function seems to be extremely sensitive to both of these. I am wondering if
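The usual test combines the two data sets and examines the slope-by-group interaction; a minimal sketch on simulated data:

set.seed(1)
d1 <- data.frame(x = rnorm(40), g = "A"); d1$y <- 1 + 2.0 * d1$x + rnorm(40)
d2 <- data.frame(x = rnorm(60), g = "B"); d2$y <- 1 + 2.5 * d2$x + rnorm(60)
both <- rbind(d1, d2)

fit <- lm(y ~ x * g, data = both)
summary(fit)   # the x:gB coefficient and its t test ask whether the two slopes differ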
2010 Jul 07
3
Large discrepancies in the same object being saved to .RData
Hi developers, After some investigation I have found that there can be large discrepancies in the size of the same object when it is saved to an external "xx.RData" file. The immediate repercussion of this is a possibly increased size of your .RData workspace for no apparent reason. The function and its three scenarios below highlight these discrepancies. Note that the object being returned is exactly
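The poster's function is not shown in the excerpt, but one common source of such size discrepancies is that formulas and closures created inside a function carry that function's environment, and save() serializes everything in it. A hypothetical illustration:

make_formula <- function() {
  big <- rnorm(1e6)   # large object living only in the function's environment
  y ~ x               # the formula captures that environment, including `big`
}
f1 <- make_formula()
f2 <- y ~ x           # the same formula created at top level

tf1 <- tempfile(); save(f1, file = tf1)
tf2 <- tempfile(); save(f2, file = tf2)
file.info(c(tf1, tf2))$size   # f1's file is megabytes, f2's is a few hundred bytes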
2000 Jul 12
2
Removing Objects from workspace
Hi all, how can I remove objects from the workspace which start with a certain pattern, e.g. lm (lm1, lm2, lm3 etc.)? Wildcards won't work. Thanks, Sven
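ls() accepts a pattern argument (a regular expression, not a shell wildcard), and rm() can take the resulting vector of names; a minimal sketch:

lm1 <- 1; lm2 <- 2; lm3 <- 3; keepme <- 4   # example objects
ls(pattern = "^lm")                         # names starting with "lm"
rm(list = ls(pattern = "^lm"))              # removes lm1, lm2 and lm3 only
ls()                                        # keepme is still there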
2004 Aug 26
1
Why terms are dropping out of an lm() model
Hi all! I'm fairly new to R and not too experienced with regression. Because of one or both of those traits, I'm not seeing why some terms are being dropped from my model when doing a regression using lm(). I am trying to do a regression on some experimental data d, which has two numeric predictors, p1 and p2, and one numeric response, r. The aim is to compare polynomial models in p1
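The original data aren't shown, but with polynomial models a frequent cause is that ^ inside a formula means factor crossing rather than arithmetic, so a term like p1^2 silently collapses to p1 and appears to be dropped. A sketch of the symptom and two fixes:

set.seed(1)
d <- data.frame(p1 = runif(30), p2 = runif(30))
d$r <- 1 + 2 * d$p1 + 3 * d$p1^2 + rnorm(30, sd = 0.1)

lm(r ~ p1 + p1^2, data = d)       # only intercept and p1 are fitted: p1^2 collapses to p1
lm(r ~ p1 + I(p1^2), data = d)    # I() protects the arithmetic meaning of ^
lm(r ~ poly(p1, 2), data = d)     # orthogonal polynomial alternative
# exactly collinear terms instead show up as NA coefficients and can be listed with alias()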
2014 Jun 13
2
p values with LMER
Hi everyone, I wanted to ask you about a way to obtain p-values using lmer. I have tried pvals.fnc, which is what had been recommended to me, but for some reason it is no longer available etc. This is the function I have, but it gives the "t" values, without the p-values. Although Baayen indicates that values above 2 are significant, I need to know the p-values. resultado = lmer(rt_ln ~ (fre_ln *
2011 Mar 11
2
insertion of a row between individuals
Can someone help with a fairly simple task? I have a data set where I would like to insert a 0 time event between individuals: what I have:

VAR  DATE  TIME  CONC  COVAR
1    NOV2  0.25  10    group1
1    NOV2  0.5   20    group1
1    NOV2  1     5     group1
1    NOV2  2     1     group1
1    NOV2  3     0.1   group1
2    NOV2  0.25  10
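One base-R approach is to split the data by individual, prepend a zero-time row to each piece, and recombine. The sketch below uses the column names from the excerpt; what the inserted row should contain (TIME = 0, CONC = 0, other columns copied from the first row) is an assumption:

dat <- data.frame(VAR = c(1, 1, 1, 2, 2), DATE = "NOV2",
                  TIME = c(0.25, 0.5, 1, 0.25, 0.5),
                  CONC = c(10, 20, 5, 10, 8), COVAR = "group1")

add_zero <- function(d) rbind(transform(d[1, ], TIME = 0, CONC = 0), d)
out <- do.call(rbind, lapply(split(dat, dat$VAR), add_zero))
rownames(out) <- NULL
out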
2010 Apr 08
2
Overfitting/Calibration plots (Statistics question)
This isn't a question about R, but I'm hoping someone will be willing to help. I've been looking at calibration plots in multiple regression (plotting observed response Y on the vertical axis versus predicted response [Y hat] on the horizontal axis). According to Frank Harrell's "Regression Modeling Strategies" book (pp. 61-63), when making such a plot on new data
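A minimal sketch of the generic idea, i.e. a calibration plot on held-out data with the 45-degree line and a smoother through the observed-versus-predicted points (a plain base-R illustration, not the rms machinery Harrell describes):

set.seed(1)
d <- data.frame(x1 = rnorm(400), x2 = rnorm(400))
d$y <- 1 + 0.5 * d$x1 + 0.3 * d$x2 + rnorm(400)
train <- d[1:200, ]; test <- d[201:400, ]

fit  <- lm(y ~ x1 + x2, data = train)
pred <- predict(fit, newdata = test)

plot(pred, test$y, xlab = "Predicted Y", ylab = "Observed Y")
abline(0, 1, lty = 2)                      # ideal calibration line
lines(lowess(pred, test$y), col = "red")   # a flatter trend than the 45-degree
                                           # line suggests overfitting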
2007 Jul 19
3
Can I test if there is statistical significance between different rows in an R*C table?
Dear friends, My R*C table is as follows:

        better  good  bad
Group1      16    71   37
Group2       0     4   61
Group3       1     6   57

Can I test whether the differences between Group1 and Group2, Group2 and Group3, and Group1 and Group3 are statistically significant, taking into account the multiple comparisons? The table can be set up using the following program: a<-matrix(data=c(16,71,37,0,4,61,1,6,57),nrow=3,byrow=TRUE) Thanks
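One possibility is to run Fisher's exact test on each 2 x 3 sub-table and adjust the resulting p-values for the three comparisons; a sketch using the matrix from the post:

a <- matrix(c(16, 71, 37, 0, 4, 61, 1, 6, 57), nrow = 3, byrow = TRUE,
            dimnames = list(paste0("Group", 1:3), c("better", "good", "bad")))

pairs <- combn(3, 2, simplify = FALSE)
p <- sapply(pairs, function(ij) fisher.test(a[ij, ])$p.value)
names(p) <- sapply(pairs, function(ij) paste(rownames(a)[ij], collapse = " vs "))
p.adjust(p, method = "holm")   # Holm adjustment for the three pairwise tests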
2008 Jun 04
1
Comparing two regression lines
Dear R users, Suppose I have two different response variables y1, y2 that I regress separately on the same explanatory variable, x; sample sizes are n1=n2. Is it legitimate to compare the regression slopes (equal variances assumed) by using lm(y~x*FACTOR), where FACTOR gets "y1" if y1 is the response, and "y2" if y2 is the response? The problem I see here is that the
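An alternative to the stacked lm(y ~ x * FACTOR) fit, which pools the residual variance across the two regressions, is to fit them separately and compare the slopes with a large-sample z statistic; a minimal sketch on simulated stand-in data:

set.seed(1)
d1 <- data.frame(x = rnorm(50)); d1$y1 <- 1 + 2.0 * d1$x + rnorm(50, sd = 1)
d2 <- data.frame(x = rnorm(50)); d2$y2 <- 1 + 2.4 * d2$x + rnorm(50, sd = 2)

f1 <- lm(y1 ~ x, data = d1)
f2 <- lm(y2 ~ x, data = d2)
b  <- c(coef(f1)["x"], coef(f2)["x"])
se <- c(coef(summary(f1))["x", "Std. Error"],
        coef(summary(f2))["x", "Std. Error"])

z <- diff(b) / sqrt(sum(se^2))   # slope difference over its combined standard error
2 * pnorm(-abs(z))               # two-sided p-value, no equal-variance assumption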
2001 Feb 23
1
as.formula and lme ( Fixed effects: Error in as.vector(x, "list") : cannot coerce to vector)
Using a formula converted with as.formula with lme leads to an error message. The same works fine with lm, and with lme and a fixed formula.

# demonstrates problems with lme and as.formula
library(nlme)
demo <- data.frame(x = 1:20, y = (1:20) + rnorm(20), subj = as.factor(rep(1:2, 10)))
demo.lm1 <- lme(y ~ x, data = demo, random = ~ 1 | subj)
print(summary(demo.lm1))
newframe <- data.frame(x = 1:5, subj = rep(1, 5))
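A workaround that is often suggested for this kind of error is to build the lme call with do.call so the actual formula object, rather than its name, ends up in the fitted model; this is offered as something to try, not a verified fix:

library(nlme)

demo <- data.frame(x = 1:20, y = (1:20) + rnorm(20), subj = as.factor(rep(1:2, 10)))
fm <- as.formula("y ~ x")

# do.call() substitutes the formula object itself into the call
demo.lm2 <- do.call(lme, list(fixed = fm, data = demo, random = ~ 1 | subj))
summary(demo.lm2)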
2011 Nov 03
1
Reclassify string values
Hi All, Is there a simple way to convert a character vector such as c("A", "B", "C", "D") to c("Group1", "Group1", "Group2", "Group2")? Naturally I could use the factor function as below, but I don't like seeing that warning message (and I don't want to turn off warning messages). Perhaps a function
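A named lookup vector avoids factor() and its warning entirely; a minimal sketch:

x <- c("A", "B", "C", "D")
map <- c(A = "Group1", B = "Group1", C = "Group2", D = "Group2")
unname(map[x])   # "Group1" "Group1" "Group2" "Group2"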
2005 Aug 12
1
as.formula and lme ( Fixed effects: Error in as.vector(x, "list") : cannot coerce to vector)
This is a continuation of an issue raised on the list a long time ago (I couldn't find a solution to it on the web): -------------------------------------------------------------------------- > Using a formula converted with as.formula with lme leads > to an error message. Same works ok with lm, and with > lme and a fixed formula. > > # demonstrates problems with lme and
2005 Apr 20
1
negative p-values from fisher's test (PR#7801)
Full_Name: Martha Nason Version: 2.0.1 OS: Windows XP Submission from: (NULL) (137.187.154.154) I am running simulations using Fisher's test on 2 x c tables, and a very small p-value from fisher.test (<2.2e-16) is returned as a negative number. Code follows.

> set.seed(0)
> nreps.outer <- 7
> pvalue.fisher <- rep(NA, nreps.outer)
>
> population1 <- c(
2006 Mar 16
2
Difference between weights options in lm, glm and gls
Dear R-List users, Can anyone explain exactly the difference between the weights options in lm, glm and gls? I tried the following code, but the results are different.

> lm1
Call:
lm(formula = y ~ x)
Coefficients:
(Intercept)            x
     0.1183       7.3075

> lm2
Call:
lm(formula = y ~ x, weights = W)
Coefficients:
(Intercept)            x
    0.04193      7.30660

> lm3
Call:
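In lm() and glm(), weights scale each observation's contribution, i.e. Var(e_i) is proportional to 1/w_i; in gls() the weights argument takes a variance-function object instead, so the matching specification uses varFixed() on the reciprocal of the lm weights. A sketch on simulated data (the poster's W is not shown, so the variable below is a stand-in):

library(nlme)

set.seed(1)
d <- data.frame(x = 1:50, W = runif(50, 0.5, 2))
d$y <- 0.1 + 7.3 * d$x + rnorm(50, sd = 1 / sqrt(d$W))
d$invW <- 1 / d$W

lm2  <- lm(y ~ x, data = d, weights = W)                  # Var(e_i) proportional to 1/W_i
gls2 <- gls(y ~ x, data = d, weights = varFixed(~ invW))  # the same variance structure
coef(lm2); coef(gls2)   # the fixed-effect estimates agree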