
Displaying 20 results from an estimated 4000 matches similar to: "Is this correct?"

2007 Dec 07
1
AIC v. extractAIC
Hello, I am using a simple linear model and I would like to get an AIC value. I came across both AIC() and extractAIC() and I am not sure which is best to use. I assumed that I should use AIC() for a glm and extractAIC() for an lm, but if I fit my model with glm the AIC value is the same as the one AIC() gives for the lm object. What might be going on? Did I interpret these functions incorrectly? Thanks,
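For reference, a minimal sketch on toy data (all names below are invented) that reproduces this behaviour: for an lm fit, AIC() and extractAIC() use different additive constants, while AIC() on the equivalent gaussian glm matches AIC() on the lm.

set.seed(1)
d <- data.frame(x = rnorm(20))
d$y <- 2 * d$x + rnorm(20)
fit <- lm(y ~ x, data = d)
AIC(fit)                    # -2*logLik + 2*k, where k also counts the residual variance
extractAIC(fit)             # c(edf, AIC) with AIC = n*log(RSS/n) + 2*edf, so it differs by a constant
AIC(glm(y ~ x, data = d))   # gaussian glm of the same model: identical to AIC(fit)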
2003 Oct 28
1
error message in simulation
Dear R-users, I am a dentist (so forgive me if my question looks stupid) and came across a problem when I did simulations to compare a few single-level and two-level regressions. The simulations were interrupted and an error message came out like 'Error in MEestimate(lmeSt, grps) : Singularity in backsolve at level 0, block 1'. My colleague suggested that this might be due to my code
2014 Jun 13
3
p values con LMER
Hi Manuel, I have tried to do it but I get Error: unexpected string constant in: "anova(a,as,test=Chisq") and I have no idea why... I find it baffling that pvals.fnc is no longer available. Is it really impossible to get hold of it? Regards, Miguel -------------------------------------------- On Fri, 13/6/14, Manuel Azcárate <mazcarategarcia en gmail.com> wrote:
2014 Jun 13
2
p values con LMER
Hello everyone, I wanted to ask you for a way to obtain p-values when using lmer. I have tried pvals.fnc, which is what had been recommended to me, but for some reason it is no longer available, etc. This is the function I have, but it gives the "t" values without the p-values. Although Baayen indicates that values above 2 are significant, I need to know the p-values. resultado = lmer(rt_ln ~ (fre_ln *
2012 Jul 06
2
Anova Type II and Contrasts
The study design of the data I have to analyse is simple. There is 1 control group (CTRL) and 2 different treatment groups (TREAT_1 and TREAT_2). The data also includes 2 covariates COV1 and COV2. I have been asked to check if there is a linear or quadratic treatment effect in the data. I created a dummy data set to explain my situation: df1 <- data.frame( Observation =
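One hedged sketch of how such a trend test is often set up (the dummy data and names below are invented, not the poster's): declare the treatment as an ordered factor so R uses polynomial contrasts, and the linear and quadratic contrasts appear directly in the coefficient table.

set.seed(42)
df1 <- data.frame(
  TREAT = factor(rep(c("CTRL", "TREAT_1", "TREAT_2"), each = 10),
                 levels = c("CTRL", "TREAT_1", "TREAT_2"), ordered = TRUE),
  COV1 = rnorm(30), COV2 = rnorm(30))
df1$Y <- as.numeric(df1$TREAT) + 0.5 * df1$COV1 + rnorm(30)
fit <- lm(Y ~ TREAT + COV1 + COV2, data = df1)
summary(fit)   # the TREAT.L and TREAT.Q rows are the linear and quadratic contrasts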
2007 Sep 05
3
'singular gradient matrix' when using nls() and how to make the program skip nls() and run on
Dear friends, I use nls() and encounter the following puzzling problem: I have a function f(a,b,c,x), a data vector of x and a vector y of realized values of f. Case 1: I tried to estimate c with (a=0.3, b=0.5) fixed: nls(y~f(a,b,c,x), control=list(maxiter = 100000, minFactor=0.5 ^2048),start=list(c=0.5)). The error message is: "number of iterations exceeded maximum of
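On the "run on" part of the question, a minimal sketch on a toy exponential model (not the poster's f): wrapping the nls() call in tryCatch() lets a simulation loop record the failure and continue instead of aborting.

set.seed(1)
x <- 1:20
y <- 2 * exp(-0.3 * x) + rnorm(20, sd = 0.05)
fit <- tryCatch(
  nls(y ~ a * exp(-b * x), start = list(a = 1, b = 0.1)),
  error = function(e) { message("nls failed: ", conditionMessage(e)); NULL })
if (!is.null(fit)) coef(fit)   # only used when the fit succeeded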
2012 Jan 13
2
Help needed in interpreting linear models
Dear members of the R-help list, I have sent the email below to the R-SIG-ME list to ask for help in interpreting some R output of fitted linear models. Unfortunately, I haven't yet received any answers. As I am not sure if my email was sent successfully to the mailing list I am asking for help here: Dear members of the R-SIG-ME list, I am new to linear models and struggling with
2014 Jun 13
3
p values con LMER
There is ongoing discussion about the use of p-values in mixed models. As has been said before, to me the most appropriate approach is to compare models with the anova function. A good book by Douglas Bates can be found on the Internet and, in Spanish, look for "Modelos mixtos con R" by Luis Cayuela, which is aimed at ecology but is very good. On 13/06/2014 14:00, "Jorge I Velez"
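The model-comparison route suggested above can look like the following hedged sketch, shown on lme4's built-in sleepstudy data rather than the poster's model: refit the nested models with ML and let anova() report the likelihood-ratio chi-square p-value.

library(lme4)
m0 <- lmer(Reaction ~ 1    + (1 | Subject), sleepstudy, REML = FALSE)
m1 <- lmer(Reaction ~ Days + (1 | Subject), sleepstudy, REML = FALSE)
anova(m0, m1)   # likelihood-ratio (chi-square) test for the fixed effect of Days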
2000 Jul 12
2
Removing Objects from workspace
Hi all, how can I remove objects from the workspace which start with a certain pattern, e.g. lm (lm1, lm2, lm3 etc.)? Wildcards won't work. Thanks, Sven
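A minimal sketch of the usual answer: ls() accepts a regular-expression pattern, so the matching names can be passed straight to rm().

lm1 <- lm2 <- lm3 <- 1          # toy objects
ls(pattern = "^lm")             # names beginning with "lm"
rm(list = ls(pattern = "^lm"))  # remove exactly those objects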
2010 Jul 07
3
Large discrepancies in the same object being saved to .RData
Hi developers, After some investigation I have found there can be large discrepancies in the size of the same object when it is saved as an external "xx.RData" file. The immediate repercussion of this is the possible increased size of your .RData workspace for no apparent reason. The function and its three scenarios below highlight these discrepancies. Note that the object being returned is exactly
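One commonly cited cause, sketched below under that assumption (it may not be the poster's exact scenario): a formula created inside a function keeps a reference to that function's environment, and save() serializes everything reachable from it, even objects the returned fit never uses.

make_fit <- function() {
  big <- rnorm(1e6)    # large local object, not part of the returned fit
  lm(y ~ x, data = data.frame(x = 1:10, y = rnorm(10)))   # the formula captures this environment
}
fit <- make_fit()
f <- tempfile(fileext = ".RData")
save(fit, file = f)
file.info(f)$size      # far larger than object.size(fit) would suggest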
2006 Mar 16
2
Difference between weights options in lm, glm and gls.
Dear R-List users, Can anyone explain exactly the difference between the weights options in lm, glm and gls? I try the following codes, but the results are different. > lm1 Call: lm(formula = y ~ x) Coefficients: (Intercept) x 0.1183 7.3075 > lm2 Call: lm(formula = y ~ x, weights = W) Coefficients: (Intercept) x 0.04193 7.30660 > lm3 Call:
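A hedged sketch of one part of the difference (toy data, invented names): lm() and glm() take inverse-variance weights directly, whereas gls() expects a variance-function object, so the same weighting is written as varFixed() on the reciprocal covariate.

library(nlme)
set.seed(1)
d <- data.frame(x = 1:20, W = runif(20, 0.5, 2))
d$y <- 0.1 + 7.3 * d$x + rnorm(20, sd = 1 / sqrt(d$W))
d$invW <- 1 / d$W
lm(y ~ x, data = d, weights = W)                  # Var(e_i) proportional to 1/W_i
gls(y ~ x, data = d, weights = varFixed(~ invW))  # same structure, same coefficients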
2007 Jul 17
1
Speed up computing: looping through data?
Dear all, Please excuse my ignorance, but I am having difficulty with this, and am unable to find help on the website/Google. I have a series of explanatory variables from which I am trying to obtain a parsimonious model. For example, if I have 10 variables, a-j, I am initially looking at the linear relationships amongst them: my.lm1 <- lm(a ~ b+c+d+e+f+g+h+i+j, data=my.data) summary(my.lm1) my.lm2
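Rather than writing each reduced model by hand, a minimal sketch (simulated data standing in for my.data) of two built-in shortcuts: drop1() refits every single-term deletion, and update() modifies an existing fit.

set.seed(1)
my.data <- as.data.frame(matrix(rnorm(200 * 10), ncol = 10))
names(my.data) <- letters[1:10]
my.lm1 <- lm(a ~ b + c + d + e + f + g + h + i + j, data = my.data)
drop1(my.lm1, test = "F")              # F test for dropping each variable in turn
my.lm2 <- update(my.lm1, . ~ . - j)    # same model without j, no retyping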
2004 Aug 26
1
Why terms are dropping out of an lm() model
Hi all! I'm fairly new to R and not too experienced with regression. Because of one or both of those traits, I'm not seeing why some terms are being dropped from my model when doing a regression using lm(). I am trying to do a regression on some experimental data d, which has two numeric predictors, p1 and p2, and one numeric response, r. The aim is to compare polynomial models in p1
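One frequent reason terms vanish, shown as a small sketch on invented data (p1 and r stand in for the poster's variables): inside a formula, p1^2 is formula algebra (crossing), not arithmetic squaring, so the quadratic term collapses back into p1 unless it is protected.

set.seed(1)
d <- data.frame(p1 = runif(30))
d$r <- 1 + 2 * d$p1 + 3 * d$p1^2 + rnorm(30, sd = 0.1)
coef(lm(r ~ p1 + p1^2, data = d))      # the squared term silently disappears
coef(lm(r ~ p1 + I(p1^2), data = d))   # protected with I(), it stays
coef(lm(r ~ poly(p1, 2), data = d))    # orthogonal-polynomial alternative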
2003 Jul 30
2
Comparing two regression slopes
Hello, I've written a simple (although probably overly roundabout) function to test whether two regression slope coefficients from two linear models on independent data sets are significantly different. I'm a bit concerned, because when I test it on simulated data with different sample sizes and variances, the function seems to be extremely sensitive to both of these. I am wondering if
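For comparison, a minimal sketch of the standard alternative on simulated data (group labels and effect sizes are invented): stack the two data sets, add a group factor, and read the slope difference off the interaction term.

set.seed(1)
d <- data.frame(x = rep(1:20, 2), group = rep(c("A", "B"), each = 20))
d$y <- ifelse(d$group == "A", 1 + 2.0 * d$x, 1 + 2.5 * d$x) + rnorm(40)
summary(lm(y ~ x * group, data = d))   # the x:groupB row tests the slope difference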
2007 May 17
4
R2 always increases as variables are added?
Hi, everybody, 3 questions about R-square: ---------(1)----------- Does R2 always increase as variables are added? ---------(2)----------- Is R2 always greater than 1? ---------(3)----------- How is R2 in summary(lm(y~x-1))$r.squared calculated? It is different from (r.square=sum((y.hat-mean(y))^2)/sum((y-mean(y))^2)) I will illustrate these problems by the following codes:
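A brief sketch on simulated data (names invented) touching questions (1) and (3): adding even a pure-noise predictor never lowers the in-sample R-squared, and for a no-intercept model summary() computes R-squared about zero rather than about mean(y), which is why it disagrees with the centred formula.

set.seed(1)
d <- data.frame(y = rnorm(30), x1 = rnorm(30), x2 = rnorm(30))   # x2 is pure noise
summary(lm(y ~ x1, data = d))$r.squared
summary(lm(y ~ x1 + x2, data = d))$r.squared   # never smaller than the line above
summary(lm(y ~ x1 - 1, data = d))$r.squared    # uses sum(y^2), not sum((y - mean(y))^2), as the total SS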
2011 Oct 17
1
plotting issues with PCA
Hi Listers, This has a simple answer but it has been eluding me nonetheless. I have been building a PCA plot from scratch with the ability to plot predefined groups in different colors. This has worked fine, but when I try to get a polygon drawn around each of the groups it is not recognising my colour file correctly and is only printing the first colour in the file... Code is below
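Without the poster's colour file, here is a hedged sketch of the general pattern on the built-in iris data: index a colour vector by group when plotting the scores, then draw a convex hull per group with chull() and polygon().

pca <- prcomp(iris[, 1:4], scale. = TRUE)
scores <- pca$x[, 1:2]
grp  <- iris$Species
cols <- c("red", "blue", "darkgreen")
plot(scores, col = cols[as.integer(grp)], pch = 19)
for (i in seq_along(levels(grp))) {
  pts <- scores[grp == levels(grp)[i], ]
  polygon(pts[chull(pts), ], border = cols[i])   # hull around group i in its own colour
}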
2012 Jul 25
2
Nested Models
Hey, I'm an R newbie and I have been trying to calculate SSEr and SSEc in order to determine if there is sufficient evidence to include second-order terms in my model, but I have no idea what command to use. Any help with this would be much appreciated.
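A minimal sketch of the usual command, using mtcars purely as a stand-in data set: fit the reduced and complete models, read the error sums of squares with deviance(), and let anova() carry out the partial F test.

fit.reduced <- lm(mpg ~ wt + hp, data = mtcars)
fit.full    <- lm(mpg ~ wt + hp + I(wt^2) + I(hp^2) + wt:hp, data = mtcars)
deviance(fit.reduced)          # SSEr, the reduced model's error sum of squares
deviance(fit.full)             # SSEc, the complete model's error sum of squares
anova(fit.reduced, fit.full)   # partial F test for the second-order terms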
2006 Sep 03
2
lm, weights and ...
> lm2 <- function(...) lm(...) > lm2(mpg ~ wt, data=mtcars) Call: lm(formula = ..1, data = ..2) Coefficients: (Intercept) wt 37.285 -5.344 > lm2(mpg ~ wt, weights=cyl, data=mtcars) Error in eval(expr, envir, enclos) : ..2 used in an incorrect context, no ... to look in Can anyone explain why this is happening? (Obviously this is a manufactured example, but it
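The short answer, as I understand it, is that lm() stores its call and later re-evaluates arguments such as weights when building the model frame; once they are hidden behind ..1, ..2 that re-evaluation has no ... to look in. A hedged sketch of a common workaround: rebuild the call so the original argument expressions are passed through intact.

lm2 <- function(...) {
  cl <- match.call()         # the call as typed, argument names and expressions intact
  cl[[1L]] <- quote(lm)      # turn lm2(...) into lm(...)
  eval(cl, parent.frame())   # evaluate where lm2() was called
}
lm2(mpg ~ wt, weights = cyl, data = mtcars)   # now finds cyl in mtcars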
2010 Apr 08
2
Overfitting/Calibration plots (Statistics question)
This isn't a question about R, but I'm hoping someone will be willing to help. I've been looking at calibration plots in multiple regression (plotting observed response Y on the vertical axis versus predicted response [Y hat] on the horizontal axis). According to Frank Harrell's "Regression Modeling Strategies" book (pp. 61-63), when making such a plot on new data
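A small sketch of the effect being discussed, on simulated pure-noise data (so deliberately overfit, not Harrell's example): regressing new-data observations on their predictions gives a calibration slope well below 1.

set.seed(1)
n <- 50; p <- 10
train <- data.frame(matrix(rnorm(n * p), n, p)); train$y <- rnorm(n)
test  <- data.frame(matrix(rnorm(n * p), n, p)); test$y  <- rnorm(n)
fit  <- lm(y ~ ., data = train)                 # overfit: the predictors are pure noise
pred <- predict(fit, newdata = test)
plot(pred, test$y, xlab = "predicted", ylab = "observed"); abline(0, 1, lty = 2)
coef(lm(test$y ~ pred))                         # calibration slope shrunk toward 0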
2008 Jun 04
1
Comparing two regression lines
Dear R users, Suppose I have two different response variables y1, y2 that I regress separately on the same explanatory variable, x; sample sizes are n1=n2. Is it legitimate to compare the regression slopes (equal variances assumed) by using lm(y~x*FACTOR), where FACTOR gets "y1" if y1 is the response, and "y2" if y2 is the response? The problem I see here is that the