similar to: Automated Fixed Order Stepwise Regression Function

Displaying 20 results from an estimated 200 matches similar to: "Automated Fixed Order Stepwise Regression Function"

2013 May 03
1
R package for bootstrapping (comparing two quadratic regression models)
Hello, I want to compare two quadratic regression models with a non-parametric bootstrap. However, I do not know which R package serves this purpose (e.g. boot, rms, bootstrap, or DeltaR). Please kindly advise, and thank you. Elaine. The two quadratic regression models are y1 = a1*x^2 + b1*x + c1 (y1 = observed migration distance of butterflies) and y2 = a2*x^2 + b2*x + c2 (y2 = predicted migration distance of
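One way to approach this with the boot package is to resample cases and bootstrap the difference between the two sets of quadratic coefficients. A minimal sketch on simulated stand-in data (the real x, y1 and y2 would come from the butterfly data, which are not shown here):

library(boot)

# simulated stand-in: x plus observed (y1) and predicted (y2) distances
set.seed(1)
dat <- data.frame(x = runif(100, 0, 10))
dat$y1 <- 2 + 0.5 * dat$x + 0.10 * dat$x^2 + rnorm(100)
dat$y2 <- 2 + 0.4 * dat$x + 0.12 * dat$x^2 + rnorm(100)

# statistic: difference between the two quadratic coefficient vectors
coef_diff <- function(data, idx) {
  d <- data[idx, ]
  coef(lm(y1 ~ x + I(x^2), data = d)) - coef(lm(y2 ~ x + I(x^2), data = d))
}

b <- boot(dat, coef_diff, R = 999)     # case resampling
boot.ci(b, type = "perc", index = 3)   # percentile CI for the difference in the x^2 coefficients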
2013 Nov 25
4
lmer specification for random effects: contradictory results
Hi all, I was wondering if someone could help me solve this issue with lmer. In order to find the best mixed-effects model for my data, I compared the following options, following the procedures specified in many papers (e.g. Baayen <http://www.google.it/url?sa=t&rct=j&q=&esrc=s&source=web&cd=1&ved=0CDsQFjAA
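For reference, a minimal sketch of how two candidate random-effects structures are usually compared with lme4; the poster's data are not shown, so the package's built-in sleepstudy data stand in:

library(lme4)

m1 <- lmer(Reaction ~ Days + (1 | Subject),        data = sleepstudy, REML = FALSE)
m2 <- lmer(Reaction ~ Days + (1 + Days | Subject), data = sleepstudy, REML = FALSE)

anova(m1, m2)   # likelihood-ratio comparison of the two random-effects structures
AIC(m1, m2)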
2012 Jul 30
1
te( ) interactions and AIC model selection with GAM
Hello R users, I'm working with a time series of several years, and to analyze it I'm using GAM smoothers from the package mgcv. I'm constructing models where zooplankton biomass (bm) is the dependent variable and the continuous explanatory variables are: time in Julian days (t), to create a long-term linear trend; Julian day of the year (t_year), to create an annual cycle; mean temperature
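A hedged sketch of the kind of comparison described, on simulated stand-in data (bm, t and t_year follow the description above; the temperature column and the simulated values are invented): fit an additive model and a model with a te() interaction, then compare by AIC.

library(mgcv)

set.seed(1)
n <- 500
zoo <- data.frame(t = 1:n)                # Julian days since start
zoo$t_year <- (zoo$t - 1) %% 365 + 1      # day of the year
zoo$temp   <- 10 + 8 * sin(2 * pi * zoo$t_year / 365) + rnorm(n)
zoo$bm     <- 0.002 * zoo$t + 2 * sin(2 * pi * zoo$t_year / 365) +
              0.1 * zoo$temp + rnorm(n)

g1 <- gam(bm ~ s(t) + s(t_year, bs = "cc") + s(temp), data = zoo)
g2 <- gam(bm ~ s(t) + te(t_year, temp, bs = c("cc", "tp")), data = zoo)
AIC(g1, g2)   # additive structure versus tensor-product interaction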
2011 Mar 19
1
strange PREDICTIONS from a PIECEWISE LINEAR (mixed) MODEL
Hi all, when I introduce an interaction in a piecewise model I obtain some quite unusual results. If it is not too much trouble, I would really appreciate your advice. I have reproduced an example below... Many thanks. x <- rnorm(1000); y <- exp(-x) + rnorm(1000); plot(x, y); abline(v = -1, col = 2, lty = 2); mod <- lm(y ~ x + x*(x > -1)); summary(mod); yy <- predict(mod)
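The same model can be written with an explicit segment indicator, which makes the two slopes (and the predictions) easier to read; this is only a re-parameterisation of the example above, not a different fit:

set.seed(1)
x <- rnorm(1000)
y <- exp(-x) + rnorm(1000)

right <- as.numeric(x > -1)              # indicator for the right-hand segment
mod2  <- lm(y ~ x + right + x:right)     # same design matrix as y ~ x + x*(x > -1)
summary(mod2)

grid <- data.frame(x = seq(min(x), max(x), length.out = 200))
grid$right <- as.numeric(grid$x > -1)
plot(x, y); lines(grid$x, predict(mod2, grid), col = 4, lwd = 2)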
2003 Apr 28
2
stepAIC/lme problem (1.7.0 only)
I can use stepAIC on an lme object in 1.6.2, but I get the following error if I try to do the same in 1.7.0: Error in lme(fixed = resp ~ cov1 + cov2, data = a, random = structure(list( : unused argument(s) (formula ...) Does anybody know why? Here's an example: library(nlme) library(MASS) a <- data.frame( resp=rnorm(250), cov1=rnorm(250), cov2=rnorm(250),
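For comparison, a self-contained version of the usual stepAIC/lme workflow (the grouping factor grp is invented here, since the original random-effects term is cut off above); this is not a fix for the 1.7.0 error itself:

library(nlme)
library(MASS)

set.seed(1)
a <- data.frame(resp = rnorm(250), cov1 = rnorm(250), cov2 = rnorm(250),
                grp = factor(rep(1:50, each = 5)))

# method = "ML" so that fits with different fixed effects are comparable by AIC
fit <- lme(resp ~ cov1 + cov2, random = ~ 1 | grp, data = a, method = "ML")
stepAIC(fit)   # on a working installation this prints the AIC path and drops terms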
2013 Nov 25
0
Re: lmer specification for random effects: contradictory results
Dear Thierry, thank you for the quick reply. I have only one question about the approach you proposed. As you suggested, imagine that the model we end up with after the model selection procedure is: mod2.1 <- lmer(dT_purs ~ T + Z + (1 + T + Z | subject), data = x, REML = FALSE). According to the common procedures specified in many manuals and recent papers, if I want to compute the p-values relative to
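One common way to get such a p-value is a likelihood-ratio test: refit the selected model without the fixed effect of interest and compare the two ML fits. This sketch mirrors the thread's model and assumes the poster's data frame x is loaded; whether to also drop the random slope for T is a separate judgment call.

library(lme4)

mod2.1  <- lmer(dT_purs ~ T + Z + (1 + T + Z | subject), data = x, REML = FALSE)
mod2.1b <- lmer(dT_purs ~     Z + (1 + T + Z | subject), data = x, REML = FALSE)
anova(mod2.1b, mod2.1)   # the Chisq row gives the likelihood-ratio p-value for T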
2011 Nov 17
1
Log-transform and specifying Gamma
Dear R help, I am trying to work out whether I am justified in log-transforming the data and specifying Gamma in the same glm. Does it have to be one or the other? I have attached an R script and the data file to show what I mean. Also, I cannot find a mixed model that allows Gamma errors (so I cannot find a way of including random effects). What should I do? Many thanks, Pete
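For what it is worth, the two alternatives usually discussed are either log-transforming the response in a Gaussian model or keeping the raw response and using a Gamma GLM with a log link; doing both at once is normally redundant. A sketch on simulated stand-in data (the names x, y and site are invented):

set.seed(1)
dat <- data.frame(x = runif(200, 0, 5))
dat$y <- rgamma(200, shape = 5, rate = 5 / exp(1 + 0.3 * dat$x))   # positive, skewed response

m_log   <- lm(log(y) ~ x, data = dat)                             # log-transform, Gaussian errors
m_gamma <- glm(y ~ x, family = Gamma(link = "log"), data = dat)   # raw response, Gamma errors

# one option for Gamma errors with random effects is lme4::glmer, e.g.
# glmer(y ~ x + (1 | site), family = Gamma(link = "log"), data = dat)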
2003 Feb 10
2
problems using lqs()
Dear List-members, I found a strange behaviour in the lqs function. Suppose I have the following data: y <- c(7.6, 7.7, 4.3, 5.9, 5.0, 6.5, 8.3, 8.2, 13.2, 12.6, 10.4, 10.8, 13.1, 12.3, 10.4, 10.5, 7.7, 9.5, 12.0, 12.6, 13.6, 14.1, 13.5, 11.5, 12.0, 13.0, 14.1, 15.1) x1 <- c(8.2, 7.6,, 4.6, 4.3, 5.9, 5.0, 6.5, 8.3, 10.1, 13.2, 12.6, 10.4, 10.8, 13.1, 13.3, 10.4, 10.5, 7.7, 10.0, 12.0,
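As a side note, the x1 vector quoted above contains a double comma, which on its own makes c() fail with an "argument ... is empty" error and may be part of the trouble. A self-contained lqs example on simulated data, for comparison with ordinary least squares:

library(MASS)

set.seed(1)
x1 <- rnorm(28); x2 <- rnorm(28)
y  <- 1 + 2 * x1 - x2 + rnorm(28)
y[1:3] <- y[1:3] + 10                          # a few gross outliers

fit_lts <- lqs(y ~ x1 + x2, method = "lts")    # least trimmed squares
fit_ols <- lm(y ~ x1 + x2)
rbind(lqs = coef(fit_lts), ols = coef(fit_ols))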
2010 Sep 08
4
coxph and ordinal variables?
Dear R-help members, apologies - I am posting on behalf of a colleague, who is a little puzzled that Stata and R seem to be yielding different survival estimates for the same dataset when treating a variable as ordinal (ordered() is used to represent the ordinal variable). I understand that R's coxph uses the Efron approximation by default, whereas Stata uses the Breslow approximation by default, but we
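A hedged sketch of how the two defaults can be aligned: ask coxph for Breslow ties to match Stata. The built-in lung data stand in here, and the ordered recoding of ph.ecog is purely for illustration.

library(survival)

lung$stage <- ordered(lung$ph.ecog)

fit_efron   <- coxph(Surv(time, status) ~ stage, data = lung)                     # R default
fit_breslow <- coxph(Surv(time, status) ~ stage, data = lung, ties = "breslow")   # Stata default
rbind(efron = coef(fit_efron), breslow = coef(fit_breslow))

# ordered() also switches R to orthogonal polynomial contrasts (.L, .Q, ...),
# another coding difference from Stata's default handling of ordinal covariates.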
2009 Feb 03
0
lm function
Hi, I do not use the R language for programming, hence I am trying to understand how the models below are specified. I have been reading some web pages about the notation used in the "lm" function, but I have not understood exactly what ":" does. Can someone give me an explanation of the models below, so I can replicate the results in another program? #common intercept
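A short illustration of the operators in question (the data here are invented): ":" adds only the product/interaction term, while "*" adds the main effects plus the interaction.

set.seed(1)
dat <- data.frame(y = rnorm(40), x = rnorm(40), g = gl(2, 20, labels = c("A", "B")))

m1 <- lm(y ~ x + g, data = dat)   # common slope, separate intercepts
m2 <- lm(y ~ x:g,   data = dat)   # common intercept, a separate slope for each level of g
m3 <- lm(y ~ x * g, data = dat)   # separate intercepts and separate slopes
coef(m2)                          # one x coefficient per level: "x:gA" and "x:gB"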
2002 Dec 18
6
Can I build an array of regression models?
Hi, I am trying to use piecewise linear regression to approximate a nonlinear function. Actually, I do not know how many linear functions I need; therefore, I want to build an array of regression models to automate the approximation job. Could you please give me a clue? Attached is the code in progress: rawData = scan("c:/zyang/mass/data/A01/1.PRN", what=list(numeric(),numeric())); len =
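One simple way to hold the segment fits is a plain list. A sketch on simulated stand-in data, with the segmentation chosen arbitrarily at quartile breakpoints (the real breakpoints would have to come from the application):

set.seed(1)
x <- sort(runif(200, 0, 10))
y <- sin(x) + rnorm(200, sd = 0.2)

breaks <- quantile(x, probs = seq(0, 1, length.out = 5))   # 4 segments, assumed
fits <- vector("list", length(breaks) - 1)
for (i in seq_along(fits)) {
  keep <- x >= breaks[i] & x <= breaks[i + 1]
  fits[[i]] <- lm(y[keep] ~ x[keep])
}
lapply(fits, coef)   # intercept and slope of each segment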
2004 Apr 05
3
2 lme questions
Greetings, 1) Is there a nice way of extracting the variance estimates from an lme fit? They don't seem to be part of the lme object. 2) In a series of simulations, I am finding that with ML fitting one of my random effect variances is sometimes being estimated as essentially zero with massive CI instead of the finite value it should have, whilst using REML I get the expected value. I guess
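On question 1, VarCorr() (and intervals()) on an lme fit usually gives what is wanted; a minimal example on the Orthodont data that ship with nlme:

library(nlme)

fit <- lme(distance ~ age, random = ~ 1 | Subject, data = Orthodont)
VarCorr(fit)                             # random-intercept and residual variances
as.numeric(VarCorr(fit)[, "Variance"])   # the same values as plain numbers
intervals(fit, which = "var-cov")        # confidence intervals for the variance parameters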
2011 Feb 05
1
very basic HLM question
Hi everyone, I need to get a between-component variance (e.g. a random-effects ANOVA), but using lmer I do not get the same results (variance components) as when using a random-effects ANOVA. I am using a database of students clustered within schools (the number of students per school differs). According to the ICC1 command, the intraclass correlation is .44 > ICC1(anova1) [1] 0.4414491
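A hedged sketch of the lmer side of this comparison, on simulated, unbalanced stand-in data (the names score and school are invented): fit the random-effects ANOVA and compute the ICC from the variance components.

library(lme4)

set.seed(1)
n_per  <- sample(10:30, 50, replace = TRUE)              # unequal school sizes
school <- factor(rep(seq_along(n_per), times = n_per))
score  <- rnorm(50, sd = 2)[school] + rnorm(length(school), sd = 2.3)

m0  <- lmer(score ~ 1 + (1 | school))                    # random-effects ANOVA
vc  <- as.data.frame(VarCorr(m0))
icc <- vc$vcov[1] / sum(vc$vcov)                         # between-school / total variance
icc                                                      # comparable to ICC1, up to estimation differences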
2012 Jun 06
3
Sobel's test for mediation and lme4/nlme
Hello, any advice or pointers for implementing Sobel's test for mediation in a 2-level model setting? For fitting the hierarchical models I am using "lme4", but I could also revert to "nlme", since it is a relatively simple varying-intercept model and they yield identical estimates. I apologize, as this is an R question with an embedded statistical question. I noticed that a
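A hedged sketch of a Sobel z computed from two lmer fits; all variable names and the simulated data are placeholders, while the delta-method formula itself is the standard Sobel statistic.

library(lme4)

set.seed(1)
g         <- factor(rep(1:30, each = 10))
treatment <- rnorm(300)
mediator  <- 0.5 * treatment + rnorm(30)[g] + rnorm(300)
outcome   <- 0.4 * mediator + 0.1 * treatment + rnorm(30)[g] + rnorm(300)

m_a <- lmer(mediator ~ treatment + (1 | g))              # a path
m_b <- lmer(outcome  ~ treatment + mediator + (1 | g))   # b path, controlling for treatment

s_a <- summary(m_a)$coefficients
s_b <- summary(m_b)$coefficients
a  <- s_a["treatment", "Estimate"]; sa <- s_a["treatment", "Std. Error"]
b  <- s_b["mediator",  "Estimate"]; sb <- s_b["mediator",  "Std. Error"]

z <- (a * b) / sqrt(a^2 * sb^2 + b^2 * sa^2)             # Sobel statistic
p <- 2 * pnorm(-abs(z))
c(z = z, p = p)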
2012 Jun 04
1
Chi square value of anova(binomialglmnull, binomglmmod, test="Chisq")
Hi all, I have done a backward stepwise selection on a full binomial GLM where the response variable is gender. At the end of the selection I have found one model with only one explanatory variable (cohort, a factor variable with 10 levels). I want to test the significance of the variable "cohort", which, I believe, is the same as the significance of this selected model: >
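That comparison is the likelihood-ratio test of the null model against the cohort-only model; a minimal sketch on simulated stand-in data:

set.seed(1)
dat <- data.frame(gender = rbinom(300, 1, 0.5),
                  cohort = factor(sample(1:10, 300, replace = TRUE)))

m0 <- glm(gender ~ 1,      family = binomial, data = dat)   # null model
m1 <- glm(gender ~ cohort, family = binomial, data = dat)   # selected model
anova(m0, m1, test = "Chisq")   # deviance difference on 9 df for a 10-level factor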
2005 Jul 11
2
CIs in predict?
Dear all, I am trying to put some confidence intervals on some regressions from a linear model, with no luck. I can extract the fitted values using 'predict', but I am having difficulty getting at the confidence intervals or the standard errors. Any suggestions would be welcome. Cheers, Guy. Using version 2.1.0 (2005-04-18) on a PC. vol.mod3 <-
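For an lm fit, predict() can return the intervals and standard errors directly; a minimal sketch on the built-in cars data (vol.mod3 itself is not shown above):

fit <- lm(dist ~ speed, data = cars)
new <- data.frame(speed = c(10, 15, 20))

predict(fit, new, interval = "confidence")   # CI for the mean response
predict(fit, new, interval = "prediction")   # wider: for individual new observations
predict(fit, new, se.fit = TRUE)$se.fit      # pointwise standard errors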
2009 Apr 23
3
Interpreting the results of Friedman test
Hello, I have problems interpreting the results of a Friedman test. It seems to me that the p-value resulting from a Friedman test, and with it the "significance", has to be interpreted differently from the p-value resulting from, e.g., an ANOVA? Let me describe the problem in some detail: I'm testing a lot of different hypotheses in my observer study, and only for some the premises
2006 Nov 08
0
Mod3 Solaris Container Hosting
Has anyone tried the Solaris containers at http://www.mod3.co.uk for hosting a Rails application? How does it compare to Media Temple's Rails container? Tim Welsh
2005 Aug 29
1
Different signs for correlations in OLS and TSA
Dear list, I am trying to re-analyse something. I have two time series, one of which (ts.mar) might help explain the other (ts.anr). In the original analysis no one seems to have cared about the data being time series, and they just did OLS. This yielded a strong positive correlation. I want to know whether this correlation is still as strong when the autocorrelations are taken into account.
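A hedged sketch of one way to redo the regression while allowing autocorrelated errors, using nlme::gls with an AR(1) structure (the two series below are simulated stand-ins for ts.mar and ts.anr):

library(nlme)

set.seed(1)
n      <- 60
ts.mar <- arima.sim(list(ar = 0.7), n)
ts.anr <- 0.5 * ts.mar + arima.sim(list(ar = 0.7), n)
dat    <- data.frame(anr = as.numeric(ts.anr), mar = as.numeric(ts.mar), time = 1:n)

fit_ols <- gls(anr ~ mar, data = dat)                                       # ignores autocorrelation
fit_ar1 <- gls(anr ~ mar, data = dat, correlation = corAR1(form = ~ time))
summary(fit_ar1)$tTable   # is the mar coefficient still positive and as strong?
anova(fit_ols, fit_ar1)   # does allowing AR(1) errors improve the fit?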
2005 Sep 15
1
Coefficients from LM
Hi everyone, can anyone tell me if it is possible to extract the coefficients from the lm() command? For instance, imagine that we have the following data set (the number of observations for each company is actually larger than the one shown...):
Company  Y    X1    X2
1        y_1  x1_1  x2_1
1        y_2  x1_2  x2_2
1        y_3  x1_3  x2_3
(...)
2        y_4  x1_4  x2_4
2        y_5  x1_5  x2_5
2        y_6  x1_6  x2_6
(...)
n        y_n  x1_n  x2_n
n
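For reference, coef() extracts the coefficients from any lm fit, and split()/lapply() give one fit per company. A sketch on simulated stand-in data mirroring the layout above:

set.seed(1)
dat <- data.frame(Company = factor(rep(1:5, each = 20)),
                  X1 = rnorm(100), X2 = rnorm(100))
dat$Y <- 1 + 2 * dat$X1 - dat$X2 + rnorm(100)

fits  <- lapply(split(dat, dat$Company), function(d) lm(Y ~ X1 + X2, data = d))
coefs <- t(sapply(fits, coef))   # one row of intercept/slope estimates per company
coefs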