similar to: stepwise variable selection with multiple dependent variables

Displaying 20 results from an estimated 4000 matches similar to: "stepwise variable selection with multiple dependent variables"

2011 Jun 21
1
Stepwise Manova
Hello all, I have a question on MANOVA in R: I'm using the function "manova()" from the stats package. Is there anything like a stepwise (backward or forward) MANOVA in R, as there is for regression and ANOVA? When I enter: step(Model1, data=Mydata) R returns the message: Error in drop1.mlm(fit, scope$drop, scale = scale, trace = trace, k = k, : no 'drop1'
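A possible workaround, not from the thread itself: since step() relies on a drop1() method that "mlm" fits lack, backward elimination can be done by hand with update() and anova(), which does handle mlm objects. A minimal sketch with hypothetical responses y1, y2 and predictors x1, x2 (Mydata as in the post):
# Manual backward step for a multivariate lm: drop one term at a time and
# compare with the full fit using anova(), which has an mlm method
full    <- lm(cbind(y1, y2) ~ x1 + x2, data = Mydata)
drop_x1 <- update(full, . ~ . - x1)
drop_x2 <- update(full, . ~ . - x2)
anova(full, drop_x1, test = "Wilks")   # multivariate test for dropping x1
anova(full, drop_x2, test = "Wilks")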
2010 Apr 16
1
Multiple comparisons on Anova.mlm object
I would like to perform multiple comparisons or post-hoc testing on the independent variable in an Anova.mlm object generated by the Anova function of the car package. I have defined a multivariate linear model and subsequently performed a repeated measures ANOVA as per the instructions in section #3 of the following comprehensive tutorial on the subject from the Gribble lab at UWO:
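The Anova.mlm object itself carries no post-hoc method, so one common route (an alternative, not taken from the thread) is to reshape the data to long format and run pairwise paired t-tests with a p-value adjustment. A rough sketch with hypothetical data dat, repeated-measures columns t1..t3 and a Holm correction:
# Reshape to long format, then pairwise paired t-tests with Holm adjustment
long <- reshape(dat, varying = c("t1", "t2", "t3"), v.names = "score",
                timevar = "condition", direction = "long")
pairwise.t.test(long$score, long$condition, paired = TRUE,
                p.adjust.method = "holm")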
2005 Feb 24
2
Forward Stepwise regression based on partial F test
I am hoping to get some advice on the following: I am looking for an automatic variable selection procedure to reduce the number of potential predictor variables (~50) in a multiple regression model. I would be interested in forward stepwise regression using the partial F test. I have looked into possible R functions but could not find this particular approach. There is a function
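add1() reports partial F tests directly, so a crude forward search can be written around it. A sketch only, under assumed names (response y in a data frame preds) and a 0.05 entry threshold; the p-value column is "Pr(>F)" in current R:
current <- lm(y ~ 1, data = preds)           # start from the intercept-only model
full    <- lm(y ~ ., data = preds)           # defines the search scope
repeat {
  cand <- add1(current, scope = formula(full), test = "F")
  p    <- cand[["Pr(>F)"]][-1]               # drop the <none> row
  if (all(is.na(p)) || min(p, na.rm = TRUE) > 0.05) break
  best    <- rownames(cand)[-1][which.min(p)]
  current <- update(current, as.formula(paste(". ~ . +", best)))
}
summary(current)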
2011 Sep 06
1
repeatable segfault
Hi. Mac OS X 10.6.8. With R-2.13.1 and also revision 56948 I get the following repeatable segfault:
wt118:~% R --vanilla --quiet
> R.Version()
$platform
[1] "x86_64-apple-darwin9.8.0"
$arch
[1] "x86_64"
$os
[1] "darwin9.8.0"
$system
[1] "x86_64, darwin9.8.0"
$status
[1] ""
$major
[1] "2"
$minor
[1] "13.1"
$year
[1]
2011 Jun 20
1
Stepwise model comparisons for mlogit
I am trying to perform a backwards stepwise variable selection with an mlogit model. The usual functions, step(), drop1(), and dropterm() do not work for mlogit models. Update() works but I am only able to use it manually, i.e. I have to type in each variable I wish to remove by hand on a separate line. My goal is to write some code that will systematically remove a certain set of variables
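One way to systematise that, sketched here with hypothetical terms v1..v3 and a fitted model m0: remove each candidate in turn and keep the reduction when a likelihood-ratio test is non-significant. Whether the ". ~ . - term" shorthand survives mlogit's multi-part formula should be checked; if not, paste the full formula instead.
m <- m0                                  # fitted mlogit model
for (v in c("v1", "v2", "v3")) {
  reduced <- update(m, as.formula(paste(". ~ . -", v)))  # may need the full formula spelled out
  lrt <- 2 * (as.numeric(logLik(m)) - as.numeric(logLik(reduced)))
  df  <- length(coef(m)) - length(coef(reduced))
  if (pchisq(lrt, df, lower.tail = FALSE) > 0.05) m <- reduced
}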
2013 Jan 23
1
Evaluating the significance of the random effects in GLMM
Hi all! I am working with a GLMM using the binomial family. I use the following code to drop non-significant terms, refitting the model and comparing the fits with a likelihood ratio test: G.1<-lmer(Ymat~stu+spi+stu:spi+(1|ber),data=data,family="binomial") G.1b<-lmer(Ymat~stu+spi+(1|ber),data=data,family="binomial") anova(G.1,G.1b) But, when I want to evaluate the
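To test the random intercept itself (the part cut off above), a commonly used approach, not shown in the excerpt, is to refit the fixed-effects part as an ordinary GLM and compare log-likelihoods; the chi-square reference is conservative because the variance is tested on its boundary, and halving the p-value is a rough correction. A sketch in current lme4 syntax, reusing the poster's variable names:
library(lme4)
g.full <- glmer(Ymat ~ stu + spi + stu:spi + (1 | ber), data = data, family = binomial)
g.null <- glm(Ymat ~ stu + spi + stu:spi, data = data, family = binomial)  # no random effect
lrt <- 2 * (as.numeric(logLik(g.full)) - as.numeric(logLik(g.null)))
pchisq(lrt, df = 1, lower.tail = FALSE) / 2   # halved: variance tested on its boundary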
2007 Aug 02
1
simulate() and glm fits
Dear All, I have been trying to simulate data from a fitted glm using the simulate() function (version details at the bottom). This works for lm() fits and even for lmer() fits (in lme4). However, for glm() fits its output does not make sense to me -- am I missing something or is this a bug? Consider the following count data, modelled as gaussian, poisson and binomial responses: counts
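For comparison, new responses can always be drawn by hand from the fitted distribution, which is what one expects simulate() to do for a glm. A sketch for a Poisson fit with hypothetical data dat and variables counts, treatment:
# Hand-rolled analogue of simulate() for a Poisson glm
fit  <- glm(counts ~ treatment, family = poisson, data = dat)
mu   <- fitted(fit)                                    # expected counts, response scale
sims <- replicate(10, rpois(length(mu), lambda = mu))  # 10 simulated response vectors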
2005 Feb 22
3
Reproducing SAS GLM in R
Hi, I'm still trying to figure out that GLM procedure in SAS. Let's start with the simple example: PROC GLM; MODEL col1 col3 col5 col7 col9 col11 col13 col15 col17 col19 col21 col23 =/nouni; repeated roi 6, ord 2/nom mean; TITLE 'ABDERUS lat ACC 300-500'; That's the same setup that I had in my last email. I have three factors: facSubj, facCond and facRoi. I had this pretty
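The usual R translation of that repeated-measures PROC GLM (not taken from the thread itself) is a multivariate lm() on the 12 response columns plus car::Anova() with an idata/idesign describing the within-subject factors. The data frame name and the ordering of idata below are assumptions and must be checked against the actual column layout:
library(car)
# 12 response columns = 6 roi levels x 2 ord levels; 'abderus' is a placeholder name
mlm.fit <- lm(cbind(col1, col3, col5, col7, col9, col11,
                    col13, col15, col17, col19, col21, col23) ~ 1, data = abderus)
idata <- expand.grid(ord = factor(1:2), roi = factor(1:6))  # assumes ord varies fastest
Anova(mlm.fit, idata = idata, idesign = ~ roi * ord, type = "III")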
2005 Feb 25
0
Bayesian stepwise (was: Forward Stepwise regression based on partial F test)
Oops, forgot to cc the list. Regards, Mike -----Original Message----- From: dr mike [mailto:dr.mike at ntlworld.com] Sent: 24 February 2005 19:21 To: 'Spencer Graves' Subject: RE: [R] Bayesian stepwise (was: Forward Stepwise regression based on partial F test) Spencer, Obviously the problem is one of supersaturation. In view of that, are you aware of the following? A Two-Stage
2012 Nov 02
1
add1() alternative
Hi, I'm trying to build a hierarchical logistic regression model with the lme4 package, but I have a problem selecting the variables to include in this model. In a simple logistic regression, using forward selection, I use a likelihood ratio test to check which variables I should include in the model, via the function add1(). The problem is that this function doesn't work with the
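add1() indeed has no method for lme4 fits, but the same forward likelihood-ratio test can be done by fitting each candidate addition with update() and comparing with anova(). A sketch with hypothetical names (response y, candidates x2..x4, grouping factor group):
library(lme4)
base <- glmer(y ~ x1 + (1 | group), data = dat, family = binomial)
for (v in c("x2", "x3", "x4")) {
  bigger <- update(base, as.formula(paste(". ~ . +", v)))
  print(anova(base, bigger))   # likelihood-ratio chi-square for adding v
}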
2008 Sep 27
10
FW: logistic regression
Sorry. Let me try again then. I am trying to find "significant" predictors from a list of about 44 independent variables. So I started with all 44 variables and ran drop1(sep22lr, test="Chisq")... and then dropped the variable with the highest p-value from the run. Then I reran drop1(). Model: MIN_Mstocked ~ ORG_CODE + BECLBL08 + PEM_SScat + SOIL_MST_1 + SOIL_NUTR + cE + cN +
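That drop-the-worst-term cycle can be wrapped in a loop; a rough sketch around the poster's fit sep22lr (the p-value column is "Pr(>Chi)" in current R, "Pr(Chi)" in older versions):
fit <- sep22lr
repeat {
  d <- drop1(fit, test = "Chisq")
  p <- d[["Pr(>Chi)"]]          # <none> row is NA and is ignored below
  if (all(is.na(p)) || max(p, na.rm = TRUE) < 0.05) break
  worst <- rownames(d)[which.max(p)]
  fit   <- update(fit, as.formula(paste(". ~ . -", worst)))
}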
2006 Feb 16
2
MANOVA: how do I read off within and between Sum-of-Squares info from the manova result?
Hi all, I am experimenting with the function "manova" in R. I tried it on a few data sets, but I did not understand the result: I used "summary(manova_result)" and "summary(manova_result, test='Wilks')" and they gave a bunch of numbers... But I need the sums of squares of the BETWEEN and WITHIN matrices... How do I read them off from R's manova results? Any
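The between- and within-group sums-of-squares-and-products matrices are stored in the summary object. A sketch using the built-in iris data rather than the poster's:
fit <- manova(cbind(Sepal.Length, Sepal.Width) ~ Species, data = iris)
s   <- summary(fit, test = "Wilks")
s$SS$Species     # between-groups (hypothesis) SSP matrix
s$SS$Residuals   # within-groups (error) SSP matrix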
2000 Jun 07
1
forward stepwise selection
Dear R-Help, My problem/bug came to light when fitting a linear model using stepwise selection. I'd started with the straightforward command step(lm(y~., dataset)) This worked fine, but because this starts with all the possible explanatory variables, it results in a model with too many explanatory variables. Hence I wanted to start with just a constant and do forward selection, to get a
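Starting from an intercept-only fit and supplying the full formula as the scope gives forward selection; a sketch using the poster's generic names y and dataset:
null.model <- lm(y ~ 1, data = dataset)
full.scope <- formula(lm(y ~ ., data = dataset))   # upper limit of the search
step(null.model, scope = full.scope, direction = "forward")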
2005 Apr 23
1
question about about the drop1
the data is : >table.8.3<-data.frame(expand.grid( marijuana=factor(c("Yes","No"),levels=c("No","Yes")), cigarette=factor(c("Yes","No"),levels=c("No","Yes")), alcohol=factor(c("Yes","No"),levels=c("No","Yes"))), count=c(911,538,44,456,3,43,2,279))
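With that table the natural next step (not shown in the truncated excerpt) is to fit a Poisson log-linear model and let drop1() test the terms; a sketch:
fit <- glm(count ~ (marijuana + cigarette + alcohol)^2,
           family = poisson, data = table.8.3)
drop1(fit, test = "Chisq")   # LR test for deleting each two-way association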
2002 Feb 08
2
bugs or imperfect implementation?
I am using R to teach, and here are a couple of things that I thought would work but didn't. 1. I noticed the utility data(***,package=***) recently and like it very much, but unless I type in the whole word "package" I'll get an error in 1.4.0. For example, data(cats,package=MASS) works fine but data(cats,pac=MASS) doesn't. 2. drop1 doesn't seem to be as smart as
2008 Sep 30
2
weird behavior of drop1() for polr models (MASS)
I would like to do an SS type III analysis on a proportional odds logistic regression model. I use drop1(), but dropterm() shows the same behaviour. It works as expected for regular main-effects models; however, when the model includes an interaction effect it seems to have problems matching the parameters to the predictor terms. An example: library("MASS"); options(contrasts =
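A self-contained illustration of that setup, using the housing data shipped with MASS rather than the poster's example and sum-to-zero contrasts as in the truncated options() call:
library(MASS)
options(contrasts = c("contr.sum", "contr.poly"))
fit <- polr(Sat ~ Infl * Type + Cont, weights = Freq, data = housing, Hess = TRUE)
dropterm(fit, test = "Chisq")   # the post reports the same behaviour with drop1()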
2008 Aug 01
5
drop1() seems to give unexpected results compare to anova()
Dear all, I have been trying to investigate the behaviour of different weights in weighted regression for a dataset with lots of missing data. As a start I simulated some data using the following: library(MASS) N <- 200 sigma <- matrix(c(1, .5, .5, 1), nrow = 2) sim.set <- as.data.frame(mvrnorm(N, c(0, 0), sigma)) colnames(sim.set) <- c('x1', 'x2') # x1 & x2 are
2005 Oct 20
3
different F test in drop1 and anova
Hi, I was wondering why anova() and drop1() give different tail probabilities for F tests. I guess overdispersion is calculated differently in the following example, but why? Thanks for any advice, Tom For example: > x<-c(2,3,4,5,6) > y<-c(0,1,0,0,1) > b1<-glm(y~x,binomial) > b2<-glm(y~1,binomial) > drop1(b1,test="F") Single term deletions Model: y ~
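For reference, the two tables can be produced side by side on the same toy data; the usual explanation is that the dispersion entering the F statistic is estimated differently by the two functions, though that is an interpretation rather than something stated in the excerpt:
x  <- c(2, 3, 4, 5, 6)
y  <- c(0, 1, 0, 0, 1)
b1 <- glm(y ~ x, binomial)
drop1(b1, test = "F")   # single-term deletion table
anova(b1, test = "F")   # sequential table for the same fit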
2000 May 09
1
Type III Sums of Squares?
Hello, I'd like to propose an extension to the function summary.aov. In Splus (2000, I don't know about other versions), summary.aov allows a parameter ssType to be set to 1 or 3 (defaults to 1) to choose the type of Sums of Squares. I know I can get Type III SS in R with drop1(model), but including the functionality into summary.aov would, in my opinion, - yield a more usable table
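Until summary.aov grows such an argument, two common routes to Type III-style tests (alternatives, not the proposed extension itself) are drop1() over all terms and car::Anova(); both assume sum-to-zero contrasts. A sketch with hypothetical factors A and B:
options(contrasts = c("contr.sum", "contr.poly"))   # needed for sensible Type III tests
fit <- aov(y ~ A * B, data = dat)                   # hypothetical data frame 'dat'
drop1(fit, scope = . ~ ., test = "F")               # marginal ("Type III"-style) F tests
car::Anova(fit, type = "III")                       # same idea via the car package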
2008 Aug 10
1
(Un-)intentional change in drop1() "Chisq" behaviour?
Dear List, I recently tried to reproduce the results of some custom model selection function after updating R, which unfortunately failed. However, I ultimately found the issue to be that testing with pchisq() in drop1() seems to have changed. In the example below, earlier versions (e.g. R 2.4.1) produce a missing P-value for the variable x, while newer versions (e.g. R 2.7.1) produce 0 (2.2e-16).