search for: mtri

Displaying 20 results from an estimated 92 matches for "mtri".

2011 Nov 16
0
problem tuning RandomForest, an unexpected result
Dear Researchers, I am using RF (in regression mode) to analyse several metrics extracted from images. I am tuning RF with a loop over different ranges of mtry, ntree and nodesize, keeping the settings with the lowest OOB MSE: mtry from 1 to 5, nodesize from 1 to 10, ntree from 1 to 500, using this paper as a reference: Palmer, D. S., O'Boyle, N. M., Glen, R. C., & Mitchell, J. B. O. (2007). Random Forest Models
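A minimal sketch of the tuning loop described above, assuming a hypothetical predictor data frame x and numeric response y in place of the image metrics; for a regression forest the OOB MSE is the last element of rf$mse:

library(randomForest)
grid <- expand.grid(mtry = 1:5, nodesize = 1:10, ntree = c(100, 250, 500))
grid$oob_mse <- NA
for (i in seq_len(nrow(grid))) {
  set.seed(1)
  rf <- randomForest(x, y,
                     mtry     = grid$mtry[i],
                     nodesize = grid$nodesize[i],
                     ntree    = grid$ntree[i])
  grid$oob_mse[i] <- tail(rf$mse, 1)   # OOB MSE after the last tree
}
grid[which.min(grid$oob_mse), ]        # setting with the lowest OOB MSE
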
2012 Jun 15
0
argument "x" is missing, with no default - Please help find argument x
R programming question, not machine learning, although that's the content. Apologies to all for whom the following code is eye-burning. I am using foreach() to run a simulation on a randomForest model (actually conditional randomForest ... "party" package). The simulation is in two dimensions, examining how "mtry" and "ntree" are related in terms of predictive
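The post's own code is not shown, so the following is only a hedged sketch of the general pattern: a two-dimensional grid over mtry and ntree run with foreach() and party's cforest(), assuming a hypothetical data frame dat with a numeric response y:

library(foreach)
library(party)
grid <- expand.grid(mtry = c(2, 4, 8), ntree = c(100, 500, 1000))
res <- foreach(i = seq_len(nrow(grid)), .combine = rbind) %do% {
  fit <- cforest(y ~ ., data = dat,
                 controls = cforest_unbiased(mtry  = grid$mtry[i],
                                             ntree = grid$ntree[i]))
  pred <- predict(fit, OOB = TRUE)               # out-of-bag predictions
  data.frame(grid[i, ], mse = mean((dat$y - pred)^2))
}
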
2004 Oct 13
1
random forest - optimising mtry
Dear R-helpers, I'm working on mass spectra in randomForest/R, and following the recommendations for the case of noisy variables, I don't want to use the default mtry (sqrt of nvariables), but I'm not sure up to what proportion mtry/nvariables it makes sense to increase mtry without "overtuning" RF. Let me describe my example: I have 106 spectra belonging to 4 classes, the
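One way to explore this, sketched under the assumption of a hypothetical predictor matrix spectra and a 4-level class factor cls, is to record the OOB error rate over a range of mtry values expressed as fractions of the number of variables:

library(randomForest)
p <- ncol(spectra)
mtry_vals <- unique(round(p * c(0.05, 0.1, 0.25, 0.5, 1)))
oob <- sapply(mtry_vals, function(m) {
  set.seed(1)
  rf <- randomForest(spectra, cls, mtry = m, ntree = 1000)
  rf$err.rate[rf$ntree, "OOB"]          # OOB error after the last tree
})
data.frame(mtry = mtry_vals, oob_error = oob)
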
2005 Jul 21
4
RandomForest question
Hello, I'm trying to find out the optimal number of variables tried at each split (the mtry parameter) for a randomForest classification. The classification is binary and there are 32 explanatory variables (mostly factors, each with up to 4 levels, but also some numeric variables) and 575 cases. I've seen that although there are only 32 explanatory variables the best classification performance is reached when
2011 Nov 17
1
tuning random forest. An unexpected result
Dear Researchers, I am using RF (in regression mode) to analyse several metrics extracted from images. I am tuning RF with a loop over different ranges of mtry, ntree and nodesize, keeping the settings with the lowest OOB MSE: mtry from 1 to 5, nodesize from 1 to 10, ntree from 1 to 500, using this paper as a reference: Palmer, D. S., O'Boyle, N. M., Glen, R. C., & Mitchell, J. B. O. (2007). Random Forest Models
2010 Dec 21
1
randomForest: tuneRF error
Just curious if anyone else has got this error before, and if so, would know what I could do (if anything) to get past it: > mtry <- tuneRF(training, trainingdata$class, ntreeTry = 500, stepFactor = 2, improve = 0.05, trace = TRUE, plot = TRUE, doBest = FALSE) mtry = 13 OOB error = 0.62% Searching left ... mtry = 7 OOB error = 1.38% -1.222222 0.05 Searching right ... mtry = 26
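For comparison, a self-contained tuneRF() call on a built-in data set (iris, used here only as a stand-in) that runs cleanly; it can also help to check that the predictor frame passed as the first argument does not itself contain the response column:

library(randomForest)
set.seed(1)
mtry <- tuneRF(iris[, -5], iris$Species,    # predictors only, response given separately
               ntreeTry = 500, stepFactor = 2, improve = 0.05,
               trace = TRUE, plot = FALSE, doBest = FALSE)
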
2005 Jan 06
1
different result from the same errorest() in library( ipred)
Dear all, Can anybody explain this: different results are obtained when exactly the same parameters are used in errorest() from the ipred library, as follows? errorest(Species ~ ., data=iris, model=randomForest, estimator = "cv", est.para=control.errorest(k=3), mtry=2)$err [1] 0.03333333 > errorest(Species ~ ., data=iris, model=randomForest, estimator = "cv",
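The cross-validation folds (and the forests themselves) are drawn at random, so repeated errorest() calls differ unless the seed is fixed; a small sketch:

library(ipred)
library(randomForest)
set.seed(42)
errorest(Species ~ ., data = iris, model = randomForest,
         estimator = "cv", est.para = control.errorest(k = 3), mtry = 2)$error
set.seed(42)
errorest(Species ~ ., data = iris, model = randomForest,
         estimator = "cv", est.para = control.errorest(k = 3), mtry = 2)$error
# with the same seed set before each call, both runs return the same estimate
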
2007 Oct 11
1
random forest mtry and mse
I have been using random forest on a data set with 226 sites and 36 explanatory variables (continuous and categorical). When I use "tune.randomforest" to determine the best value to use in "mtry" there is a fairly consistent and steady decrease in MSE, with the optimum of "mtry" usually equal to 1. Why would that occur, and what does it signify? What I would
2009 Aug 13
2
randomForest question--problem with ntree
Hi, I would like to use a randomForest model to get an idea about which variables from a dataset may have some prognostic significance in a smallish study. The default for the number of trees seems to be 500. I tried changing the default to ntree=2000 or ntree=200 and the results appear identical. I have changed mtry from mtry=5 to mtry=6 successfully. I have seen the same problem on both a Windows
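A quick way to verify that ntree is actually taking effect, sketched with iris as a stand-in for the study data: the fitted object records the number of trees and keeps one OOB error entry per tree grown.

library(randomForest)
set.seed(1)
rf_small <- randomForest(Species ~ ., data = iris, ntree = 200)
set.seed(1)
rf_large <- randomForest(Species ~ ., data = iris, ntree = 2000)
rf_small$ntree            # 200
rf_large$ntree            # 2000
nrow(rf_small$err.rate)   # one row of OOB error per tree
nrow(rf_large$err.rate)
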
2007 Nov 12
1
mtry in ctree_control()
Dear Group, What is the actual usage of "mtry" in ctree(), or specifically, ctree_control() since it's a single tree? Thanks in advance. Regards, Kelvin Lam, MSc. Analyst, Programming & Biostatistics Institute for Clinical Evaluative Sciences (ICES) 2075 Bayview Avenue, G179 Toronto, ON M4N 3M5 (416) 480-4055 Ext. 3057 Fax: (416) 480-6048 email:
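In party, mtry in ctree_control() limits each split of the (single) tree to a random subset of inputs; the default is no random selection, so all inputs are candidates at every split. A small sketch:

library(party)
# default: every input is a candidate at each split
ct_all  <- ctree(Species ~ ., data = iris)
# mtry = 2: each split considers a random subset of 2 inputs
ct_mtry <- ctree(Species ~ ., data = iris,
                 controls = ctree_control(mtry = 2))
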
2012 Aug 01
0
Questions regarding MCRestimate package
Hello, I'm currently using the MCRestimate package and I have a question regarding the MCRestimate function. Here is my code: NestedCV.rf <- MCRestimate(eset, "Class", classification.fun="RF.wrap", variableSel.fun="varSel.highest.var", poss.parameters=list(var.numbers=c(100), mtry=c(10,50)), cross.outer=10, cross.inner=10, cross.repeat=3) I'm pretty sure that I
2010 Mar 23
1
caret package, how can I deal with RFE+SVM wrong message?
Hello, I am learning the caret package, and I want to use RFE to reduce the features. I want to use RFE coupled with Random Forest (RFE+RF) to complete this task. As we know, there are a number of pre-defined sets of functions, like random forest (rfFuncs); however, I want to tune the parameters (mtry) during RFE, so I wrote the code below, but I get an error message. How can I deal with it?
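A minimal RFE setup with caret's predefined random-forest functions, assuming a hypothetical predictor data frame x and factor outcome y; additional arguments such as ntree are passed through to randomForest via ... (a tuned mtry must not exceed the smallest subset size tried):

library(caret)
library(randomForest)
set.seed(1)
ctrl <- rfeControl(functions = rfFuncs, method = "cv", number = 10)
fit  <- rfe(x, y, sizes = c(5, 10, 20), rfeControl = ctrl, ntree = 500)
fit$optVariables   # predictors retained at the best subset size
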
2006 Jul 26
0
randomForest question [Broadcast]
When mtry is equal to the total number of features, you just get regular bagging (in the R package -- Breiman & Cutler's Fortran code samples variables with replacement, so you can't do bagging with that). There are cases when bagging will do better than random feature selection (i.e., RF), even in simulated data, but I'd say not very often. HTH, Andy From: Arne.Muller at
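To illustrate the point, setting mtry to the number of predictors in the R package reduces the random forest to bagging:

library(randomForest)
set.seed(1)
p   <- ncol(iris) - 1                                      # 4 predictors
bag <- randomForest(Species ~ ., data = iris, mtry = p)    # bagging: all variables tried at every split
rf  <- randomForest(Species ~ ., data = iris)              # default mtry = floor(sqrt(p)) = 2
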
2013 Feb 03
3
RandomForest, Party and Memory Management
Dear All, For a data mining project, I am relying heavily on the RandomForest and Party packages. Due to the large size of the data set, I often have memory problems (in particular with the Party package; RandomForest seems to use less memory). I really have two questions at this point. 1) Please see how I am using the Party and RandomForest packages. Any comment is welcome and useful.
2003 Apr 12
5
rpart vs. randomForest
Greetings. I'm trying to determine whether to use rpart or randomForest for a classification tree. Has anybody tested efficacy formally? I've run both and the confusion matrix for rf beats rpart. I've been looking at the rf help page and am unable to figure out how to extract the tree. But more than that, I'm looking for a more comprehensive user's guide for randomForest including
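Individual trees can be pulled out of a fitted forest with getTree(); a small sketch:

library(randomForest)
set.seed(1)
rf <- randomForest(Species ~ ., data = iris, ntree = 100)
getTree(rf, k = 1, labelVar = TRUE)   # first tree, with variable and class labels
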
2012 Dec 03
2
Different results from random.Forest with test option and using predict function
Hello R Gurus, I am perplexed by the different results I obtained when I ran code like this: set.seed(100) test1<-randomForest(BinaryY~., data=Xvars, trees=51, mtry=5, seed=200) predict(test1, newdata=cbind(NewBinaryY, NewXs), type="response") and this code: set.seed(100) test2<-randomForest(BinaryY~., data=Xvars, trees=51, mtry=5, seed=200, xtest=NewXs, ytest=NewBinarY) The
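Two details worth checking here: randomForest's argument is ntree (not trees), and it has no seed argument, so randomness is controlled only by set.seed() before the call. A hedged sketch of a like-for-like comparison, reusing the post's hypothetical data names; keep.forest=TRUE is needed because it defaults to FALSE when xtest is supplied:

library(randomForest)
set.seed(100)
rf <- randomForest(BinaryY ~ ., data = Xvars, ntree = 51, mtry = 5,
                   xtest = NewXs, ytest = NewBinaryY, keep.forest = TRUE)
p1 <- rf$test$predicted                            # predictions stored for xtest
p2 <- predict(rf, newdata = NewXs, type = "response")
table(p1, p2)                                      # compare the two sets of predictions
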
2002 Apr 02
2
random forests for R
Hi all, There is now a package available on CRAN that provides an R interface to Leo Breiman's random forest classifier. Basically, random forest does the following: 1. Select ntree, the number of trees to grow, and mtry, a number no larger than number of variables. 2. For i = 1 to ntree: 3. Draw a bootstrap sample from the data. Call those not in the bootstrap sample the
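In the R package the two quantities map directly onto the ntree and mtry arguments; for example:

library(randomForest)
set.seed(1)
rf <- randomForest(Species ~ ., data = iris,
                   ntree = 500,   # number of trees to grow
                   mtry  = 2)     # variables tried at each split
print(rf)                         # OOB confusion matrix and error rate
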
2005 Aug 15
2
randomForest Error passing string argument
I'm attempting to pass a string argument into the function randomForest but I get an error: state <- paste(list("fruit ~", "apples+oranges+blueberries", "data=fruits.data, mtry=2, do.trace=100, na.action=na.omit, keep.forest=TRUE"), sep= " ", collapse="") model.rf <- randomForest(state) Error in if (n==0) stop ("data(x) has 0
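randomForest() expects a formula object (or x and y), not a character string; one hedged fix, keeping the hypothetical names from the post, is to build the formula with as.formula() and pass the remaining settings as real arguments:

library(randomForest)
f <- as.formula(paste("fruit ~", "apples + oranges + blueberries"))
model.rf <- randomForest(f, data = fruits.data,            # fruits.data from the post
                         mtry = 2, do.trace = 100,
                         na.action = na.omit, keep.forest = TRUE)
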
2005 Mar 22
2
Error: Can not handle categorical predictors with more than 32 categories.
Hi All, My question is regarding an error generated when using randomForest in R. Is there a special way to format the data in order to avoid this error, or am I completely confused about what the error implies? "Error in randomForest.default(m, y, ...) : Can not handle categorical predictors with more than 32 categories." This is generated from the command line: >
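The error means at least one factor predictor has more than 32 levels, which this randomForest implementation cannot split on. A sketch of how to locate such columns and, as one possible workaround, pool the rare levels; the data frame mydata and column site are hypothetical:

# which factor columns exceed the 32-level limit?
sapply(Filter(is.factor, mydata), nlevels)

# one workaround: keep the 31 most frequent levels and pool the rest
keep <- names(sort(table(mydata$site), decreasing = TRUE))[1:31]
mydata$site <- factor(ifelse(mydata$site %in% keep,
                             as.character(mydata$site), "other"))
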