search for: mtry

Displaying 20 results from an estimated 92 matches for "mtry".

2011 Nov 16
0
problem to tunning RandomForest, an unexpected result
Dear Researchers, I am using RF (in regression mode) to analyze several metrics extracted from images. I am tuning RF with a loop over different ranges of mtry, ntree and nodesize, choosing the combination with the lowest OOB MSE: mtry from 1 to 5, nodesize from 1 to 10, ntree from 1 to 500, using this paper as a reference: Palmer, D. S., O'Boyle, N. M., Glen, R. C., & Mitchell, J. B. O. (2007). Random Forest Models To Predict Aqueous Solubility. Journal of Chemical Informa...
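A minimal sketch of the kind of tuning loop described in this post, assuming a regression data frame df with a numeric response y (both hypothetical placeholders); the combination is chosen by the OOB MSE that randomForest records per tree:

library(randomForest)

## grid of candidate parameters, as in the post
grid <- expand.grid(mtry = 1:5, nodesize = 1:10)
grid$oob_mse <- NA

for (i in seq_len(nrow(grid))) {
  fit <- randomForest(y ~ ., data = df,
                      mtry     = grid$mtry[i],
                      nodesize = grid$nodesize[i],
                      ntree    = 500)
  grid$oob_mse[i] <- fit$mse[fit$ntree]   # OOB MSE after the last tree
}

grid[which.min(grid$oob_mse), ]           # combination with the lowest OOB MSE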
2012 Jun 15
0
argument "x" is missing, with no default - Please help find argument x
...chine learning, although that's the content. Apologies to all for whom the following code is eye-burning. I am using foreach() to run a simulation on a randomForest model (actually conditional randomForest ... "party" package). The simulation is in two dimensions, examining how "mtry" and "ntrees" are related in terms of predictive accuracy in ten-fold cross-validation. My problem is one of functional programming. The "loops" for the simulation are functionalised so they can be passed to foreach and bundled off to my 4 cores. However, I'm making a mess o...
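A simplified sketch of parallelising such a parameter grid with foreach(); the original post used party::cforest inside ten-fold cross-validation, but randomForest's OOB error stands in here to keep the example short:

library(foreach)
library(doParallel)
library(randomForest)

registerDoParallel(cores = 4)                       # bundle work off to 4 cores
grid <- expand.grid(mtry = c(1, 2, 4), ntree = c(100, 500))

res <- foreach(i = seq_len(nrow(grid)), .combine = rbind,
               .packages = "randomForest") %dopar% {
  fit <- randomForest(Species ~ ., data = iris,
                      mtry = grid$mtry[i], ntree = grid$ntree[i])
  data.frame(grid[i, ], oob_err = fit$err.rate[fit$ntree, "OOB"])
}
res                                                 # OOB error over the mtry x ntree grid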
2004 Oct 13
1
random forest -optimising mtry
Dear R-helpers, I'm working on mass spectra in randomForest/R, and following the recommendations for the case of noisy variables, I don't want to use the default mtry (sqrt of nvariables), but I'm not sure up to which proportion mtry/nvariables it makes sense to increase mtry without "overtuning" RF. Let me describe my example: I have 106 spectra belonging to 4 classes, the number of variables is 172. I'm interested in finding information about...
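One way to probe how far mtry can usefully be increased is to compare the OOB error over a range of values between sqrt(nvariables) and nvariables; a small sketch, with iris standing in for the spectra:

library(randomForest)

p <- ncol(iris) - 1                               # number of predictor variables
mtry_vals <- unique(c(floor(sqrt(p)), round(p * c(0.25, 0.5, 0.75, 1))))

oob <- sapply(mtry_vals, function(m) {
  fit <- randomForest(Species ~ ., data = iris, mtry = m, ntree = 1000)
  fit$err.rate[fit$ntree, "OOB"]                  # OOB error for this mtry
})
setNames(oob, mtry_vals)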
2005 Jul 21
4
RandomForest question
Hello, I'm trying to find out the optimal value of mtry (the number of variables tried at each split) for a randomForest classification. The classification is binary and there are 32 explanatory variables (mostly factors with up to 4 levels each, but also some numeric variables) and 575 cases. I've seen that although there are only 32 explanatory variables the best classification per...
2011 Nov 17
1
tuning random forest. An unexpected result
Dear Researchers, I am using RF (in regression mode) to analyze several metrics extracted from images. I am tuning RF with a loop over different ranges of mtry, ntree and nodesize, choosing the combination with the lowest OOB MSE: mtry from 1 to 5, nodesize from 1 to 10, ntree from 1 to 500, using this paper as a reference: Palmer, D. S., O'Boyle, N. M., Glen, R. C., & Mitchell, J. B. O. (2007). Random Forest Models To Predict Aqueous Solubility. Journal of Chemical Informa...
2010 Dec 21
1
randomForest: tuneRF error
Just curious if anyone else has got this error before, and if so, whether you know what I could do (if anything) to get past it: > mtry <- tuneRF(training, trainingdata$class, ntreeTry = 500, stepFactor = 2, improve = 0.05, trace = TRUE, plot = TRUE, doBest = FALSE) mtry = 13 OOB error = 0.62% Searching left ... mtry = 7 OOB error = 1.38% -1.222222 0.05 Searching right ... mtry = 26 OOB error = 0.24% 0.6111111 0.05...
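For reference, a self-contained tuneRF call with the same arguments, where x and y come from the same data frame (iris used here as a stand-in):

library(randomForest)

set.seed(1)
mtry_res <- tuneRF(x = iris[, 1:4], y = iris$Species,
                   ntreeTry = 500, stepFactor = 2, improve = 0.05,
                   trace = TRUE, plot = TRUE, doBest = FALSE)
mtry_res          # matrix of the mtry values tried and their OOB error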
2005 Jan 06
1
different result from the same errorest() in library( ipred)
Dear all, Can anybody explain this: different results are obtained when exactly the same parameters are used in errorest() from library ipred, as follows? errorest(Species ~ ., data=iris, model=randomForest, estimator = "cv", est.para=control.errorest(k=3), mtry=2)$err [1] 0.03333333 > errorest(Species ~ ., data=iris, model=randomForest, estimator = "cv", est.para=control.errorest(k=3), mtry=2)$err [1] 0.04 > errorest(Species ~ ., data=iris, model=randomForest, estimator = "cv", est.para=control.errorest(k=3), mtry=2)$err [1]...
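The cross-validation folds (and the forests themselves) are random, so repeated calls will generally differ; fixing the RNG seed before each call should make them agree. A small check along those lines, using the same iris setup:

library(ipred)
library(randomForest)

set.seed(42)
e1 <- errorest(Species ~ ., data = iris, model = randomForest,
               estimator = "cv", est.para = control.errorest(k = 3),
               mtry = 2)$error

set.seed(42)
e2 <- errorest(Species ~ ., data = iris, model = randomForest,
               estimator = "cv", est.para = control.errorest(k = 3),
               mtry = 2)$error

c(e1, e2)         # identical once the seed is fixed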
2007 Oct 11
1
random forest mtry and mse
I have been using random forest on a data set with 226 sites and 36 explanatory variables (continuous and categorical). When I use "tune.randomforest" to determine the best value to use in "mtry" there is a fairly consistent and steady decrease in MSE, with the optimum of "mtry" usually equal to 1. Why would that occur, and what does it signify? What I would assume is that most of my explanatory variables have little to no explanatory power. Does that sound about right?...
2009 Aug 13
2
randomForest question--problem with ntree
...ike to use a random Forest model to get an idea about which variables from a dataset may have some prognostic significance in a smallish study. The default for the number of trees seems to be 500. I tried changing the default to ntree=2000 or ntree=200 and the results appear identical. Have changed mtry from mtry=5 to mtry=6 successfully. Have seen the same problem on both a Windows machine and our linux system running R 2.8 and 2.9. Sample code follows. Thanks in advance for help. Mary > m1<-as.formula(paste("as.factor(EAD)~", paste(names(clin_b)[c(5,7,10:36 )], collapse="+&q...
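A quick way to confirm that a non-default ntree is actually taking effect: the fitted object records it, and the OOB error trace has one row per tree (iris used here as a stand-in):

library(randomForest)

m_small <- randomForest(Species ~ ., data = iris, ntree = 200)
m_big   <- randomForest(Species ~ ., data = iris, ntree = 2000)

c(m_small$ntree, m_big$ntree)     # 200 and 2000
dim(m_small$err.rate)             # 200 rows, one per tree grown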
2007 Nov 12
1
mtry in ctree_control()
Dear Group, What is the actual usage of "mtry" in ctree(), or specifically, ctree_control() since it's a single tree? Thanks in advance. Regards, Kelvin Lam, MSc. Analyst, Programming & Biostatistics Institute for Clinical Evaluative Sciences (ICES) 2075 Bayview Avenue, G179 Toronto, ON M4N 3M5 (416) 480-4055 Ext. 305...
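For what it's worth, ctree_control()'s mtry makes every split of the (single) tree consider only a random subset of the inputs, as in a random forest; the default mtry = 0 means all inputs are candidates at each split. A small illustration:

library(party)

set.seed(1)
ct_all  <- ctree(Species ~ ., data = iris,
                 controls = ctree_control(mtry = 0))   # all inputs at each split
ct_rand <- ctree(Species ~ ., data = iris,
                 controls = ctree_control(mtry = 2))   # 2 random inputs per split
ct_rand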
2012 Aug 01
0
Questions regarding MCRestimate package
I'm currently using the MCRestimate package and I have a question regarding the MCRestimate function. Here is my code: NestedCV.rf<-MCRestimate(eset, "Class", classificatin.fun="RF.wrap", variableSel.fun="varSel.highest.var", poss.parameters= list(var.numbers=c(100), mtry=c(10,50), cross.outer=10,cross.inner=10,cross.repeat=3) I'm pretty sure that I was providing "eset" and "Class" correctly. But I was always getting a warning message: In MCRestimate.default(eset,"Class",classification.fun="RF.wrap",: There may be at leas...
2010 Mar 23
1
caret package, how can I deal with RFE+SVM wrong message?
...FE+RF) to complete this task. As we know, there are a number of pre-defined sets of functions, like random forest (rfFuncs); however, I want to tune the parameters (mtry) during RFE, so I wrote the code below, but I get an error message. How can I deal with it? > rfGrid<-expand.grid(.mtry=c(1:2)) > rfectrl<-rfeControl(functions=caretFuncs,method="cv",verbose=F,returnResamp="final",number=10) > subsets<-c(3,4) > set.seed(2) > rf.RFE<-rfe(trx,try,sizes=subsets,rfeControl=rfectrl,method="rf",tuneGrid=rfGrid) Loading required package: c...
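One possible sketch of what the post is attempting, with iris standing in for trx/try: with functions = caretFuncs, extra arguments to rfe() such as method and tuneGrid are passed through to train(). Note that older caret versions expected the grid column to be named .mtry (as in the post), while recent versions expect mtry:

library(caret)
library(randomForest)

rfGrid  <- expand.grid(mtry = 1:2)          # use .mtry with old caret versions
rfectrl <- rfeControl(functions = caretFuncs, method = "cv", number = 10,
                      returnResamp = "final", verbose = FALSE)
set.seed(2)
rf.RFE <- rfe(x = iris[, 1:4], y = iris$Species, sizes = c(3, 4),
              rfeControl = rfectrl, method = "rf", tuneGrid = rfGrid)
rf.RFE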
2006 Jul 26
0
randomForest question [Broadcast]
When mtry is equal to the total number of features, you just get regular bagging (in the R package -- Breiman & Cutler's Fortran code samples variables with replacement, so you can't do bagging with that). There are cases when bagging will do better than random feature selection (i.e., RF), even in s...
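In code, that means bagging falls out of the R package simply by setting mtry to the number of predictors:

library(randomForest)

p <- ncol(iris) - 1
bagged <- randomForest(Species ~ ., data = iris, mtry = p, ntree = 500)
bagged$mtry       # equals p: every variable is a split candidate, i.e. bagging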
2013 Feb 03
3
RandomForest, Party and Memory Management
...d RandomForest packages. Any comment is welcome and useful. myparty <- cforest(SalePrice ~ ModelID+ ProductGroup+ ProductGroupDesc+MfgYear+saledate3+saleday+ salemonth, data = trainRF, control = cforest_unbiased(mtry = 3, ntree=300, trace=TRUE)) rf_model <- randomForest(SalePrice ~ ModelID+ ProductGroup+ ProductGroupDesc+MfgYear+saledate3+saleday+ salemonth, data = trainRF,na.action = na.omit, importance=TRUE, do.trac...
2003 Apr 12
5
rpart vs. randomForest
...ooking for a more comprehensive user's guide for randomForest including the benefits of using it with MDS. Can anybody suggest a general guide? I've been finding a lot of broken links and cs-type web pages rather than an end-user's guide. Also, people's experience on adjusting the mtry param would be useful. Breiman says that it isn't too sensitive but I'm curious if anybody has had a different experience with it. Thanks in advance and apologies if this is too general.
2012 Dec 03
2
Different results from random.Forest with test option and using predict function
Hello R Gurus, I am perplexed by the different results I obtained when I ran code like this: set.seed(100) test1<-randomForest(BinaryY~., data=Xvars, trees=51, mtry=5, seed=200) predict(test1, newdata=cbind(NewBinaryY, NewXs), type="response") and this code: set.seed(100) test2<-randomForest(BinaryY~., data=Xvars, trees=51, mtry=5, seed=200, xtest=NewXs, ytest=NewBinarY) The confusion matrices for the two forests I thought would be the same by v...
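A sketch of how the two routes can be compared, using iris in place of the poster's data. Note that randomForest's argument is ntree (not trees) and it has no seed argument; reproducibility comes from calling set.seed() immediately before each fit:

library(randomForest)

idx   <- c(1:40, 51:90, 101:140)             # balanced training rows
train <- iris[idx, ]; test <- iris[-idx, ]

set.seed(100)
fit1  <- randomForest(Species ~ ., data = train, ntree = 51, mtry = 2)
pred1 <- predict(fit1, newdata = test, type = "response")

set.seed(100)
fit2  <- randomForest(Species ~ ., data = train, ntree = 51, mtry = 2,
                      xtest = test[, 1:4], ytest = test$Species)
pred2 <- fit2$test$predicted

table(pred1, pred2)                          # with identical seeds the two routes should agree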
2002 Apr 02
2
random forests for R
Hi all, There is now a package available on CRAN that provides an R interface to Leo Breiman's random forest classifier. Basically, random forest does the following: 1. Select ntree, the number of trees to grow, and mtry, a number no larger than number of variables. 2. For i = 1 to ntree: 3. Draw a bootstrap sample from the data. Call those not in the bootstrap sample the "out-of-bag" data. 4. Grow a "random" tree, where at each node, the best split is chosen among mtry randomly selected var...
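The two quantities in that description map directly onto the two main arguments of the R interface:

library(randomForest)

fit <- randomForest(Species ~ ., data = iris,
                    ntree = 500,   # step 1: number of trees to grow
                    mtry  = 2)     # step 4: variables tried at each split
print(fit)                         # OOB error is estimated from the step-3 out-of-bag data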
2005 Aug 15
2
randomForest Error passing string argument
I'm attempting to pass a string argument into the function randomForest but I get an error: state <- paste(list("fruit ~", "apples+oranges+blueberries", "data=fruits.data, mtry=2, do.trace=100, na.action=na.omit, keep.forest=TRUE"), sep= " ", collapse="") model.rf <- randomForest(state) Error in if (n==0) stop ("data(x) has 0 rows") argument is of length zero. -Thanks in advance,
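The formula part can be built from strings with as.formula(), but the remaining arguments (data, mtry, ...) must stay separate arguments rather than being pasted into one string. A sketch, with a hypothetical stand-in for the poster's fruits.data:

library(randomForest)

## hypothetical stand-in for the poster's fruits.data
fruits.data <- data.frame(fruit = factor(sample(c("good", "bad"), 60, TRUE)),
                          apples = rnorm(60), oranges = rnorm(60),
                          blueberries = rnorm(60))

form <- as.formula(paste("fruit ~", "apples + oranges + blueberries"))
model.rf <- randomForest(form, data = fruits.data, mtry = 2,
                         do.trace = 100, na.action = na.omit,
                         keep.forest = TRUE)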
2005 Mar 22
2
Error: Can not handle categorical predictors with more than 32 categories.
...this error, or am I completely confused on what the error implies? "Error in randomForest.default(m, y, ...) : Can not handle categorical predictors with more than 32 categories." This is generated from the command line: > credit.rf <- randomForest(V16 ~ ., data=credit, mtry=2, importance = TRUE, do.trace=100) The data set is the credit-screening data from the UCI repository, ftp://ftp.ics.uci.edu/pub/machine-learning-databases/credit-screening/crx.data. This data consists of 690 samples and 16 attributes. The attribute information includes: A1: b, a. A2: co...
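The message means at least one factor predictor has more than 32 levels. A quick way to find the offending columns, assuming credit is the data frame read from crx.data as in the post:

n_levels <- sapply(credit, function(col) if (is.factor(col)) nlevels(col) else NA)
n_levels[!is.na(n_levels) & n_levels > 32]   # factors randomForest cannot handle

Such columns typically need to be recoded, grouped into fewer levels, or dropped before fitting.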