Displaying 20 results from an estimated 4000 matches similar to: "randomForest parameters for image classification"
2008 Jun 15
1
randomForest, 'No forest component...' error while calling Predict()
Dear R-users,
While making a prediction using the randomForest function (package
randomForest) I'm getting the following error message:
"Error in predict.randomForest(model, newdata = CV) : No forest component
in the object"
Here's my complete code. To reproduce this, please find my two data
sets attached (http://www.nabble.com/file/p17855119/data.rar).
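For what it's worth, the usual cause of this message is that the forest was not stored when the model was fit: keep.forest defaults to FALSE whenever xtest/ytest are supplied to randomForest(). A minimal sketch under that assumption ('train' is a placeholder for the training data; 'CV' follows the post):

library(randomForest)
## keep.forest = TRUE stores the trees so predict() can use them later
model <- randomForest(x = train[, -1], y = train[, 1], keep.forest = TRUE)
pred  <- predict(model, newdata = CV)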
2011 Nov 17
1
tuning random forest. An unexpected result
Dear Researches,
I am using RF (in regression mode) to analyze several metrics extracted from
images. I am tuning RF with a loop over different ranges of mtry, ntree
and nodesize, keeping the settings with the lowest OOB MSE:
mtry from 1 to 5
nodesize from 1 to 10
ntree from 1 to 500
using this paper as a reference:
Palmer, D. S., O'Boyle, N. M., Glen, R. C., & Mitchell, J. B. O. (2007).
Random Forest Models
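For context, a sketch of the kind of grid search described above, assuming 'x' holds the image metrics and 'y' the response (both are placeholders, not the poster's objects); the OOB MSE for a regression forest can be read off rf$mse:

library(randomForest)
## a coarser ntree grid than 1:500, purely for brevity
grid <- expand.grid(mtry = 1:5, nodesize = 1:10, ntree = c(50, 100, 250, 500))
grid$oob.mse <- NA
for (i in seq_len(nrow(grid))) {
  rf <- randomForest(x, y, mtry = grid$mtry[i],
                     nodesize = grid$nodesize[i], ntree = grid$ntree[i])
  grid$oob.mse[i] <- rf$mse[grid$ntree[i]]   # OOB MSE after the last tree
}
grid[which.min(grid$oob.mse), ]              # settings with the lowest OOB MSE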
2010 May 25
1
Need Help! Poor performance about randomForest for large data
Hi, dears,
I am processing some data with 60 columns and 286,730 rows.
Most columns hold numerical values, and some hold categorical values.
It turns out that when ntree is set to the default value (500), it says
"cannot allocate a vector of size 1.1 GB"; and when I set ntree to a very
small number like 10, it runs for hours.
I use the (x, y) interface rather than (formula, data).
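A hedged sketch of a lighter-weight call for data of this size ('dat' and 'ycol' are placeholders): the (x, y) interface avoids the copies the formula interface makes, sampsize grows each tree on a subsample, and a larger nodesize keeps the trees small.

library(randomForest)
rf <- randomForest(x = dat[, -ycol], y = dat[, ycol],
                   ntree = 100,        # fewer trees than the default 500
                   sampsize = 50000,   # each tree is grown on a 50,000-row sample
                   nodesize = 25,      # stop splitting earlier, so trees stay small
                   do.trace = 10)      # progress report every 10 trees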
2013 Mar 24
1
Random Forest, Giving More Importance to Some Data
Dear All,
I am using randomForest to predict the final selling price of some items.
As it often happens, I have a lot of (noisy) historical data, but the
question is not so much about data cleaning.
The data for which I need to carry out predictions are fairly recent
sales, or even sales that will take place in the near future.
As a consequence, historical data should be somehow
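As far as I know the CRAN randomForest package has no per-case weight argument for regression, so one common workaround (an assumption on my part, not the thread's conclusion) is to replicate recent rows so that they are drawn more often when each tree is grown; 'sales.df', 'price' and 'sale_date' are placeholders:

library(randomForest)
recent  <- sales.df$sale_date >= as.Date("2012-06-01")   # the cut-off date is arbitrary
## recent sales counted three times, so they dominate the bootstrap samples
boosted <- rbind(sales.df, sales.df[recent, ], sales.df[recent, ])
rf <- randomForest(price ~ ., data = boosted, ntree = 500)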
2010 May 05
1
What is the default nPerm for regression in randomForest?
Could not find it in ?randomForest.
Thank you for your help!
--
Dimitri Liakhovitski
Ninah.com
Dimitri.Liakhovitski at ninah.com
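For reference, the default can be read off the function definition itself; on the versions I have checked it is 1, and nPerm only affects regression importance:

library(randomForest)
formals(getS3method("randomForest", "default"))$nPerm   # prints 1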
2011 Nov 16
0
problem to tunning RandomForest, an unexpected result
Dear Researches,
I am using RF (in regression mode) to analyze several metrics extracted from
images. I am tuning RF with a loop over different ranges of mtry, ntree
and nodesize, keeping the settings with the lowest OOB MSE:
mtry from 1 to 5
nodesize from 1 to 10
ntree from 1 to 500
using this paper as a reference:
Palmer, D. S., O'Boyle, N. M., Glen, R. C., & Mitchell, J. B. O. (2007).
Random Forest Models
2009 Apr 07
1
Concern with randomForest
Hi all,
When running a randomForest run using the following command:
forestplas=randomForest(Prev~.,data=plas,ntree=200000)
print(forestplas)
I get the following result:
Call:
randomForest(formula = Prev ~ ., data = plas, ntree = 2e+05,
importance = TRUE)
Type of random forest: regression
Number of trees: 2e+05
No. of variables tried at each split: 5
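Tangentially, if a forest of that size is really wanted, one option (not raised in the post itself) is to grow it in pieces and merge them with combine(); note that the OOB components (mse, rsq, err.rate) are dropped from the combined object. Prev and plas follow the post:

library(randomForest)
rfs    <- lapply(1:10, function(i) randomForest(Prev ~ ., data = plas, ntree = 20000))
big.rf <- do.call(combine, rfs)   # a single forest with 2e+05 trees
big.rf$ntree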
2005 Jul 21
4
RandomForest question
Hello,
I'm trying to find the optimal value of mtry (the number of variables tried at each split) for a randomForest classification. The classification is binary and there are 32 explanatory variables (mostly factors, each with up to 4 levels, but also some numeric variables) and 575 cases.
I've seen that although there are only 32 explanatory variables, the best classification performance is reached when
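A sketch of how mtry could be searched with tuneRF(), which steps mtry up and down and compares OOB error; 'xvars' (the 32 explanatory variables) and 'yclass' (the binary response factor) are placeholders:

library(randomForest)
set.seed(1)
tuned <- tuneRF(x = xvars, y = yclass,
                mtryStart = 5, ntreeTry = 500,
                stepFactor = 2, improve = 0.01, trace = TRUE)
tuned   # matrix of mtry values and their OOB error estimates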
2005 Sep 08
2
Re-evaluating the tree in the random forest
Dear mailinglist members,
I was wondering whether there is a way to re-evaluate the
instances of a tree (in the forest) after I have manually
changed a split point (or split variable) of a decision node.
Here's an illustration:
library("randomForest")
forest.rf <- randomForest(formula = Species ~ ., data = iris,
                          do.trace = TRUE, ntree = 3, mtry = 2,
                          norm.votes = FALSE)
# I am
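As far as I know the package offers no way to push a modified split point back into the stored forest, but individual trees can be inspected and their predictions re-checked; a sketch continuing the iris example above:

getTree(forest.rf, k = 1, labelVar = TRUE)        # split variables and split points of tree 1
p <- predict(forest.rf, iris, predict.all = TRUE) # per-tree predictions
head(p$individual)                                # one column per tree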
2005 Aug 15
2
randomForest Error passing string argument
I'm attempting to pass a string argument into the function
randomForest but I get an error:
state <- paste(list("fruit ~", "apples+oranges+blueberries",
"data=fruits.data, mtry=2, do.trace=100, na.action=na.omit,
keep.forest=TRUE"), sep= " ", collapse="")
model.rf <- randomForest(state)
Error in if (n==0) stop ("data(x) has 0
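One way around this (a guess at the intent, not the poster's eventual solution) is to build only the formula from strings with as.formula() and pass the remaining arguments normally; 'fruits.data' is the poster's own, unavailable, data set:

library(randomForest)
fmla <- as.formula(paste("fruit ~", "apples + oranges + blueberries"))
model.rf <- randomForest(fmla, data = fruits.data, mtry = 2,
                         do.trace = 100, na.action = na.omit,
                         keep.forest = TRUE)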
2013 Feb 03
3
RandomForest, Party and Memory Management
Dear All,
For a data mining project, I am relying heavily on the RandomForest and
Party packages.
Due to the large size of the data set, I often run into memory problems (in
particular with the Party package; RandomForest seems to use less memory).
I really have two questions at this point
1) Please see how I am using the Party and RandomForest packages. Any
comment is welcome and useful.
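For what it's worth, a sketch of the memory-conscious calls I would try first with both packages ('big.df' and its response 'y' are placeholders, and the cforest_unbiased() settings are only an example):

library(randomForest)
library(party)
## randomForest: (x, y) interface, no proximity, modest ntree
rf <- randomForest(x = big.df[, setdiff(names(big.df), "y")], y = big.df$y,
                   ntree = 200, nodesize = 20)
## party: fewer trees and a fixed mtry via the controls object
cf <- cforest(y ~ ., data = big.df,
              controls = cforest_unbiased(ntree = 100, mtry = 5))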
2003 Apr 12
5
rpart vs. randomForest
Greetings. I'm trying to determine whether to use rpart or randomForest
for a classification tree. Has anybody tested their efficacy formally? I've
run both, and the confusion matrix for randomForest beats rpart's. I've been
looking at the randomForest help page and am unable to figure out how to
extract a tree. But more than that, I'm looking for a more comprehensive
user's guide for randomForest, including
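A rough side-by-side on iris (stand-in data, not a formal efficacy test), plus the function that pulls a single tree out of a forest:

library(rpart)
library(randomForest)
set.seed(1)
tree.fit <- rpart(Species ~ ., data = iris)
rf.fit   <- randomForest(Species ~ ., data = iris, ntree = 500)
table(iris$Species, predict(tree.fit, iris, type = "class"))  # rpart, resubstitution
rf.fit$confusion                                              # randomForest, OOB estimate
getTree(rf.fit, k = 1, labelVar = TRUE)                       # extract one tree from the forest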
2010 Nov 10
2
randomForest can not handle categorical predictors with more than 32 categories
I received this error
Error in randomForest.default(m, y, ...) :
Can not handle categorical predictors with more than 32 categories.
using below code
library(randomForest)
library(MASS)
memory.limit(size=12999)
x <- read.csv("D:/train_store_title_view.csv", header=TRUE)
x <- na.omit(x)
set.seed(131)
sales.rf <- randomForest(sales ~ ., data=x, mtry=3,
importance=TRUE)
My
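A sketch of how the offending columns can be found and, as one workaround among several, dropped before fitting; 'x' is the data frame read in above:

nlev <- sapply(x, function(col) if (is.factor(col)) nlevels(col) else 0L)
names(nlev)[nlev > 32]       # factors randomForest cannot handle
x.ok <- x[, nlev <= 32]      # alternatively, lump their rare levels instead of dropping
sales.rf <- randomForest(sales ~ ., data = x.ok, mtry = 3, importance = TRUE)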
2008 Dec 26
2
about randomForest
hello,
I want to use randomForest to classify a matrix which is 331030 x 42; the last column is the class label. I use:
Memebers.rf <- randomForest(class ~ ., data = Memebers, proximity = TRUE, mtry = 6, ntree = 200)
which told me "Error in matrix(0, n, n) : too many elements specified".
Then I used:
Memebers.rf <- randomForest(class ~ ., data = Memebers, importance = TRUE, proximity = TRUE)
which told me "the error is
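With roughly 331,000 rows, proximity = TRUE asks for an n x n proximity matrix, which is what fails here; a sketch of the same call without it, keeping the poster's object names:

library(randomForest)
Memebers.rf <- randomForest(class ~ ., data = Memebers, mtry = 6, ntree = 200)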
2010 Jan 15
1
randomForest maxnodes
Has anyone successfully used the maxnodes feature in randomForest? I tried
setting it, but when it is non-NULL I always get back a forest in which all
trees have size 1. I am using a continuous response (regression). Any help
would be appreciated.
Thanks.
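A minimal check of maxnodes on synthetic regression data (an assumption, not the poster's setup); treesize() reports the number of terminal nodes per tree, which should sit near the cap rather than collapse to 1:

library(randomForest)
set.seed(1)
x <- matrix(rnorm(500 * 10), ncol = 10)
y <- x[, 1] + rnorm(500)
rf <- randomForest(x, y, ntree = 100, maxnodes = 25)
summary(treesize(rf))   # terminal nodes per tree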
2006 Mar 08
8
how to use the randomForest and rpart function?
Hi all,
I am trying to play around with the randomForest function for
classification. I know its performance is great.
I am currently using the default options.
It has many options.
How do I further tweak the options so that I can make its performance even
better?
What are the most commonly used options?
Thanks a lot!
M
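The options that usually matter most are ntree, mtry and nodesize; a sketch of the diagnostics I would look at first, on iris as stand-in data:

library(randomForest)
set.seed(1)
rf <- randomForest(Species ~ ., data = iris, ntree = 1000, importance = TRUE)
plot(rf)         # OOB error vs. number of trees: shows when extra trees stop helping
varImpPlot(rf)   # which variables carry the signal; mtry can then be searched with tuneRF()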
2004 Apr 05
3
Can't seem to finish a randomForest.... Just goes and goes!
When you have fairly large data, _do not use the formula interface_, as a
couple of copies of the data would be made. Try simply:
Myforest.rf <- randomForest(Mydata[, -46], Mydata[, 46],
                            ntree = 100, mtry = 7)
[Note that you don't need to set proximity (not proximities) or importance
to FALSE, as that's the default already.]
You might also want to use
2010 Jul 14
1
randomForest outlier return NA
Dear R-users,
I have a problem with randomForest{outlier}.
After running the following code ( that produces a silly data set and builds
a model with randomForest ):
#######################
library(randomForest)
set.seed(0)
## build data set
X <- rbind( matrix( runif(n=400,min=-1,max=1), ncol = 10 ) ,
rep(1,times= 10 ) )
Y <- matrix( nrow = nrow(X), ncol = 1)
for( i in (1:nrow(X))){
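For reference, a minimal working use of outlier() on iris (not the poster's data); it works from the proximity matrix, so the forest has to be grown with proximity = TRUE:

library(randomForest)
set.seed(1)
rf  <- randomForest(Species ~ ., data = iris, proximity = TRUE)
out <- outlier(rf)                   # outlyingness of each observation within its class
head(sort(out, decreasing = TRUE))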
2010 Mar 16
1
Regarding variable importance in the randomForest package
For anyone who is knowledgeable about the randomForest package in R, I have
a question:
When I look at the variable importance for data, I see that my response
variable is included along with my predictor variables. That is, I am
getting a MeanDecreaseGini for my response variable, and therefore it seems
as though it is being treated as a predictor variable.
my code (just in case it helps) :
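For comparison, a minimal example on iris where importance() lists only the predictors; if the response shows up in that table, one common cause (a guess, since the poster's code is not shown in this excerpt) is that the response column was also included among the predictors passed to randomForest:

library(randomForest)
set.seed(1)
rf <- randomForest(Species ~ ., data = iris, importance = TRUE)
rownames(importance(rf))   # only the four predictors; Species never appears here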
2018 Jan 22
2
Random Forests
Many thanks Carlos, as always.
It's odd that I missed it. At the time I went through all the arguments of
RF, as I always do, but I had forgotten that one. The truth is that it
worked wonderfully, though it seemed strange to me. Still, given that
RFs do not overfit, there is no problem with letting their trees grow as
large as you like. I have tested it with an external database and it
explains