Displaying 20 results from an estimated 20000 matches similar to: "randomForest"
2009 Mar 20
1
Pruning trees in a Random Forest
Hi all!
The randomForest function in R lets us prune the trees using the nodesize
argument, with which we stop splitting a node once it contains fewer than the
specified number of records/entities.
However, is there a way to stop the tree growing after a specified number of
levels? To be clearer about what I mean by a level: Level 0 is the parent
node, Level 1 has 2 daughter nodes, Level 2 has
2008 Mar 09
1
sampsize in Random Forests
Hi all,
I have a dataset where each point is assigned to a class A, B, C, or
D. Each point is also assigned to a study site. Each study site is
coded with a number ranging from 1 to 100. This information is stored
in the vector studySites.
I want to run randomForest using stratified sampling, so I chose the option
strata = factor(studySites)
But I am not sure how to control the number of
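A minimal sketch of stratified sampling by study site, assuming hypothetical
objects dat (predictors), cls (the A-D class factor) and the studySites vector
from the post; the per-site count of 5 is purely illustrative:
library(randomForest)
site <- factor(studySites)            # placeholder: site codes from the post
## one entry per stratum, in the order of levels(site):
## here 5 points are drawn from every site for each tree
n_per_site <- rep(5, nlevels(site))
rf_strat <- randomForest(x = dat, y = cls,
                         strata   = site,
                         sampsize = n_per_site,
                         ntree    = 500)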
2007 Jan 29
3
comparing random forests and classification trees
Hi,
I have done an analysis using 'rpart' to construct a classification tree. I
want to retain the output in tree form so that it is easily
interpretable. However, I want to compare the 'accuracy' of the tree
to a random forest, to estimate how much predictive ability is lost by using
one simple tree. My understanding is that the error automatically displayed
by the two
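One common way to make the two error estimates comparable is to put both on
the same scale: rpart reports cross-validated error relative to the root node,
while randomForest reports the out-of-bag (OOB) error directly. A sketch using
iris as a stand-in for the poster's data:
library(rpart)
library(randomForest)
data(iris)
## single tree: rescale the cross-validated xerror from the cptable
tree_fit    <- rpart(Species ~ ., data = iris, method = "class")
root_err    <- 1 - max(table(iris$Species)) / nrow(iris)
best_row    <- which.min(tree_fit$cptable[, "xerror"])
tree_cv_err <- tree_fit$cptable[best_row, "xerror"] * root_err
## random forest: OOB error, roughly comparable to a CV estimate
rf_fit     <- randomForest(Species ~ ., data = iris, ntree = 500)
rf_oob_err <- rf_fit$err.rate[rf_fit$ntree, "OOB"]
c(tree = tree_cv_err, forest = rf_oob_err)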
2005 Oct 27
1
Repost: Examples of "classwt", "strata", and "sampsize" i n randomForest?
"classwt" in the current version of the randomForest package doesn't work
too well. (It's what was in version 3.x of the original Fortran code by
Breiman and Cutler, not the one in the new Fortran code.) I'd advise
against using it.
"sampsize" and "strata" can be use in conjunction. If "strata" is not
specified, the class labels will be used.
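A short sketch of the two modes described above, with hypothetical predictors
x, factor response y and stratum factor grp (the sample counts are
illustrative):
library(randomForest)
## no strata given: the class labels act as strata, so a per-class
## sampsize balances each tree's bootstrap sample
rf_bal <- randomForest(x, y, sampsize = c(50, 50), ntree = 500)
## explicit strata: sampsize gives the number drawn from each stratum
rf_str <- randomForest(x, y, strata = grp,
                       sampsize = rep(20, nlevels(grp)), ntree = 500)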
2011 Feb 15
1
[slightly OT] predict.randomForest and type="prob"
Dear all,
I would like to use the function randomForest to predict the probability
of relocation failure of a GPS collar as a function of several
environmental variables x (both factor and numeric: slope, vegetation,
etc.) on a given area. The response variable y is thus success
(0)/failure(1) of the relocation, and the sampling unit is the pixel of
a raster map. My aim is to build a map
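A minimal sketch of getting per-pixel failure probabilities with
predict(..., type = "prob"), assuming a hypothetical data frame pix (one row
per pixel, 0/1 response fail, environmental covariates as the other columns)
and a prediction frame new_pixels:
library(randomForest)
pix$fail <- factor(pix$fail)               # 0/1 as a factor, not numeric
rf_fit <- randomForest(fail ~ ., data = pix, ntree = 500)
## type = "prob" returns one column per class level;
## the "1" column is the predicted probability of relocation failure
p_fail <- predict(rf_fit, newdata = new_pixels, type = "prob")[, "1"]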
2007 Jan 28
2
help with RandomForest classwt option
Hello there,
I am working on an extremely unbalanced two-class classification problem. I
want to use "classwt" together with down-sampling. Checking rfNews()
in R, it looks like classwt is not working yet. Then I looked at the
software from Salford. I did not find a down-sampling option. I am
wondering if you have any experience dealing with this problem. Do you
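A sketch of the down-sampling part, assuming hypothetical predictors x and a
factor response y whose "1" class is rare; each tree then sees a balanced
sample instead of relying on classwt:
library(randomForest)
n_min <- min(table(y))                          # size of the rare class
rf_down <- randomForest(x, y,
                        sampsize = c(n_min, n_min),  # one entry per class level
                        ntree    = 1000)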
2007 Jan 04
3
randomForest and missing data
Does anyone know a reason why, in principle, a call to randomForest
cannot accept a data frame with missing predictor values? If each
individual tree is built using CART, then it seems like this
should be possible. (I understand that one may impute missing values
using rfImpute or some other method, but I would like to avoid doing
that.)
If this functionality were available, then when the trees
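For reference, the two workarounds the package itself offers are na.roughfix
(median/mode fill) and rfImpute (proximity-based imputation, which the poster
wants to avoid). A small sketch on iris with some NAs introduced for
illustration:
library(randomForest)
data(iris)
ir <- iris
ir[sample(nrow(ir), 10), "Sepal.Width"] <- NA    # illustrative missing values
## quick fix: fill NAs with column medians/modes
rf1 <- randomForest(Species ~ ., data = ir, na.action = na.roughfix)
## proximity-based imputation, then fit on the completed data
ir_imp <- rfImpute(Species ~ ., data = ir)
rf2    <- randomForest(Species ~ ., data = ir_imp)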
2008 Jul 20
1
confusion matrix in randomForest
I have a question on the output generated by randomForest in classification
mode, specifically, the confusion matrix. The confusion matrix lists the
various classes and how the forest classified each one, plus the
classification error. Are these numbers essentially averages over all the
trees in the forest? If so, is there a way I can get the standard deviation
values out of the randomForest,
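To the question above: the confusion matrix is not an average over trees.
Each training case is classified once, by majority vote of the trees for
which it was out-of-bag, and those OOB predictions are cross-tabulated
against the true classes. A small check on iris:
library(randomForest)
data(iris)
rf <- randomForest(Species ~ ., data = iris, ntree = 500)
rf$confusion                                              # OOB-based confusion matrix
table(observed = iris$Species, predicted = rf$predicted)  # same counts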
2005 Oct 27
1
Repost: Examples of "classwt", "strata", and "sampsize" in randomForest?
Sorry for the repost, but I've really been looking, and can't find any
syntax direction on this issue...
Just browsing the documentation, and searching the list came up short... I
have some unbalanced data and was wondering if, in a "0" v "1"
classification forest, some combo of these options might yield better
predictions when the proportion of one class is low (less
2012 Mar 23
1
Memory limits for MDSplot in randomForest package
Hello,
I am struggling to produce an MDS plot using the randomForest package
with a moderately large data set. My data set has one categorical
response variable, 7 predictor variables and just under 19000
observations. That means my proximity matrix is approximately 133000
by 133000, which is quite large. To train a random forest on this large
a dataset I have to use my institution's high
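MDSplot() needs the full n-by-n proximity matrix, so a common workaround is
to fit (or at least compute proximities on) a subsample. A sketch with
placeholder names dat and y for the poster's data; the subset size of 2000 is
arbitrary:
library(randomForest)
idx    <- sample(nrow(dat), 2000)                 # manageable subset
rf_sub <- randomForest(y ~ ., data = dat[idx, ],
                       proximity = TRUE, ntree = 500)
MDSplot(rf_sub, dat$y[idx], k = 2)                # 2-D scaling of 1 - proximity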
2003 Aug 20
2
RandomForest
Hello,
When I plot or look at the error-rate vector for a random forest
(rf$err.rate), it looks like a descending function, except that the first
few points of the vector have error-rate values lower (sometimes much lower)
than the general level the error rate settles at, for a forest with that many
trees, once it stops descending. Does it mean that there is a tree (or trees)
(that is built first in
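Note that err.rate[i, ] is the OOB error of the forest made of the first i
trees, not the error of tree i itself, so the first few entries are based on
very few trees (and few OOB votes per case) and are therefore noisy. A quick
way to inspect it:
library(randomForest)
data(iris)
rf <- randomForest(Species ~ ., data = iris, ntree = 500)
head(rf$err.rate)                         # early values rest on very few trees
plot(rf$err.rate[, "OOB"], type = "l",
     xlab = "number of trees", ylab = "OOB error")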
2008 Jun 15
1
randomForest, 'No forest component...' error while calling predict()
Dear R-users,
While making a prediction using the randomForest function (package
randomForest) I'm getting the following error message:
"Error in predict.randomForest(model, newdata = CV) : No forest component
in the object"
Here's my complete code. For reproducing this task, please find my 2 data
sets attached ( http://www.nabble.com/file/p17855119/data.rar data.rar ).
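This error usually means the forest was not stored in the fitted object,
e.g. because keep.forest = FALSE was used (it defaults to FALSE when xtest is
supplied), so predict() has no trees to run new data through. A minimal
sketch of the fix, using iris in place of the attached data:
library(randomForest)
data(iris)
model <- randomForest(Species ~ ., data = iris, keep.forest = TRUE)
pred  <- predict(model, newdata = iris[1:5, ])    # works: the trees were kept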
2007 Apr 29
1
randomForest gives different results for formula call v. x, y methods. Why?
Just out of curiosity, I took the default "iris" example in the RF
helpfile...
but seeing the admonition against using the formula interface for large data
sets, I wanted to play around a bit to see how the various options affected
the output. Found something interesting I couldn't find documentation for...
Just like the example...
> set.seed(12) # to be sure I have
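One source of differing results between the two interfaces is simply the
random-number stream: unless the seed is reset before each call, the second
fit continues the stream where the first left off. A sketch of a
like-for-like comparison on iris:
library(randomForest)
data(iris)
set.seed(12)
rf_formula <- randomForest(Species ~ ., data = iris)
set.seed(12)                           # reset so both fits see the same stream
rf_xy <- randomForest(x = iris[, 1:4], y = iris$Species)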
2005 Jul 07
2
randomForest
> From: Weiwei Shi
>
> it works.
> thanks,
>
> but: (just curious)
> why i tried previously and i got
>
> > is.vector(sample.size)
> [1] TRUE
Because a list is also a vector:
> a <- c(list(1), list(2))
> a
[[1]]
[1] 1
[[2]]
[1] 2
> is.vector(a)
[1] TRUE
> is.numeric(a)
[1] FALSE
Actually, the way I initialize a list of known length is by
2011 Nov 26
3
Question about randomForest
I've been using the R package randomForest but there is an aspect I
cannot work out the meaning of. After calling the randomForest
function, the returned object contains an element called prediction,
which is the prediction obtained using all the trees (at least that's
my understanding). I've checked that this prediction set has the error
rate as reported by err.rate.
However, if I
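For reference, the element is rf$predicted, and it holds the out-of-bag
prediction for each training case (each case is voted on only by trees that
did not have it in their bootstrap sample), which is why its error rate
matches the last row of err.rate. A small check on iris:
library(randomForest)
data(iris)
rf <- randomForest(Species ~ ., data = iris, ntree = 500)
mean(rf$predicted != iris$Species)     # OOB error from the stored predictions
rf$err.rate[rf$ntree, "OOB"]           # same value, last row of err.rate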
2004 Jan 20
1
random forest question
Hi,
here are three results of random forest (version 4.0-1).
The results seem to be more or less the same, which is strange because I
changed the classwt.
I had hoped that, for example, classwt=c(0.45,0.1,0.45) would result in fewer
cases being classified as class 2. Did I misunderstand something?
Christian
x1rf <- randomForest(x=as.data.frame(mfilters[cvtrain,]),
2010 Jul 20
1
Random Forest - Strata
Hi all,
I have struggled to get "strata" in randomForest to work for this.
Can I get randomForest, for each of its TREES, to take ALL samples from some
strata to build the tree, while leaving some strata TOTALLY untouched as oob?
e.g. below, how can I tell RF to,
- for tree 1 in the forest, use only Site A and B to build the tree,
while using the WHOLE Site C data for the oob error
2007 Sep 05
1
ecological meaning of randomForest vegetation classification?
Hi, everyone,
I haven't found anything similar in the forum, so here's my problem (I'm no
expert in R or statistics):
I have a data set of 59,000 cases with 9 variables each (fractional
coverage of 9 different plant types, such as deciduous broad-leaved
temperate trees or evergreen tropical trees etc.), which was generated by a
vegetation model.
In order to evaluate the quality of
2005 Sep 08
2
Re-evaluating the tree in the random forest
Dear mailinglist members,
I was wondering if there is a way to re-evaluate the
instances of a tree (in the forest) after I have
manually changed a split point (or split variable) of a
decision node. Here's an illustration:
library("randomForest")
forest.rf <- randomForest(formula = Species ~ ., data
= iris, do.trace = TRUE, ntree = 3, mtry = 2,
norm.votes = FALSE)
# I am
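randomForest has no built-in way to push the data back through a hand-edited
tree, but getTree() exposes each tree's split structure (split variable,
split point, daughter nodes, predicted class), which could be modified and
then traversed by your own code. A sketch continuing the example above:
tr1 <- getTree(forest.rf, k = 1, labelVar = TRUE)   # first of the 3 trees
head(tr1)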
2008 Oct 09
1
Dump decision trees of randomForest object
Hi,
I'm using the package randomForest to generate a classifier for the example
iris data set:
data(iris)
iris.rf <- randomForest(Species ~ ., iris)
Is it possible to print all decision trees in the generated forest?
If so, can the trees also be written to disk?
What I actually need is to translate the decision trees in a random forest
into equivalent C++ if-then-else constructs to
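One way to get at the trees is getTree(), which returns a tree's splits as a
data frame (split variable, split point, daughter nodes, predicted class);
writing each one out gives a plain-text form that a C++ code generator could
consume. A sketch (the file naming is illustrative):
library(randomForest)
data(iris)
iris.rf <- randomForest(Species ~ ., iris)
for (k in seq_len(iris.rf$ntree)) {              # one file per tree (500 by default)
  tr <- getTree(iris.rf, k = k, labelVar = TRUE)
  write.csv(tr, file = sprintf("tree_%03d.csv", k), row.names = FALSE)
}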