similar to: rpart weight parameter and random forest based on rpart

Displaying 20 results from an estimated 10000 matches similar to: "rpart weight parameter and random forest based on rpart"

2005 Jan 25
0
Collapsing solution to the question discussed above: Re: multi-class classification using rpart
You could break your 3-class problem into several (2 or 3) two-class problems, and then use Andy's suggestion (see the CART book). There are several ways to break the problem into two-class problems, and several ways to combine the resulting classifiers. Tom Dietterich, Jerry Friedman, Trevor Hastie and Rob Tibshirani, among others, have articles on the question, in places like Annals of
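For illustration, a minimal one-vs-rest sketch along these lines (the data frame dat and its 3-level factor response y are hypothetical, and this is only one of the several ways to break up and recombine the problem):

    library(rpart)
    classes <- levels(dat$y)
    # Fit one "class k vs the rest" tree per class
    fits <- lapply(classes, function(k) {
      d <- dat
      d$y <- factor(d$y == k, levels = c(FALSE, TRUE))
      rpart(y ~ ., data = d, method = "class")
    })
    # Combine: pick the class whose tree gives the highest P(class k)
    probs <- sapply(fits, function(f) predict(f, dat, type = "prob")[, "TRUE"])
    pred  <- classes[max.col(probs)]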
2011 Jun 21
0
How does rpart compute "improve" for split="information"? (which seems to be different from the "gini" case)
Hello dear R-help members, I would appreciate any help in understanding how the rpart function computes the "improve" value (which is given in fit$split) when using the split='information' parameter. Thanks to Professor Atkinson's help, I was able to find how this is done in the case that split='gini'. By following the explanation here:
2011 Jun 13
1
In rpart, how is "improve" calculated? (in the "class" case)
Hi all, I apologize in advance if I am missing something very simple here, but since I failed to resolve this myself, I'm sending the question to the list. I would appreciate any help in understanding how exactly the rpart function computes the "improve" value (which is given in fit$split), and how it differs when using split='information' vs split='gini'
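For readers of these two threads, a rough sketch of the two impurity functions being compared; note this only computes a plain weighted impurity decrease, and the scaling rpart applies to the value it reports as "improve" may differ, so treat it as illustrative only:

    # Gini and information (entropy) impurity for a vector of class counts
    gini <- function(counts) { p <- counts / sum(counts); sum(p * (1 - p)) }
    info <- function(counts) { p <- counts / sum(counts); -sum(ifelse(p > 0, p * log(p), 0)) }

    # Plain impurity decrease for one candidate split (parent -> left, right);
    # rpart's reported "improve" may be scaled differently.
    impurity_decrease <- function(parent, left, right, impurity = gini) {
      n <- sum(parent)
      impurity(parent) - (sum(left) / n) * impurity(left) - (sum(right) / n) * impurity(right)
    }

    impurity_decrease(c(50, 50), c(40, 10), c(10, 40), impurity = gini)
    impurity_decrease(c(50, 50), c(40, 10), c(10, 40), impurity = info)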
2002 Jan 25
0
rpart subsets
A few weeks back I posted that the subset feature of rpart was not working when predicting a categorical variable. I was able to figure out a simple solution to the problem that I hope can be included in future versions of rpart. I also include a fix for another related problem. The basic problem is that when predicting a categorical variable using a subset, the subset may not have all the categories
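(The patch itself is not shown in the snippet; on the user side, a common way to sidestep the missing-category problem today is to drop unused factor levels before fitting. A minimal sketch with a hypothetical data frame dat, factor response y and subset index idx:)

    library(rpart)
    # Drop factor levels that do not occur in the chosen subset
    dat_sub <- droplevels(dat[idx, ])
    fit <- rpart(y ~ ., data = dat_sub, method = "class")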
2002 Jan 28
0
rpart subset fix
(Apparently, I posted this to the wrong place. I am hopefully posting it in the correct place now. If not, please advise.) A few weeks back I posted that the subset feature of rpart was not working when predicting a categorical variable. I was able to figure out a simple solution to the problem that I hope can be included in future versions of rpart. I also include a fix for another related
2005 Jan 25
3
multi-class classification using rpart
Hi, I am trying to make a multi-class classification tree using rpart. I used the fgl data from the MASS package to test, and it works well. However, when I used my own small-sample data as below, the program seems to take forever. I am not sure whether it is simply slow or there is something wrong with my code or data manipulation. Please advise! The data is described as the output from str()
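For reference, the fgl test case mentioned above takes only a couple of lines (fgl has a 6-level factor response, type):

    library(MASS)    # provides the fgl data
    library(rpart)
    fit <- rpart(type ~ ., data = fgl, method = "class")
    printcp(fit)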
2011 Sep 13
1
class weights with Random Forest
Hi All, I am looking for a reference that explains how the randomForest function in the randomForest package uses the classwt parameter. Here: http://tolstoy.newcastle.edu.au/R/e4/help/08/05/12088.html Andy Liaw suggests not using classwt. And according to: http://r.789695.n4.nabble.com/R-help-with-RandomForest-classwt-option-td817149.html it has "not been implemented" as of 2007.
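For context, both knobs appear in the function's argument list; the sketch below (on iris) only shows where each is passed, since how classwt is used internally is exactly what the thread is asking about:

    library(randomForest)
    # Passing class weights (their internal effect is what the thread questions)
    rf1 <- randomForest(Species ~ ., data = iris, classwt = c(1, 1, 2))
    # A stratified-sampling alternative often discussed on the list
    rf2 <- randomForest(Species ~ ., data = iris,
                        strata = iris$Species, sampsize = c(20, 20, 20))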
2002 Mar 29
1
memory error with rpart()
Dear all, I have a 100 iteration loop. Within each loop, there are some calls to rpart() like: ctl <- rpart.control(maxcompete=0, maxsurrogate=0, maxdepth=10) temp <- rpart(y~., x, w=wt, method="class", parms=list(split="gini"), control=ctl) res <- log(predict.rpart(temp, type="prob")) newres <- log(predict.rpart(temp, newdata=newx,
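The calls above are cut off in the snippet; for what it's worth, a sketch of such a loop that keeps memory flat by discarding each fitted tree once its predictions are extracted (x, y, wt and newx are the poster's objects, assumed to be a data frame containing y, a weight vector and new data):

    library(rpart)
    ctl <- rpart.control(maxcompete = 0, maxsurrogate = 0, maxdepth = 10)
    for (i in 1:100) {
      temp   <- rpart(y ~ ., data = x, weights = wt, method = "class",
                      parms = list(split = "gini"), control = ctl)
      res    <- log(predict(temp, type = "prob"))
      newres <- log(predict(temp, newdata = newx, type = "prob"))
      # ... use res / newres ...
      rm(temp)
      gc()    # release the tree's memory before the next iteration
    }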
2007 Aug 24
2
Variable Importance - Random Forest
Hello, I am trying to explore the use of random forests for classification and am not certain about the interpretation of the importance measurements. When the option "importance = T" is set in the randomForest call, the resulting 'importance' element matrix has four columns with the following headings: 0 - mean raw importance score of variable x for class 0 (where
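In current versions of randomForest the column layout can be checked directly; recent releases label the per-class columns with the class names and add MeanDecreaseAccuracy and MeanDecreaseGini, which may differ from the 0/1 headings quoted above:

    library(randomForest)
    set.seed(1)
    rf <- randomForest(Species ~ ., data = iris, importance = TRUE)
    importance(rf)   # one column per class, plus MeanDecreaseAccuracy and MeanDecreaseGini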
2012 Apr 03
1
rpart error message
Hi R-helpers, I am using the rpart package for decision trees in R. We invoke the R environment through JRI from our Java application; hence, the result of an R command is returned as an REXP and we use geterrMessage() to retrieve the error. When we execute the following command, cnr_model<-rpart(as.factor(Species)~Sepal Length+Sepal Width+Petal Length, method="class",
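The call is cut off in the snippet, but the unquoted spaces in the predictor names are already a syntax error on the R side; assuming the standard iris column names (with dots), a working version of the call would look like:

    library(rpart)
    # Standard iris column names use dots, not spaces
    cnr_model <- rpart(as.factor(Species) ~ Sepal.Length + Sepal.Width + Petal.Length,
                       data = iris, method = "class")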
2005 Jan 17
1
rpart
Hi there: I am working on a classification problem using rpart. When my response variable y is binary, the trees grow very fast, but if I add one more category to y, so that y has 3 classes, the tree growing never finishes. The command looks like: x<-rpart(r0$V142~.,data=r0[,1:141], parms=list(split='gini'), cp=0.01) Changing cp or removing parms does not help.
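One thing worth checking in a case like this is that the response really is a factor and that method = "class" is set, otherwise rpart falls back to a regression (anova) fit; a sketch using the poster's objects r0 and V142 (whether this explains the slowdown is not clear from the snippet):

    library(rpart)
    r0$V142 <- as.factor(r0$V142)   # make sure the 3-level response is a factor
    x <- rpart(V142 ~ ., data = r0, method = "class",
               parms = list(split = "gini"), cp = 0.01)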
2009 May 21
1
Rpart - best split selection for class method and Gini splitting index
Dear R-users, I'm working with the rpart package and trying to understand how the procedure selects the best split when the method "class" and the splitting index "Gini" are used. In particular, I'd like to have a look at the source code that works out the best split for an unordered predictor. Can anyone suggest which functions in the sources I should
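One way to get at those sources is to pull the package tarball from CRAN and list the C files under src/ (the snippet does not name the relevant routines, so none are quoted here):

    # Download and unpack the rpart sources, then look at the C files in src/
    p <- download.packages("rpart", destdir = tempdir(), type = "source")
    untar(p[1, 2], exdir = tempdir())
    list.files(file.path(tempdir(), "rpart", "src"))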
2007 Jul 08
1
rpart weight prior
Hi! Could you please explain the difference between "prior" and "weight" in rpart? They seem to be the same. But in that case, why include a weights option in the latest versions? For unbalanced sampling, what is best to use: weights, priors, or both together? Thanks a lot. Aurélie Davranche.
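For reference, the two options are passed in different places: case weights go in the weights argument (one value per observation), class priors go through parms (one value per class); how they interact is the question, so the sketch below (on iris) only shows where each is specified:

    library(rpart)
    # Case weights: one value per observation
    w <- rep(1, nrow(iris))
    fit_w <- rpart(Species ~ ., data = iris, weights = w, method = "class")
    # Class priors: one value per class, must sum to 1
    fit_p <- rpart(Species ~ ., data = iris, method = "class",
                   parms = list(prior = c(1, 1, 1) / 3))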
2011 Feb 18
0
Weights in bagged regression trees
Does anyone have experience of applying observation weights in bagging? I am fitting regression trees (continuous data on bird abundance) and need to account for sampling intensity. In a single tree, i.e. a call to rpart, I can specify weights either by having a separate vector called weights, or by a variable called weights in the data frame under analysis. Both produce sensible (and identical)
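A minimal sketch of carrying observation weights through a hand-rolled bagging loop (dat is a hypothetical data frame with numeric response abund, and wts a separate weight vector of the same length):

    library(rpart)
    set.seed(1)
    bags <- lapply(1:50, function(b) {
      idx <- sample(nrow(dat), replace = TRUE)    # bootstrap sample
      rpart(abund ~ ., data = dat[idx, ], weights = wts[idx], method = "anova")
    })
    # Bagged prediction: average the individual trees
    pred <- rowMeans(sapply(bags, predict, newdata = dat))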
2004 Jun 04
1
rpart
Hello everyone, I'm a newbie to R and to CART, so I hope my questions don't seem too stupid. 1.) My first question concerns the rpart() function. Which criterion does rpart use to find the best split - entropy impurity, Bayes error (min. error), or the Gini index? Is there a way to make it use the entropy impurity? The second and third questions concern the output of the printcp() function.
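On the first question: rpart's "class" method offers Gini (the default) and information/entropy, chosen through parms; a minimal sketch on iris:

    library(rpart)
    fit_gini <- rpart(Species ~ ., data = iris, method = "class",
                      parms = list(split = "gini"))          # the default criterion
    fit_info <- rpart(Species ~ ., data = iris, method = "class",
                      parms = list(split = "information"))   # entropy-based splits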
2011 Oct 10
1
pmml for random forest & rules
Hi, I am having some trouble using R 2.13.1 for generating a pmml object of class "c('randomForest.formula', 'randomForest')". I see that these methods are available:
> methods(pmml)
 [1] pmml.coxph*    pmml.hclust*   pmml.itemsets* pmml.kmeans*   pmml.ksvm*     pmml.lm*       pmml.multinom* pmml.nnet*     pmml.rpart*
[10] pmml.rsf*      pmml.rules*    pmml.survreg*
2006 Aug 24
0
Classification tree with a random variable
Hi, I am planning on using classification trees to build a predictive model for data which includes a random variable. I intend to use the R functions 'rpart' (and potentially also 'randomForest' and 'bagging'). I have a data set with 390 data points. The response variable is binary. There are a large number of variables (>20, both categorical and continuous). The
2004 Jan 12
0
new version of randomForest (4.0-7)
Dear R users, I've just released a new version of randomForest (available on CRAN now). This version contains quite a number of new features and bug fixes compared to versions prior to 4.0-x (and a few more since 4.0-1). For those not familiar with randomForest, it's an ensemble classifier/regression tool. Please see http://www.math.usu.edu/~adele/forests/ for more detailed information,
2011 Jan 11
0
Some questions concerning survival tree analysis using the rpart module
All the documentation that I have on survival splitting is found in the technical report you mention. However, there are both a short form and a long form of it on our web site; did you get the longer one (52 pages)? I admit it is not a lot. There are no other split algorithms implemented for survival data. It would be possible to add your own. Attached is a slightly updated version of the
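For readers landing on this thread, survival splitting in rpart is invoked by giving the model a Surv() response (the "exp" method); a minimal sketch on the lung data from the survival package:

    library(rpart)
    library(survival)
    # A Surv() response makes rpart use its exponential/survival splitting
    fit <- rpart(Surv(time, status) ~ age + sex + ph.ecog, data = lung, method = "exp")
    print(fit)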