Displaying 20 results from an estimated 20000 matches similar to: "Decision tree with the group median as response?"
2006 Apr 17
0
Problem getting R's decision tree for Quinlan's golf example data [Broadcast]
See ?rpart.control. I get:
> golf.rp = rpart(Outlook ~ ., golf, control=rpart.control(minsplit=1))
> golf.rp
n= 14
node), split, n, loss, yval, (yprob)
* denotes terminal node
1) root 14 9 rain (0.2857143 0.3571429 0.3571429)
2) Temperature< 71.5 6 2 rain (0.1666667 0.6666667 0.1666667)
4) Temperature< 64.5 1 0 overcast (1.0000000 0.0000000 0.0000000) *
5)
2006 Apr 16
0
Problem getting R's decision tree for Quinlan's golf example data
Newbie question, but I've checked archives etc. Am trying to reproduce
in R Quinlan's trivial example of the "golf" decision tree. The data file
of 14 examples follows (read in via read.table()):
Outlook Temperature Humidity Windy PlayDontPlay
1 sunny 85 85 false DontPlay
2 sunny 80 90 true DontPlay
3 overcast 83 78 false Play
4 rain 70 96 false Play
5 rain 68 80 false Play
6
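A minimal sketch of one way to grow this tree in R, assuming the data frame read in above is called golf: the default minsplit of 20 must be relaxed for only 14 rows, and since rpart uses binary splits and the Gini index the result will resemble, but not exactly match, Quinlan's ID3/C4.5 tree.
library(rpart)
golf.rp <- rpart(PlayDontPlay ~ ., data = golf, method = "class",
                 control = rpart.control(minsplit = 2, minbucket = 1, cp = 0.001))
print(golf.rp)                              # text form of the tree
plot(golf.rp); text(golf.rp, use.n = TRUE)  # quick plot with node counts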
2006 Feb 16
0
sums of absolute deviations about the median as split function in rpart
Dear R community,
As stated in Breiman et al. (1984) and De'Ath & Fabricius (2000), using
sums of absolute deviations about the median as an impurity measure
gives robust trees.
I would like to use this method in rpart.
Has somebody already tried this method in rpart? Is there maybe already
a script available somewhere?
I am aware of the possibility of defining user splits myself with
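A hedged sketch of what such a split could look like with rpart's user-written split interface (an init/eval/split list passed as the method argument, as described in the rpart "User Written Split Functions" vignette); the function names, the unweighted median, and the continuous-only split are my own illustrative choices, not from the original post.
# initialisation: describe the response and how node summaries are printed
lad.init <- function(y, offset, parms, wt) {
  list(y = y, parms = NULL, numresp = 1, numy = 1,
       summary = function(yval, dev, wt, ylevel, digits)
         paste("  median =", format(signif(yval, digits)),
               ", sum|y - median| =", format(signif(dev, digits))))
}
# evaluation: node label is the median, node "deviance" is the sum of
# absolute deviations about that median (case weights ignored for brevity)
lad.eval <- function(y, wt, parms) {
  m <- median(y)
  list(label = m, deviance = sum(abs(y - m)))
}
# split: goodness of each candidate split point = reduction in the sum of
# absolute deviations; this sketch handles continuous predictors only
lad.split <- function(y, wt, x, parms, continuous) {
  if (!continuous) stop("this sketch handles continuous predictors only")
  n <- length(y)                      # y arrives ordered by the sorted x
  total <- sum(abs(y - median(y)))
  goodness <- numeric(n - 1)
  for (i in 1:(n - 1)) {
    l <- y[1:i]; r <- y[(i + 1):n]
    goodness[i] <- total - (sum(abs(l - median(l))) + sum(abs(r - median(r))))
  }
  list(goodness = goodness, direction = rep(-1, n - 1))
}
# fit <- rpart(y ~ ., data = mydata,
#              method = list(init = lad.init, eval = lad.eval, split = lad.split))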
2011 Nov 04
1
Decision tree model using rpart (classification
Hi Experts,
I am new to R, using decision tree model for getting segmentation rules.
A) Using behavioural data (attributes defining customer behaviour, e.g.
balances, number of accounts, etc.):
1. Clustering: cluster the behavioural data into a suitable number of clusters
2. Decision tree: use an rpart classification tree to generate segmentation rules,
with the cluster number (cluster id) as the target (a sketch of this follows below)
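A minimal sketch of the two-step approach, with simulated behavioural data and k = 4 purely for illustration.
library(rpart)
set.seed(1)
beh <- data.frame(balance    = rexp(500, 1/1000),   # illustrative behavioural data
                  n_accounts = rpois(500, 2))
km  <- kmeans(scale(beh), centers = 4)              # step 1: clustering
beh$cluster <- factor(km$cluster)                   # cluster id as the target
fit <- rpart(cluster ~ balance + n_accounts,        # step 2: classification tree
             data = beh, method = "class")
print(fit)                                          # the splits are the segmentation rules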
2011 Apr 08
4
Rpart decision tree
Dear useRs:
I am trying to plot an rpart object but cannot get a nice tree structure plot. I
am using plot.rpart and text.rpart (please see below), but the branches that
connect the nodes overlap the text in the ellipses and rectangles. Is there
a way to get a clean, nice tree plot (as in the Rpart Mayo report)? I work
under Windows and use R 2.11.1 with rpart version 3.1-46.
Thank you.
Tudor
...
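For what it is worth, a hedged sketch of plot.rpart/text.rpart settings that usually reduce the overlap (the ellipses and rectangles come from fancy = TRUE; shrinking cex or dropping fancy also helps); the kyphosis data shipped with rpart is used only for illustration.
library(rpart)
fit <- rpart(Kyphosis ~ Age + Number + Start, data = kyphosis)
plot(fit, uniform = TRUE,    # equal vertical spacing between levels
     branch = 0.4,           # shallower branch angles
     compress = TRUE,
     margin = 0.1)           # extra white space around the tree
text(fit, use.n = TRUE, cex = 0.8, fancy = FALSE)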
2010 Aug 13
1
decision tree finetune
My decision tree grows with only one split, and based on what I see in
E-Miner it should split on more variables. How can I adjust the splitting
criteria in R?
Also, is there a way to indicate that some variables are binary? For example,
Info_G is binary, so in the results it would be nice to see "2) Info_G=0"
instead of "2) Info_G<0.5".
Thank you in advance!
And thanks to Eric, who
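A minimal sketch, assuming a data frame d with a 0/1 column Info_G and a target column named target: loosening rpart.control lets the tree grow beyond one split, and declaring the 0/1 column as a factor makes the split print as Info_G=0 rather than Info_G<0.5.
library(rpart)
d$Info_G <- factor(d$Info_G)                        # binary, not numeric
fit <- rpart(target ~ ., data = d, method = "class",
             control = rpart.control(cp = 0.001,    # smaller complexity penalty
                                     minsplit = 10,
                                     minbucket = 5))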
2010 Jul 22
1
decision tree with weighted inputs
I'd like to train a decision tree on a set of weighted data points. I looked into the rpart package, which builds trees but doesn't seem to offer the capability of weighting inputs. (There is a weights parameter, but it seems to correspond to output classes rather than to input points).
I'm making do for now by preprocessing my input data by adding multiple instances of each data
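If I read ?rpart correctly, its weights argument is per-observation case weights (class priors and losses are handled separately through parms), which may avoid the duplication workaround; a hedged sketch with illustrative object names follows.
library(rpart)
fit <- rpart(class ~ ., data = train, method = "class",
             weights = train$w)                     # one case weight per row
# class priors, if those are what is actually needed:
# fit2 <- rpart(class ~ ., data = train, method = "class",
#               parms = list(prior = c(0.5, 0.5)))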
2011 Aug 29
2
rpart: apply tree to new data to get "counts"
Hi,
when I have made a decision tree with rpart, is it possible to "apply"
this tree to a new set of data in order to find out the distribution
of observations? Ideally I would like to plot my original tree, with
the counts (at each node) of the new data.
Regards,
Jay
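One hedged option is the partykit package, which can convert an rpart tree and report the terminal node that each observation lands in; fit, traindat and newdat are illustrative names.
library(rpart)
library(partykit)
pfit <- as.party(fit)                                     # convert the rpart tree
table(predict(pfit, newdata = traindat, type = "node"))   # counts per leaf, original data
table(predict(pfit, newdata = newdat,   type = "node"))   # counts per leaf, new data
plot(pfit)                                                # node numbers match the tables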
2010 Sep 04
1
Decision Tree in Python or C++?
Has anybody used decision trees in Python or C++ (or written their own
decision tree implementation in Python or C++)? My goal is to run a decision
tree on 8 million observations as the training set and score 7 million in the test set.
I am testing 'rpart' package on a 64-bit-Linux + 64-bit-R environment. But
it seems that rpart is either not stable or running out of memory very
quickly. (Is it
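Not Python or C++, but a hedged R workaround while staying with rpart: fit on a random subsample and score the test set in chunks so memory stays bounded; the object names and sizes are illustrative.
library(rpart)
idx <- sample(nrow(train8m), 5e5)                       # subsample the 8M training rows
fit <- rpart(y ~ ., data = train8m[idx, ], method = "class")
chunks <- split(seq_len(nrow(test7m)),
                ceiling(seq_len(nrow(test7m)) / 1e6))   # score 1M rows at a time
probs <- do.call(rbind, lapply(chunks, function(i)
  predict(fit, newdata = test7m[i, ], type = "prob")))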
2010 May 11
1
how to extract the variables used in decision tree
Hi, dear R community,
How can I extract the variables actually used in tree construction? I want to
extract these variables and combine them with other variables as my features in the
next step of model building.
> printcp(fit.dimer)
Classification tree:
rpart(formula = outcome ~ ., data = p_df, method = "class")
Variables actually used in tree construction:
[1] CT DP DY FC NE NW QT SK TA WC WD WG WW
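A short sketch: the splitting variables live in the var column of the fit's frame component, with "<leaf>" marking terminal nodes, so for the fit.dimer object above:
used <- setdiff(as.character(unique(fit.dimer$frame$var)), "<leaf>")
used   # same set that printcp() reports under "Variables actually used"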
2011 Jan 11
0
Some questions concerning survival tree analysis using the rpart module
All the documentation that I have on survival splitting is found in the
technical report you mention. However, there is both a short form and a
long form of this on our web site; did you get the larger one (52
pages)? I admit it is not a lot.
There are no other split algorithms implemented for survival data. It
would be possible to add your own. Attached is a slightly updated
version of the
2005 Sep 09
1
Finding a decision tree's leaf node from a new value
Dear mailinglist members,
I have the following problem: I run a decision tree using the rpart function and, afterwards, I try to find which leaf node a new register (not used to build the decision tree) belongs to.
I will try to explain better:
rpart.tree <- rpart(target.value ~., data)
leaf.node <- new.function(rpart.tree, new.register)
The new register has all the explanatory values
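One hedged way to get that new.function behaviour is to convert the fitted tree with the partykit package and ask for the node id directly:
library(partykit)
leaf.node <- predict(as.party(rpart.tree), newdata = new.register, type = "node")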
2012 Mar 05
1
decision/classification trees with fewer than 20 objects
Hi!
I'm trying to construct and plot a decision tree to classify a set of only 8 objects and tried to use the rpart and tree functions, but get an error message both times:
rpart: fit is not a tree, just a root
tree: cannot plot singlenode tree
I read in the post 'question regression trees' that rpart doesn't split a set of fewer than 20 objects...so I guess the same holds true for
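rpart's default minsplit of 20 does indeed refuse to split such a small set; a minimal sketch that relaxes the defaults for 8 rows, with small8 and the formula purely illustrative.
library(rpart)
fit <- rpart(class ~ ., data = small8, method = "class",
             control = rpart.control(minsplit = 2, minbucket = 1, cp = -1))
plot(fit); text(fit, use.n = TRUE)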
2009 Jul 26
3
Question about rpart decision trees (being used to predict customer churn)
Hi,
I am using rpart decision trees to analyze customer churn. I am finding that
the decision trees created are not effective because they are not able to
recognize factors that influence churn. I have created an example situation
below. What do I need to do for rpart to build a tree with the variable
experience? My guess is that this would happen if rpart used the loss matrix
while creating
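A hedged sketch of passing a loss matrix to rpart so that missing a churner is penalised more heavily than a false alarm; the data frame name and the loss values are illustrative.
library(rpart)
# rows = true class, columns = predicted class, zeros on the diagonal
L <- matrix(c(0, 1,    # true "no churn": calling it churn costs 1
              5, 0),   # true "churn": missing it costs 5
            nrow = 2, byrow = TRUE)
fit <- rpart(churn ~ ., data = customers, method = "class",
             parms = list(loss = L))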
2010 Aug 31
0
rpart - interpretation of results of tree on survival data
Hi All,
I am fitting a tree to censored survival data using the rpart package and
wanted to better understand the results.
I am trying to interpret the output from the tree. I am interested in
understanding what "yval" is for a survival tree. I see in the output of
summary the phrase "estimated rate". The estimated rate is 1 for the entire
tree, and more or less for each
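A small sketch showing where yval lives for a survival (method = "exp") tree, using the stagec data shipped with rpart; as I understand it, rpart rescales time so that the overall event rate is 1, and each node's yval is its estimated rate relative to that.
library(rpart)
library(survival)
fit <- rpart(Surv(pgtime, pgstat) ~ age + eet + grade, data = stagec)
fit$frame[, c("var", "n", "dev", "yval")]    # yval = estimated (relative) event rate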
2013 Jun 14
1
How to interactively create manually guided Decision Tree
I am new to using R. I want to know all about building decision tree models
in R.
A few options I have found are rpart and rattle for building a decision
tree. Both functions give me splits that are statistically
appropriate.
But I am not able to figure out how to change those splits as per my
business requirement.
For example: the automatic split of Age using rattle is > 30 and
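rpart and rattle do not let you edit a split interactively, but one hedged workaround is to impose the business cut yourself and let rpart grow a subtree within each branch; the cut at Age >= 40, the data frame d, and the target are illustrative.
library(rpart)
left  <- rpart(target ~ ., data = subset(d, Age <  40), method = "class")
right <- rpart(target ~ ., data = subset(d, Age >= 40), method = "class")
# or offer only the business cut (not raw Age) as a candidate split:
d$Age_40plus <- factor(d$Age >= 40)
fit <- rpart(target ~ . - Age, data = d, method = "class")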
2010 Aug 26
1
Decision tree and factor variables
Hello,
I'm building a decision tree in R with the rpart package. Modeling is
fine. But when it comes to scoring, I have the following issue:
factor 'cust_language' has new level(s) OT
I think this comes from the fact that when learning, the DT doesn't
see all the possible values of the factor variable cust_language. When
scoring, new values come and I get this error. However, it
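A minimal sketch of one common fix: align the scoring data's factor levels with those seen at training time, so unseen levels such as "OT" become NA (which rpart's surrogate splits can then handle) instead of raising an error; train, score and fit are illustrative names.
score$cust_language <- factor(score$cust_language,
                              levels = levels(train$cust_language))
pred <- predict(fit, newdata = score, type = "class")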
2012 May 21
1
Need Help in K-fold validation in Decision tree
Hi,
I have built a decision tree using rpart. I want to do k-fold validation on
the decision tree.
Could you help with how I can do that? Please tell me which package is required
for k-fold validation.
Regards,
Santosh
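Two hedged options: rpart already runs internal cross-validation (xval = 10 by default), which printcp()/plotcp() report as xerror, and for an explicit k-fold loop the caret package is a common choice; d and y are illustrative names.
library(rpart)
fit <- rpart(y ~ ., data = d, method = "class",
             control = rpart.control(xval = 10))
printcp(fit)                       # xerror = cross-validated error by tree size

library(caret)
cvfit <- train(y ~ ., data = d, method = "rpart",
               trControl = trainControl(method = "cv", number = 10))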
2016 Apr 13
0
Decision Tree and Random Forest
That's great that you are familiar, and thanks for responding. Have you ever
done what I am referring to? I have already spent time going through links
and tutorials about decision trees and random forests and have even used
them both before.
Mike
On Apr 13, 2016 5:32 PM, "Sarah Goslee" <sarah.goslee at gmail.com> wrote:
It sounds like you want classification or regression trees.
2016 Apr 14
3
Decision Tree and Random Forest
I still need the output to match the requirement in my original post, with decision rule "clusters" and probabilities attached to them. The examples are sort of similar, but you just provided links to general info about trees.
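For the record, a hedged sketch of pulling decision-rule "clusters" (terminal nodes) with class probabilities out of a fitted rpart classification tree; iris is used only as a stand-in data set.
library(rpart)
fit <- rpart(Species ~ ., data = iris, method = "class")
leaves <- rownames(fit$frame)[fit$frame$var == "<leaf>"]   # terminal node numbers
path.rpart(fit, nodes = as.numeric(leaves))                # rule path to each leaf
fit$frame$yval2[fit$frame$var == "<leaf>", ]               # per-leaf class counts and probabilities
# the rpart.plot package's rpart.rules() prints similar rules more readably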