similar to: helpful functions in R for testing results of tree ("party")

Displaying 20 results from an estimated 50000 matches similar to: "helpful functions in R for testing results of tree ("party")"

2009 Feb 27
0
[SoC09-Idea] Party On!
Hi Manuel, find our SoC proposal below. Best wishes, Torsten & Achim _______________________________________________________________________ Party On! New Recursive Partytioning Tools. Mentor: Torsten Hothorn & Achim Zeileis Short Description: The aim of the project is the implementation of recursive partitioning methods ("trees") which aren't available in R at the
2010 Jul 22
1
decision tree with weighted inputs
I'd like to train a decision tree on a set of weighted data points. I looked into the rpart package, which builds trees but doesn't seem to offer the capability of weighting inputs. (There is a weights parameter, but it seems to correspond to output classes rather than to input points). I'm making do for now by preprocessing my input data by adding multiple instances of each data
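As far as I can tell from ?rpart, the weights argument actually takes per-observation case weights (class priors go in parms). A minimal sketch on built-in data; the weights here are made up purely for illustration:

library(rpart)
set.seed(1)
w <- runif(nrow(iris), 0.5, 2)            # one case weight per observation
fit <- rpart(Species ~ ., data = iris,
             weights = w,                 # case weights, not class priors
             method = "class")
print(fit)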
2010 Oct 21
1
Accuracy/Goodness of fit of nnet
Hi R-Helpers, I am working with the nnet package. multinom() has an option for assessing goodness of fit by reporting the AIC value. Does nnet also give some value to determine accuracy? If not, can you guide me to some procedure for figuring out the accuracy/goodness of fit of an nnet model? Thanks in advance.
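As far as I know nnet does not report an AIC directly, but a hold-out confusion matrix gives a simple accuracy measure. A minimal sketch on built-in data (the size and decay values are illustrative):

library(nnet)
set.seed(1)
idx   <- sample(nrow(iris), 100)
train <- iris[idx, ]
test  <- iris[-idx, ]
fit   <- nnet(Species ~ ., data = train, size = 4, decay = 0.01,
              maxit = 500, trace = FALSE)
pred  <- predict(fit, newdata = test, type = "class")
mean(pred == test$Species)     # hold-out accuracy
table(pred, test$Species)      # confusion matrix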
2009 Jan 24
1
FW: [R] The Quality & Accuracy of R
Dear R Developers, This is my first time subscribing to this list, so let me start out by saying thank you all very much for the incredible contribution you have made to science through your work on R. As you all know, many users of commercial stat packages, their managers, directors, CIOs, etc. are skeptical of R's quality/accuracy. And as the recent NY Times article demonstrated, the
2009 Jan 23
2
The Quality & Accuracy of R
Hi All, We have all had to face skeptical colleagues asking if software made by volunteers could match the quality and accuracy of commercially written software. Thanks to the prompting of a recent R-help thread, I read "R: Regulatory Compliance and Validation Issues, A Guidance Document for the Use of R in Regulated Clinical Trial Environments" (http://www.r-project.org/doc/R-FDA.pdf).
2011 Mar 17
1
generalized linear mixed models: glmmPQL and glmer give very different results, and neither fits the data well...
Hi, I have the following type of data: 86 subjects in three independent groups (high power vs low power vs control). Each subject solves 8 reasoning problems of two kinds: conflict problems and no-conflict problems. I measure accuracy in solving the reasoning problems. To summarize: binary response, one within-subject variable (TYPE), one between-subject variable (POWER). I wanted to fit the following model:
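A minimal sketch of the kind of model described, on simulated data; every variable name here (ACC, TYPE, POWER, subject) is an assumption read off the description above:

library(lme4)
set.seed(1)
# simulated stand-in for the described design
dat <- expand.grid(subject = factor(1:86),
                   TYPE    = c("conflict", "noconflict"),
                   rep     = 1:4)
dat$POWER <- c("high", "low", "control")[as.integer(dat$subject) %% 3 + 1]
dat$ACC   <- rbinom(nrow(dat), 1, 0.7)   # fake binary accuracy
fit <- glmer(ACC ~ TYPE * POWER + (1 | subject),
             data = dat, family = binomial)
summary(fit)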
2007 Jan 29
3
comparing random forests and classification trees
Hi, I have done an analysis using 'rpart' to construct a classification tree. I want to retain the output in tree form so that it is easily interpretable. However, I want to compare the 'accuracy' of the tree to a random forest to estimate how much predictive ability is lost by using one simple tree. My understanding is that the error automatically displayed by the two
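One untested way to put the two on the same footing is a shared hold-out set, since the cross-validated error printed by rpart and the OOB error from randomForest are estimated differently. A minimal sketch on built-in data:

library(rpart)
library(randomForest)
set.seed(1)
idx   <- sample(nrow(iris), 100)
train <- iris[idx, ]
test  <- iris[-idx, ]
tree  <- rpart(Species ~ ., data = train, method = "class")
rf    <- randomForest(Species ~ ., data = train)
mean(predict(tree, test, type = "class") == test$Species)  # single-tree accuracy
mean(predict(rf, test) == test$Species)                    # forest accuracy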
2006 Feb 24
0
New `party' tools
Dear useRs, Version 0.8-1 of the `party' package will appear on CRAN and its mirrors in due course. This version implements two new tools: o `mob', an object-oriented implementation of a recently suggested algorithm for model-based recursive partitioning (Zeileis, Hothorn, Hornik, 2005), has been added. It works out of the box for partitioning (generalized) linear
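A minimal sketch of mob(), loosely adapted from the package documentation (it needs the mlbench package for the example data); the formula and options are illustrative, not a recommendation:

library(party)
data("BostonHousing", package = "mlbench")
# a linear model for medv in each node, partitioned over the remaining covariates
fm <- mob(medv ~ lstat + rm | zn + indus + chas + nox + age + dis +
            rad + tax + crim + b + ptratio,
          data = BostonHousing, control = mob_control(minsplit = 40),
          model = linearModel)
plot(fm)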
2010 Mar 26
1
how to measure accuracy of regression tree?
Hello, for constructing a regression tree I am using the rpart function. Now, after dividing the dataset into training and testing sets, I'm using predict for forecasting. How do I measure the accuracy of the predicted data? Thanks and Regards, Vibha.
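A minimal sketch of one common approach, RMSE on the held-out part; the simulated data is only there to make the example self-contained:

library(rpart)
set.seed(1)
dat <- data.frame(x1 = runif(200), x2 = runif(200))
dat$y <- 5 * (dat$x1 > 0.5) + 2 * dat$x2 + rnorm(200)
idx   <- sample(nrow(dat), 150)
train <- dat[idx, ]
test  <- dat[-idx, ]
fit  <- rpart(y ~ x1 + x2, data = train, method = "anova")
pred <- predict(fit, newdata = test)
sqrt(mean((test$y - pred)^2))   # RMSE on the hold-out set
cor(test$y, pred)^2             # squared correlation as an R^2-style summary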
2010 Aug 31
0
rpart - interpretation of results of tree on survival data
Hi All, I am fitting a tree to censored survival data using the rpart package and wanted to better understand the results. I am trying to interpret the output from the tree. I am interested in understanding what "yval" is for a survival tree. I see in the output of summary the phrase "estimated rate". The estimated rate is 1 for the entire tree, and more or less for each
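For reference, a minimal sketch on the lung data from the survival package; as I understand the rpart technical report, yval is the event rate in each node relative to the root node, which is why the root shows 1:

library(rpart)
library(survival)
fit <- rpart(Surv(time, status) ~ age + sex + ph.ecog, data = lung)
# frame$yval holds the estimated relative event rate per node
fit$frame[, c("n", "yval")]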
2009 Jul 15
2
storing lm() results and other objects in a list
To clean up some code I would like to make a list of arbitrary length to store various objects for use in a loop. Sample code: ############ BEGIN SAMPLE ############## # You can see the need for a loop already linearModel1=lm(modelSource ~ .,mcReg) linearModel2=step(linearModel1) linearModel3=lm(modelSource ~ .-1,mcReg) linearModel4=step(linearModel3) #custom linearModel5=lm(modelSource ~ .
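A minimal sketch of the list-plus-loop idea, with mtcars standing in for the poster's mcReg data and modelSource response:

models <- list()
models$full       <- lm(mpg ~ .,     data = mtcars)
models$step_full  <- step(models$full, trace = 0)
models$no_int     <- lm(mpg ~ . - 1, data = mtcars)
models$step_noint <- step(models$no_int, trace = 0)
# loop over the stored fits by name
for (nm in names(models)) {
  cat(nm, ": adjusted R^2 =", summary(models[[nm]])$adj.r.squared, "\n")
}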
2016 Apr 14
3
Decision Tree and Random Forest
I still need the output to match the requirement in my original post, with decision-rule "clusters" and probabilities attached to them. The examples are sort of similar. You just provided links to general info about trees.
2012 Jan 19
1
ctree question
Hello. I have used the "party" package to generate a regression tree as follows: >origdata<-read.csv("origdata.csv") >ctrl<-ctree_control(mincriterion=0.99,maxdepth=10,minbucket=10) >test.ct<-ctree(Y~X1+X2+X3,data=origdata,control=ctrl) The above works fine. Orig data was my training data. I now have a test data file (testdata), and
2003 Jul 23
3
Boosting, bagging and bumping. Questions about R tools and predictions.
I'm interested in further understanding the differences in using many classification trees to improve classification rates. I'm also interested in finding out what I can do in R and which methods allow prediction. Can anybody point me to a citation or discussion? Specifically, I want to classify remotely sensed imagery where training data on class membership is extracted by the user.
2011 Jan 11
0
Some questions concerning survival tree analysis using the rpart module
All the documentation that I have on survival splitting is found in the technical report you mention. However, there is both a short form and a long form of this on our web site; did you get the larger one (52 pages)? I admit it is not a lot. There are no other split algorithms implemented for survival data. It would be possible to add your own. Attached is a slightly updated version of the
2015 Feb 23
0
[Mesa-dev] [PATCH 2/2] nvc0/ir: improve precision of double RCP/RSQ results
Oh right. I think the NVIDIA blob executes those steps conditionally based on the upper bits not being 0x7ff (== infinity/nan). I should do the same thing here. [FWIW I was able to test the nv50 code last night and that one's a total fail for rcp/rsq... will need to port that over to my nvc0 and debug there.] On Mon, Feb 23, 2015 at 8:24 AM, Roland Scheidegger <sroland at vmware.com>
2012 Jun 15
0
argument "x" is missing, with no default - Please help find argument x
R programming question, not machine learning, although that's the content. Apologies to all for whom the following code is eye-burning. I am using foreach() to run a simulation on a randomForest model (actually conditional randomForest ... "party" package). The simulation is in two dimensions, examining how "mtry" and "ntrees" are related in terms of predictive
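A minimal, untested sketch of iterating over a two-dimensional grid of mtry and ntree with foreach; %do% keeps it runnable without a parallel backend (swap in %dopar% once one is registered), and iris stands in for the poster's data:

library(foreach)
library(party)
grid <- expand.grid(mtry = c(2, 3), ntree = c(100, 300))  # illustrative values
res <- foreach(i = seq_len(nrow(grid)), .combine = rbind,
               .packages = "party") %do% {
  cf  <- cforest(Species ~ ., data = iris,
                 controls = cforest_unbiased(ntree = grid$ntree[i],
                                             mtry  = grid$mtry[i]))
  acc <- mean(predict(cf, OOB = TRUE) == iris$Species)   # OOB accuracy
  data.frame(grid[i, ], oob_acc = acc)
}
res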
2010 Nov 09
1
randomForest parameters for image classification
I am implementing an image classification algorithm using the randomForest package. The training data consists of 31000+ training cases over 26 variables, plus one factor variable (the training class). The main issue I am encountering is very low overall classification accuracy (a lot of confusion between classes). However, I know from other classifications (including a regular decision
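A few of the usual knobs, sketched on built-in data (iris stands in for the 26-band image data; the values are illustrative): the OOB confusion matrix shows which classes are being confused, and variable importance shows which bands carry the signal.

library(randomForest)
set.seed(1)
fit <- randomForest(Species ~ ., data = iris,
                    ntree = 500,    # more trees stabilises the OOB estimate
                    mtry  = 2,      # variables tried at each split
                    importance = TRUE)
fit$confusion      # OOB confusion matrix with per-class error
varImpPlot(fit)    # which variables drive the classification
plot(fit)          # OOB error as a function of the number of trees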
2016 Apr 15
0
Decision Tree and Random Forest
Since you only have 3 predictors, each categorical with a small number of categories, you can use expand.grid to make a data.frame containing all possible combinations and give that to the predict method for your model to get all possible predictions. Something like the following untested code. newdata <- expand.grid( Humidity = levels(Humidity), #(High, Medium,Low)
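Filling in the rest of the idea with made-up predictor names (Wind and Outlook are pure placeholders; only Humidity comes from the thread) and a toy rpart fit so the sketch runs end to end:

library(rpart)
set.seed(1)
# toy training data standing in for the poster's three categorical predictors
train <- data.frame(
  Humidity = sample(c("High", "Medium", "Low"),     60, replace = TRUE),
  Wind     = sample(c("Strong", "Weak"),            60, replace = TRUE),
  Outlook  = sample(c("Sunny", "Overcast", "Rain"), 60, replace = TRUE))
train$Play <- factor(ifelse(train$Humidity == "High" & train$Wind == "Strong",
                            "No", "Yes"))
fit <- rpart(Play ~ ., data = train, method = "class", minsplit = 5)
# every combination of the predictors, plus the predicted class probability
newdata <- expand.grid(Humidity = unique(train$Humidity),
                       Wind     = unique(train$Wind),
                       Outlook  = unique(train$Outlook))
newdata$prob_yes <- predict(fit, newdata, type = "prob")[, "Yes"]
newdata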
2012 Aug 23
0
party package: ctree - survival data - extracting statistics/predictors
Dear R users, I am trying to apply the analysis described in a paper to the data I'm working with. The data is: 80 patients for which I have survival data (time in days, and a binary event), and microarray expression data for 200 genes (continuous predictor variables). My data matrix "data.test" has ncol: 202 and nrow: 80. What I want to do is: - run recursive partitioning on
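A minimal, untested sketch of the first step on simulated data (5 fake genes instead of 200; all column names are made up):

library(party)
library(survival)
set.seed(1)
n    <- 80
expr <- as.data.frame(matrix(rnorm(n * 5), n, 5,
                             dimnames = list(NULL, paste0("gene", 1:5))))
dat  <- data.frame(time  = rexp(n, rate = 0.1),    # fake survival times
                   event = rbinom(n, 1, 0.7),      # fake event indicator
                   expr)
fit <- ctree(Surv(time, event) ~ gene1 + gene2 + gene3 + gene4 + gene5,
             data = dat)
print(fit)   # split variables and cutpoints appear in the printed tree
plot(fit)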