search for: rulefit

Displaying 8 results from an estimated 8 matches for "rulefit".

2011 Mar 23
0
Rulefit with R and missing values
Hi, I'm using R to process a table (with a lot of missing values) with RuleFit. The problem occurs when I use the rfmod command. I don't know how to deal with the error message, and I can't tell where "true" or "false" is missing. Can someone help me? Thanks. The following is the script I used, with the error at the end. platform = "wi...
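A minimal base-R sketch for locating the missing values before the rulefit call (not from the original thread; it assumes x is the predictor table and uses no RuleFit-specific arguments):

    # count missing values per column of the predictor table x
    na.per.col <- colSums(is.na(x))
    na.per.col[na.per.col > 0]

    # rows that contain at least one NA
    which(rowSums(is.na(x)) > 0)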
2008 Feb 26
0
Cryptic error message using RuleFit
Hello LIST, I've been able to fit a model using rulefit without incident. However, when trying to use intnull and interact, things go astray. > rf=rulefit(x,"N", cat.vars="H", > not.used=c("G","T"),huber=0.9,path.speed=1); [snip] RuleFit model 2/26/2008 2:17p av...
2006 Dec 20
2
RuleFit & quantreg: partial dependence plots; showing an effect
Dear List, I would greatly appreciate help on the following matter: The RuleFit program of Professor Friedman uses partial dependence plots to explore the effect of an explanatory variable on the response variable, after accounting for the average effects of the other variables. The plot method [plot(summary(rq(y ~ x1 + x2, t=seq(.1,.9,.05))))] of Professor Koenker's quan...
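A minimal, self-contained version of the quantreg plot call mentioned above (using the full argument name tau and the engel data shipped with the quantreg package; this is an illustrative sketch, not the poster's own data):

    library(quantreg)
    data(engel)  # household food expenditure vs. income, included with quantreg

    # fit quantile regressions over a grid of quantiles and plot the
    # coefficient estimates (with confidence bands) as a function of tau
    fits <- rq(foodexp ~ income, tau = seq(0.1, 0.9, by = 0.05), data = engel)
    plot(summary(fits))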
2010 Oct 07
1
R: rulefit error on Linux
R version 2.8.1 (2008-12-22) on Linux 64-bit I am trying to run the 'rulefit' function (rule-based learning ensembles), but I get the following error: > rulefit(x,y) Warning: This program is an suid-root program or is being run by the root user. The full text of the error or warning message cannot be safely formatted in this environment. You may get a more descript...
2008 Feb 19
1
How to use BayesTree or RBF for predict
Hi all, sorry for my English; it is not my native language. I'm trying to use bart() and rbf() from the "BayesTree" and "neural" packages, respectively. I could create the models, but I can't predict on my test data. Does anyone have experience with this? Any advice is appreciated! Thank you in advance. André
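One possible sketch for the BayesTree part (assuming a numeric response and matrices or data frames x.train / x.test; BayesTree's bart() takes the test set at fit time rather than providing a separate predict() method):

    library(BayesTree)

    # supply x.test when fitting: bart() scores the test rows during sampling
    fit <- bart(x.train = x.train, y.train = y.train, x.test = x.test)

    # posterior mean predictions for the test rows (continuous response)
    test.pred <- fit$yhat.test.mean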
2012 Nov 26
0
Webinar signup: Advances in Gradient Boosting: the Power of Post-Processing. December 14, 10-11 a.m., PST
...ental o Manufacturing o Adserving * Typical Post-Processing Steps * Techniques: o Generalized Path Seeker (GPS): modern high-speed LASSO-style regularized regression. o Importance Sampled Learning Ensembles (ISLE): identify and reweight the most influential trees. o Rulefit: ISLE on "steroids." Identify the most influential nodes and rules. * Case Study Example: o Output/Results without Post-Processing o Output/Results with Post-Processing o Demo
2012 Dec 13
0
Webinar: Advances in Gradient Boosting: the Power of Post-Processing. TOMORROW, 10-11 a.m., PST
...es o Biomedical o Environmental o Manufacturing o Adserving III. Typical Post-Processing Steps IV. Techniques: o Generalized Path Seeker (GPS): modern high-speed LASSO-style regularized regression. o Importance Sampled Learning Ensembles (ISLE): identify and reweight the most influential trees. o Rulefit: ISLE on "steroids." Identify the most influential nodes and rules. V. Case Study Example: o Output/Results without Post-Processing o Output/Results with Post-Processing o Demo
2006 Apr 07
2
a statistics question
Hi there, I have a statistics question on a classification problem: Suppose I have 1000 binary variables and one binary dependent variable. I want to find a way, similar to PCA, in which I can find a few combinations of those variables that discriminate best with respect to the dependent variable. It is not only for dimension reduction but, more importantly, for finding the best way to construct
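One common supervised alternative to PCA for this kind of problem (not proposed in the thread itself) is penalized logistic regression, e.g. the lasso via the glmnet package; the sketch below assumes x is a 1000-column 0/1 matrix and y is the binary response:

    library(glmnet)

    # sparse (lasso) logistic regression: selects a small subset of the 1000
    # binary predictors and weights them into one discriminating linear combination
    cvfit <- cv.glmnet(x, y, family = "binomial", alpha = 1)

    # nonzero coefficients at the cross-validated penalty
    coef(cvfit, s = "lambda.min")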