Displaying 9 results from an estimated 9 matches similar to: "Rulefit with R and missing values"
2008 Feb 26
0
Cryptic error message using RuleFit
Hello LIST,
In using RuleFit I've been able to fit a model using rulefit without
incident. However, when trying to use intnull and interact, things go
astray.
> rf=rulefit(x,"N", cat.vars="H",
> not.used=c("G","T"),huber=0.9,path.speed=1);
[snip]
RuleFit model 2/26/2008 2:17p
ave(abs(error)) terms path steps
84.16 110
2010 Oct 07
1
R: rulefit error on Linux
R version 2.8.1 (2008-12-22) on Linux 64-bit
I am trying to run the 'rulefit' function (Rule-based Learning Ensembles), but I
get the following error -
> rulefit(x,y)
Warning: This program is an suid-root program or is being run by the root
user.
The full text of the error or warning message cannot be safely formatted
in this environment. You may get a more descriptive message by running
2006 Dec 20
2
RuleFit & quantreg: partial dependence plots; showing an effect
Dear List,
I would greatly appreciate help on the following matter:
The RuleFit program of Professor Friedman uses partial dependence plots
to explore the effect of an explanatory variable on the response
variable, after accounting for the average effects of the other
variables. The plot method [plot(summary(rq(y ~ x1 + x2,
tau=seq(.1,.9,.05))))] of Professor Koenker's quantreg program
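For reference, the quantreg idiom mentioned above runs like this (a minimal sketch, assuming the quantreg package is installed; the data are simulated here for illustration, since the poster's data are not shown):

```r
library(quantreg)

# Simulated data in which the effect of x1 varies across quantiles
set.seed(1)
n  <- 200
x1 <- runif(n); x2 <- runif(n)
y  <- 1 + 2 * x1 + x2 + (1 + 2 * x1) * rnorm(n)

# Fit a sequence of quantile regressions, then plot how each
# coefficient varies with the quantile (with confidence bands)
fits <- rq(y ~ x1 + x2, tau = seq(.1, .9, .05))
plot(summary(fits))
```

Each panel shows one coefficient as a function of tau, which is how quantreg displays an effect across the conditional distribution rather than a partial dependence plot.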
2005 Jun 06
1
Missing values in argument of .Fortran.
I wish to pass a vector ``y'', some of whose entries are NAs, to a
Fortran subroutine which I am dynamically loading and calling by
means of .Fortran(). The subroutine runs through the vector entry by
entry; obviously I want it to do one thing if y[i] is present
and a different thing if it is missing.
The way I am thinking of proceeding is along the lines of:
ymiss <- is.na(y)
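Following on from the line above, a pure-R sketch of the flag-vector approach (the subroutine name "mysub" and the .Fortran() call are illustrative, since the actual Fortran code is not shown; note that .Fortran() also accepts NAOK = TRUE to pass NAs through unchecked):

```r
# Flag-vector approach: record which entries are missing, replace the
# NAs with a harmless placeholder, and pass both vectors down.
y     <- c(1.5, NA, 3.0, NA, 7.2)
ymiss <- is.na(y)              # TRUE where y[i] is missing
ywork <- ifelse(ymiss, 0, y)   # placeholder value where missing

# The eventual call would look something like this ("mysub" is a
# hypothetical subroutine name; Fortran sees the mask as 0/1 integers):
# res <- .Fortran("mysub",
#                 y    = as.double(ywork),
#                 miss = as.integer(ymiss),
#                 n    = as.integer(length(y)))

# Pure-R stand-in for the subroutine's per-entry branch:
out <- ifelse(ymiss, -1, 2 * ywork)  # one thing if missing, another if present
```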
2012 Nov 26
0
Webinar signup: Advances in Gradient Boosting: the Power of Post-Processing. December 14, 10-11 a.m., PST
Webinar signup:
Advances in Gradient Boosting: the Power of Post-Processing
December 14, 10-11 a.m., PST
Webinar Registration:
http://2.salford-systems.com/gradientboosting-and-post-processing/
Course Outline:
* Gradient Boosting and Post-Processing:
o What is missing from Gradient Boosting?
o Why are post-processing techniques used?
* Applications Benefiting from
2012 Dec 13
0
Webinar: Advances in Gradient Boosting: the Power of Post-Processing. TOMORROW, 10-11 a.m., PST
Webinar: Advances in Gradient Boosting: the Power of Post-Processing
TOMORROW: December 14, 10-11 a.m., PST
Webinar Registration: http://2.salford-systems.com/gradientboosting-and-post-processing/
Course Outline:
I. Gradient Boosting and Post-Processing:
o What is missing from Gradient Boosting?
o Why are post-processing techniques used?
II. Applications Benefiting from Post-Processing:
2011 Jun 03
3
Not missing at random
Hello!
I would like to sample 30% of cases (with at least 1 value lower than 3) and
among them I want to set all values lower than 3 (within the selected cases) to NA
(NMAR: not missing at random). I managed to sample the cases, but I don't know how
to set the values (lower than 3) to NA.
R code:
x <-
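A base-R sketch of one way to finish the job, assuming x is a numeric data frame of cases in rows (the data here are simulated, since the original x is not shown):

```r
set.seed(42)  # simulated stand-in for the poster's data
x <- as.data.frame(matrix(sample(1:6, 100 * 5, replace = TRUE), nrow = 100))
x_orig <- x

# Cases with at least one value lower than 3
eligible <- which(apply(x < 3, 1, any))

# Sample 30% of those cases
picked <- sample(eligible, round(0.3 * length(eligible)))

# Within the sampled cases only, set every value lower than 3 to NA
x[picked, ][x[picked, ] < 3] <- NA
```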
2008 Feb 19
1
How to use BayesTree or RBF for predict
Hi all,
sorry for my English; it is not my first language.
I'm trying to use bart() and rbf(). The package I'm using now is
"BayesTree" and "neural", respectively. I could create the models, but I
can't predict my test data.
Does anyone have such an experience? Any advice is appreciated!
Thank you in advance!
André
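For BayesTree specifically, the stock answer on this list is that bart() does not keep the tree ensemble around, so there is no predict() method; the test data must be handed to bart() up front via its x.test argument, and the fitted object then carries the test predictions. A minimal sketch, assuming the BayesTree package is installed (data simulated for illustration):

```r
library(BayesTree)

# Toy data: train and test splits of a simple regression problem
set.seed(1)
x <- matrix(runif(200 * 3), ncol = 3)
y <- x[, 1] + rnorm(200, sd = 0.1)
train <- 1:150

# bart() has no predict() method: pass the test matrix at fitting time
fit <- bart(x.train = x[train, ], y.train = y[train],
            x.test  = x[-train, ], verbose = FALSE)

# Posterior-mean predictions for the test rows
head(fit$yhat.test.mean)
```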
2006 Apr 07
2
a statistics question
Hi there,
I have a statistics question on a classification problem:
Suppose I have 1000 binary variables and one binary dependent variable. I
want to find a way, similar to PCA, in which I can find a few
combinations of those variables that discriminate best according to the
dependent variable. It is not only for dimension reduction but, more
importantly, for finding the best way to construct
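One standard supervised counterpart to PCA for this question is linear discriminant analysis, which finds the linear combination of the predictors that best separates the two classes. A minimal sketch with MASS::lda (MASS ships with R; the data are simulated, with only the first three variables carrying signal, and p is kept small so the example runs quickly):

```r
library(MASS)

# Simulate binary predictors; only columns 1-3 are related to the class
set.seed(123)
n <- 300; p <- 50
cls <- rbinom(n, 1, 0.5)                 # binary dependent variable
x <- matrix(rbinom(n * p, 1, 0.5), n, p)
x[, 1:3] <- matrix(rbinom(n * 3, 1, ifelse(cls == 1, 0.8, 0.2)), n, 3)

# With two classes, lda() returns a single discriminant direction: the
# linear combination of the 0/1 variables that best separates them
fit    <- lda(x, grouping = cls)
scores <- as.vector(predict(fit, x)$x)   # the discriminating combination
pred   <- predict(fit, x)$class
```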