Displaying 20 results from an estimated 1000 matches similar to: "boosting - second posting"
2006 May 25
0
boosting
Hi
I am using boosting for a classification and prediction problem.
For some reason the predicted values do not fall between 0 and 1. I have
tried type="response", but it made no difference.
Can anyone see what I am doing wrong?
Screen output shown below:
> boost.model <- gbm(as.factor(train$simNuance) ~ ., # formula
+
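A likely cause, noted here for reference: with distribution = "bernoulli", gbm
expects a numeric 0/1 response rather than a factor, and predict() returns
log-odds unless type = "response" is passed along with n.trees. A minimal
sketch on made-up data (not the poster's):

library(gbm)

set.seed(1)
train <- data.frame(x1 = rnorm(200), x2 = rnorm(200))
train$y <- as.integer(train$x1 + rnorm(200) > 0)   # numeric 0/1, not a factor

boost.model <- gbm(y ~ x1 + x2, data = train,
                   distribution = "bernoulli",
                   n.trees = 500, interaction.depth = 2, shrinkage = 0.01)

# type = "link" (the default) returns log-odds, which need not lie in [0, 1];
# type = "response" maps them to probabilities
p <- predict(boost.model, newdata = train, n.trees = 500, type = "response")
range(p)   # within [0, 1]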
2010 Apr 26
3
R.GBM package
Hi Greg,
I am new to the gbm package. Can boosted decision trees be implemented
with 'gbm', or can 'gbm' only be used for regression?
If they can, do I need to combine the rpart and gbm commands?
Thanks so much!
--
Sincerely,
Changbin
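For reference: gbm fits boosted classification trees directly, so no rpart
step is needed. A minimal two-class sketch (the iris subset is illustrative
only):

library(gbm)

iris2 <- subset(iris, Species != "setosa")
iris2$y <- as.integer(iris2$Species == "versicolor")   # numeric 0/1 response

fit <- gbm(y ~ Sepal.Length + Sepal.Width + Petal.Length + Petal.Width,
           data = iris2,
           distribution = "bernoulli",   # logistic loss for classification
           n.trees = 200, interaction.depth = 3)

head(predict(fit, iris2, n.trees = 200, type = "response"))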
2009 Jul 10
1
help! Error in using Boosting...
Here is my code:
mygbm <- gbm.fit(y = mytraindata[, 1], x = mytraindata[, -1],
                 interaction.depth = 4, shrinkage = 0.001,
                 n.trees = 20000, bag.fraction = 1,
                 distribution = "bernoulli")
Here is the error:
Error in gbm.fit(y = mytraindata[, 1], x = mytraindata[, -1],
interaction.depth = 4, :
The dataset size is too small or subsampling rate is too large:
cRows*train.fraction*bag.fraction <=
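The check behind this message compares the rows available to each tree,
roughly nrow(x) * train.fraction * bag.fraction, against the per-node minimum
n.minobsinnode (default 10). A sketch of the usual workarounds, assuming
mytraindata as in the post:

library(gbm)

mygbm <- gbm.fit(y = mytraindata[, 1], x = mytraindata[, -1],
                 distribution = "bernoulli",
                 interaction.depth = 4, shrinkage = 0.001,
                 n.trees = 20000, bag.fraction = 1,
                 n.minobsinnode = 2)   # relax the per-node minimum for small data

# alternatives: supply more rows, or reduce interaction.depth so each tree
# needs fewer observations per node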
2009 Jun 17
1
gbm for cost-sensitive binary classification?
I recently used gbm for a binary classification problem. As expected, it gets very good results based on the area under the ROC curve with 7-fold cross-validation. However, the application (malware detection) is cost-sensitive: a false positive (classifying a clean sample as dirty) is much worse than a false negative (missing a dirty sample). I would like to tune the gbm model to bias it toward a very low FP rate.
For this
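One common approach, sketched below under assumed names ('train' with a 0/1
column y, and the 10:1 cost ratio, are placeholders): up-weight the class
whose errors are expensive via gbm's weights argument, and/or raise the
probability threshold for flagging a sample as dirty.

library(gbm)

w <- ifelse(train$y == 0, 10, 1)   # clean samples weighted 10x: FPs cost more

fit <- gbm(y ~ ., data = train,
           distribution = "bernoulli",
           weights = w,
           n.trees = 1000, shrinkage = 0.01)

p <- predict(fit, train, n.trees = 1000, type = "response")
flag.dirty <- p > 0.9   # a stricter cutoff than 0.5 further lowers the FP rate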
2013 Mar 24
3
Parallelizing GBM
Dear All,
I am far from being a guru about parallel programming.
Most of the time, I rely on randomForest for data mining large datasets.
I would also like to try the gradient boosting methods in GBM,
but I have a need for parallelization.
I normally rely on gbm.fit for speed reasons, and I usually call it this
way
gbm_model <- gbm.fit(trainRF,prices_train,
offset = NULL,
misc =
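Boosting itself is sequential (each tree is fit to the residuals of the
previous ones), so a single gbm.fit call is hard to parallelize; independent
fits, e.g. over a tuning grid or CV folds, parallelize well. A sketch with
foreach/doParallel, assuming trainRF and prices_train as in the post:

library(gbm)
library(doParallel)

cl <- makeCluster(4)
registerDoParallel(cl)

shrink.grid <- c(0.1, 0.05, 0.01, 0.005)
models <- foreach(s = shrink.grid, .packages = "gbm") %dopar% {
  gbm.fit(trainRF, prices_train,
          distribution = "gaussian",
          n.trees = 2000, shrinkage = s, verbose = FALSE)
}

stopCluster(cl)

# recent versions of the formula interface gbm() also accept cv.folds and
# n.cores, which runs the cross-validation folds in parallel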
2013 Jun 23
1
Which is the final model for a Boosted Regression Trees (GBM)?
Hi R User,
I was trying to find the final model in the following example using boosted regression trees (GBM). The program gives the fitted values, but I wanted to calculate a fitted value by hand to understand the model in depth. Would you give me some hints on what the final model is for this example?
Thanks
KG
-------
The following script I used
#-----------------------
library(dismo)
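For reference, a boosted regression tree prediction is built additively: the
initial value fit$initF plus, for each tree, the terminal-node value the
observation falls into (already scaled by the learning rate), all on the link
scale. A sketch using the dismo example data, so the pieces can be checked by
hand:

library(dismo)   # provides gbm.step and loads gbm

data(Anguilla_train)
fit <- gbm.step(data = Anguilla_train, gbm.x = 3:13, gbm.y = 2,
                family = "bernoulli", tree.complexity = 3,
                learning.rate = 0.01, bag.fraction = 0.5)

fit$initF                         # the intercept term, on the link scale
pretty.gbm.tree(fit, i.tree = 1)  # splits and node values of the first tree

# summing initF and the matched terminal-node values over all trees
# reproduces the link-scale prediction:
predict(fit, Anguilla_train[1, ], n.trees = fit$n.trees)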
2010 Feb 28
1
Gradient Boosting Trees with correlated predictors in gbm
Dear R users,
I’m trying to understand how correlated predictors impact the Relative
Importance measure in Stochastic Boosting Trees (J. Friedman). As Friedman
described “…with single decision trees (referring to Breiman’s CART
algorithm), the relative importance measure is augmented by a strategy
involving surrogate splits intended to uncover the masking of influential
variables by others
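The effect is easy to reproduce: when two predictors carry the same signal,
gbm's relative influence tends to be split between them rather than
concentrated on one. A small illustration, not from the thread:

library(gbm)

set.seed(42)
n  <- 1000
x1 <- rnorm(n)
x2 <- x1 + rnorm(n, sd = 0.1)   # nearly a copy of x1
x3 <- rnorm(n)                  # pure noise
y  <- x1 + rnorm(n)

fit <- gbm(y ~ x1 + x2 + x3, data = data.frame(x1, x2, x3, y),
           distribution = "gaussian",
           n.trees = 500, interaction.depth = 2, shrinkage = 0.05)

summary(fit, n.trees = 500)   # relative influence is shared between x1 and x2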
2005 Jul 12
1
SOS Boosting
Hi,
I am trying to implement the AdaBoost.M1 algorithm as described in
"The Elements of Statistical Learning", p. 301.
I don't use Dettling's library "boost" because:
- I don't understand the difference between LogitBoost and L2Boost
- I'd like to use larger trees than stumps.
By using the option weights set to (1/n, 1/n, ..., 1/n) in rpart or tree
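For what it's worth, AdaBoost.M1 as described in ESL (Algorithm 10.1) is
short to write with rpart as the weak learner. A sketch, with y coded -1/+1
and tree size controlled via maxdepth rather than stumps:

library(rpart)

adaboost.m1 <- function(x, y, M = 50, maxdepth = 3) {
  n <- nrow(x)
  w <- rep(1 / n, n)                       # the (1/n, ..., 1/n) start weights
  dat <- data.frame(x, y = factor(y))
  trees <- vector("list", M)
  alpha <- numeric(M)
  for (m in 1:M) {
    fit <- rpart(y ~ ., data = dat, weights = w,
                 control = rpart.control(maxdepth = maxdepth, cp = 0))
    pred <- as.numeric(as.character(predict(fit, dat, type = "class")))
    err <- sum(w * (pred != y)) / sum(w)
    if (err == 0 || err >= 0.5) { M <- m - 1; break }
    alpha[m] <- log((1 - err) / err)
    w <- w * exp(alpha[m] * (pred != y))   # up-weight misclassified points
    w <- w / sum(w)
    trees[[m]] <- fit
  }
  list(trees = trees[seq_len(M)], alpha = alpha[seq_len(M)])
}

# final classifier: weighted vote of the trees
predict.adaboost.m1 <- function(model, newdata) {
  votes <- sapply(seq_along(model$trees), function(m)
    model$alpha[m] *
      as.numeric(as.character(predict(model$trees[[m]], newdata, type = "class"))))
  sign(rowSums(votes))
}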
2017 Dec 14
0
Distributions for gbm models
On page 409 of "Applied Predictive Modeling" by Max Kuhn, it states
that the gbm function can accommodate only two-class problems when
referring to the distribution parameter.
From the gbm help on the distribution parameter:
Currently available options are "gaussian" (squared error),
"laplace" (absolute loss), "tdist" (t-distribution
2009 Apr 14
3
Problem cross-compiling on Ubuntu
I'm using Ubuntu 8.10 (Intrepid Ibex) and R 2.7.1.
I've built a package from source (a modified version of gbm) and it
contains some C++ code. I now want to cross-compile it to get a
Windows version.
I installed R using
sudo apt-get update
sudo apt-get install r-base
sudo apt-get install r-base-dev
So far as I can tell, I've also followed all the instructions in the
guide
2008 Sep 18
1
caret package: arguments passed to the classification or regression routine
Hi,
I am having problems passing arguments to method="gbm" using the train()
function.
I would like to train gbm using the laplace distribution or the quantile
distribution.
Here is the code I used and the error:
gbm.test <- train(x.enet, y.matrix[, 7],
                  method = "gbm",
                  distribution = list(name = "quantile", alpha = 0.5),
                  verbose = FALSE,
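In caret, extra arguments in train()'s '...' are forwarded to the underlying
fit, so the distribution list should reach gbm directly in a reasonably recent
caret; a self-contained sketch:

library(caret)
library(gbm)

set.seed(1)
x <- matrix(rnorm(200 * 5), ncol = 5)
colnames(x) <- paste0("x", 1:5)
y <- x[, 1] + rnorm(200)

gbm.test <- train(x, y,
                  method = "gbm",
                  distribution = list(name = "quantile", alpha = 0.5),
                  verbose = FALSE,
                  trControl = trainControl(method = "cv", number = 3))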
2005 Apr 25
1
Failed to install gbm_1.4-2 (PR#7814)
Full_Name: The Manager
Version: 2.0.1
OS: Solaris 9
Submission from: (NULL) (129.67.80.243)
> install.packages("gbm")
trying URL `http://cran.uk.r-project.org/src/contrib/PACKAGES'
Content type `text/plain; charset=ISO-8859-1' length 52975 bytes
opened URL
==================================================
downloaded 51Kb
trying URL
2014 Jul 02
0
How do I call a C++ function (for k-means) within R?
I am trying to call a C++ k-means function within R and I am struggling. I
know that the code below is used to call a C++ function for gbm, but how do
I do it for k-means?
gbm.obj <- .Call("gbm",
Y=as.double(y),
Offset=as.double(offset),
X=as.double(x),
X.order=as.integer(x.order),
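.Call() only works for compiled code already registered with R, which the gbm
package does internally; for ad-hoc C++ such as a k-means helper, inline
compilation via Rcpp is usually the simpler route. A sketch (euclDist is a
made-up example, not gbm's API):

library(Rcpp)

cppFunction('
NumericVector euclDist(NumericMatrix x, NumericVector center) {
  int n = x.nrow(), p = x.ncol();
  NumericVector d(n);
  for (int i = 0; i < n; ++i) {
    double s = 0.0;
    for (int j = 0; j < p; ++j) {
      double diff = x(i, j) - center[j];
      s += diff * diff;
    }
    d[i] = std::sqrt(s);
  }
  return d;
}')

# distance of each row of x to one candidate k-means centre, computed in C++
x <- matrix(rnorm(20), ncol = 2)
euclDist(x, c(0, 0))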
2012 Jul 23
1
mboost vs gbm
I'm attempting to fit boosted regression trees to a censored response using
IPCW weighting. I've implemented this through two libraries, mboost and
gbm, which I believe should yield models that would perform comparably.
This, however, is not the case - mboost performs much better. This seems
odd. This issue is meaningful since the output of this regression needs to
be implemented in a
2005 Feb 18
2
gbm
Hi, there:
I keep running into scalability issues with some R packages. This
time, I am trying gbm to do AdaBoost on my project. Initially I
tried to grow trees using rpart on a dataset with 200 variables and
30,000 observations. Now I am wondering whether I can apply AdaBoost
to it.
Is there anyone who has done something similar and can provide some
sample code? Also any
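gbm's "adaboost" distribution fits the exponential loss directly, so no rpart
pre-step is needed. A sketch under assumed names ('mydata' stands in for the
30,000 x 200 dataset, with y coded 0/1):

library(gbm)

fit <- gbm(y ~ ., data = mydata,
           distribution = "adaboost",   # exponential (AdaBoost) loss
           n.trees = 1000, shrinkage = 0.05,
           interaction.depth = 2,
           bag.fraction = 0.5)          # subsampling also speeds up each tree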
2002 Jun 23
2
AdaBoost for R
I'm going to implement the AdaBoost algorithm in R. I just wanted to
ensure that there is no existing implementation of any boosting algorithm
in R... I don't want to reinvent the wheel...
2012 Sep 16
2
Where is the R configuration file or how to override R compilers
How can one modify or override the compilers that R uses for package
installations? Or is this configuration perhaps in some editable file
somewhere?
Initially I built the version of R 2.15.1 on Solaris SPARC (virtual T4),
but found out the build was done as 32 bit. After some research, I
found that the pre-compiled GCC version I had only allowed for 32 bit.
I wanted
2003 Jul 14
0
package announcement: Generalized Boosted Models (gbm)
Generalized Boosted Models (gbm)
This package implements extensions to Y. Freund and R. Schapire's AdaBoost
algorithm and J. Friedman's gradient boosting machine (aka MART, multiple
additive regression trees). It includes regression methods for least
squares, absolute loss, logistic, Poisson, Cox proportional hazards/partial
likelihood, and the AdaBoost exponential loss. It handles
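A minimal usage sketch for the package as it now exists on CRAN (data and
settings are illustrative):

library(gbm)

set.seed(1)
df <- data.frame(x = runif(500))
df$y <- sin(4 * df$x) + rnorm(500, sd = 0.3)

fit <- gbm(y ~ x, data = df,
           distribution = "gaussian",   # least-squares loss
           n.trees = 2000, shrinkage = 0.01, cv.folds = 5)

best <- gbm.perf(fit, method = "cv")    # choose the tree count by CV
pred <- predict(fit, df, n.trees = best)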
2008 Sep 22
1
gbm error
Good afternoon
Has anyone tried using Dr. Elith's BRT script? I cannot seem to run
gbm.step from the installed gbm package. Is it something external to gbm?
When I run the script itself
<- gbm.step(data=model.data,
gbm.x = colx:coly,
gbm.y = colz,
family = "bernoulli",
tree.complexity = 5,
learning.rate = 0.01,
bag.fraction = 0.5)
... I
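gbm.step is indeed external to gbm: it originated in the tutorial script
accompanying Elith, Leathwick & Hastie (2008) and now ships with the dismo
package, which must be installed and loaded first. A sketch mirroring the
call above ('model.fit' is a placeholder name; model.data and the column
indices are from the post):

install.packages("dismo")
library(dismo)   # provides gbm.step and loads gbm

model.fit <- gbm.step(data = model.data,
                      gbm.x = colx:coly,
                      gbm.y = colz,
                      family = "bernoulli",
                      tree.complexity = 5,
                      learning.rate = 0.01,
                      bag.fraction = 0.5)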