similar to: gbm package: plotting a single tree

Displaying 20 results from an estimated 6000 matches similar to: "gbm package: plotting a single tree"

2010 Apr 26
3
R.GBM package
Hi Dear Greg, I am new to the GBM package. Can boosted decision trees be implemented in the 'gbm' package, or can 'gbm' only be used for regression? If they can, do I need to combine the rpart and gbm commands? Thanks so much! -- Sincerely, Changbin
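For reference, a minimal sketch of boosted classification trees fitted directly with gbm (no rpart needed); the data frame and column names here are made up for illustration:

library(gbm)

## toy data: 0/1 response y and two numeric predictors
set.seed(1)
d <- data.frame(x1 = rnorm(200), x2 = rnorm(200))
d$y <- as.integer(d$x1 + rnorm(200) > 0)

## distribution = "bernoulli" grows boosted classification trees;
## gbm builds its own trees internally, so rpart is not combined with it
fit <- gbm(y ~ x1 + x2, data = d, distribution = "bernoulli",
           n.trees = 500, interaction.depth = 2, shrinkage = 0.01)

## predicted class probabilities
p <- predict(fit, newdata = d, n.trees = 500, type = "response")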
2005 Jan 12
4
gbm
Hi there: I am wondering if I can find some detailed explanation of gbm, or explanations of gbm examples. Thanks, Ed
2013 Jun 23
1
Which is the final model for a Boosted Regression Trees (GBM)?
Hi R users, I was trying to find the final model in the following example using boosted regression trees (GBM). The program gives the fitted values, but I wanted to calculate the fitted values by hand to understand the method in depth. Would you give me some hints on what the final model is for this example? Thanks KG ------- The following script I used #----------------------- library(dismo)
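Roughly speaking, the final model of a gbm/BRT fit is the initial constant (stored in fit$initF) plus the sum of the individual tree contributions, each already scaled by the learning rate. A small sketch on made-up data (the objects in the dismo example will differ) showing that the fitted value can be rebuilt tree by tree:

library(gbm)

set.seed(1)
d <- data.frame(x = rnorm(200))
d$y <- sin(d$x) + rnorm(200, sd = 0.3)

fit <- gbm(y ~ x, data = d, distribution = "gaussian",
           n.trees = 50, shrinkage = 0.1, bag.fraction = 1)

new <- d[1, , drop = FALSE]

## cumulative predictions after 1, 2, ..., 50 trees (link scale)
cum <- sapply(1:50, function(k) predict(fit, new, n.trees = k))

## per-tree contributions are the successive differences;
## the initial constant plus their sum reproduces the fitted value
tree_contrib <- diff(c(fit$initF, cum))
all.equal(fit$initF + sum(tree_contrib), cum[50])   # TRUE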
2013 Mar 24
3
Parallelizing GBM
Dear All, I am far from being a guru about parallel programming. Most of the time, I rely on randomForest for data mining large datasets. I would like to give the gradient boosted methods in GBM a try as well, but I need parallelization. I normally rely on gbm.fit for speed reasons, and I usually call it this way: gbm_model <- gbm.fit(trainRF,prices_train, offset = NULL, misc =
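As far as I know, gbm.fit itself runs single-threaded; the only built-in parallelism in more recent gbm versions is across cross-validation folds, via the n.cores argument of gbm() (an assumption worth checking against the installed version). A sketch, reusing the object names from the post:

library(gbm)

## trainRF holds the predictors, prices_train the response (names from the post)
gbm_model <- gbm(prices_train ~ ., data = data.frame(trainRF, prices_train),
                 distribution = "gaussian",
                 n.trees = 2000, shrinkage = 0.05, interaction.depth = 4,
                 cv.folds = 5,
                 n.cores = 4)   # CV folds are farmed out to 4 workers

best_iter <- gbm.perf(gbm_model, method = "cv")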
2018 Feb 19
3
gbm.step for non-binary classification
Hello again. I forgot to mention the main reason for using gbm.step from the dismo package. As you know, boosted models do overfit (unlike random forests or any other bootstrap method), but gbm.step performs cross-validation to determine the optimal number of trees and avoid that. It is essential. The option I have left, Carlos, is to do it with gbm, but many times over, and use the
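For comparison, doing the same thing with gbm alone would look roughly like the sketch below (dat and y are placeholders): cv.folds plus gbm.perf(method = "cv") plays the role that gbm.step's internal cross-validation plays in choosing the optimal number of trees.

library(gbm)

fit <- gbm(y ~ ., data = dat, distribution = "bernoulli",
           n.trees = 5000, shrinkage = 0.01, interaction.depth = 3,
           cv.folds = 10)

## cross-validated estimate of the optimal number of trees,
## which is how overfitting from too many iterations is avoided
best_iter <- gbm.perf(fit, method = "cv")
pred <- predict(fit, newdata = dat, n.trees = best_iter, type = "response")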
2005 Feb 18
2
gbm
Hi there: I keep running into scalability limits with some R packages. This time, I am trying gbm to do AdaBoost-style boosting on my project. Initially I tried to grow trees using rpart on a dataset with 200 variables and 30,000 observations. Now I am wondering if I can apply adaboosting to it, and whether anyone here has done something similar before and can provide some sample code. Also any
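A minimal sketch of AdaBoost-style boosting with gbm (the data frame dat and 0/1 response y are placeholders for the 30,000 x 200 dataset described above):

library(gbm)

fit <- gbm(y ~ ., data = dat,
           distribution = "adaboost",        # AdaBoost exponential loss
           n.trees = 1000, shrinkage = 0.05,
           interaction.depth = 2, bag.fraction = 0.5,
           train.fraction = 0.8)             # hold out 20% for gbm.perf

best_iter <- gbm.perf(fit, method = "test")
p <- predict(fit, newdata = dat, n.trees = best_iter, type = "response")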
2010 Jun 23
1
gbm function
Hello, I have questions about the gbm package. It seems we have to divide the data into two parts (training set and test set) first: 1- training set for running the gbm function, 2- test set for gbm.perf. Is that right? I have 123 samples, which I divided into 100 for training and 23 for testing. So, what is the cv.folds parameter in the gbm function for? Thanks a lot, Azam
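A sketch of the two alternatives (dat and y are placeholders): a manual train/test split is not required, because gbm can either hold out a fraction of the data itself or, with cv.folds, run k-fold cross-validation on all 123 samples, which is usually preferable for a dataset this small.

library(gbm)

## option 1: let gbm hold out data itself (no manual 100/23 split needed);
## the first 80% of rows are used for fitting, the last 20% as the test set
fit1 <- gbm(y ~ ., data = dat, distribution = "bernoulli",
            n.trees = 3000, shrinkage = 0.01, train.fraction = 0.8)
best1 <- gbm.perf(fit1, method = "test")

## option 2: cv.folds runs 10-fold cross-validation on the full data
fit2 <- gbm(y ~ ., data = dat, distribution = "bernoulli",
            n.trees = 3000, shrinkage = 0.01, cv.folds = 10)
best2 <- gbm.perf(fit2, method = "cv")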
2012 Jul 23
1
mboost vs gbm
I'm attempting to fit boosted regression trees to a censored response using IPCW weighting. I've implemented this through two libraries, mboost and gbm, which I believe should yield models that would perform comparably. This, however, is not the case - mboost performs much better. This seems odd. This issue is meaningful since the output of this regression needs to be implemented in a
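For what it's worth, the two fits are only comparable if the loss and the main tuning parameters line up; a rough sketch of nominally equivalent weighted squared-error fits (dat is a placeholder data frame and w the vector of IPCW weights):

library(gbm)
library(mboost)

## gbm: squared-error boosting with observation weights
fit_gbm <- gbm(y ~ ., data = dat, weights = w,
               distribution = "gaussian",
               n.trees = 500, shrinkage = 0.1, interaction.depth = 3)

## mboost: tree-based boosting with the same loss and learning rate;
## mstop and nu correspond to n.trees and shrinkage
fit_mboost <- blackboost(y ~ ., data = dat, weights = w,
                         family = Gaussian(),
                         control = boost_control(mstop = 500, nu = 0.1))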
2018 Feb 19
3
gbm.step for non-binary classification
Thanks Carlos. As far as I understand, there are: the family argument can be "gaussian" (for minimizing squared error), so the response has to be numeric; "bernoulli" (logistic regression for 0-1 outcomes), necessarily binary; "poisson" (count outcomes; requires the response to be a positive integer), so numeric as well. The only one that could be
2008 Sep 22
1
gbm error
Good afternoon. Has anyone tried using Dr. Elith's BRT script? I cannot seem to run gbm.step from the installed gbm package. Is it something external to gbm? When I run the script itself: <- gbm.step(data=model.data, gbm.x = colx:coly, gbm.y = colz, family = "bernoulli", tree.complexity = 5, learning.rate = 0.01, bag.fraction = 0.5) ... I
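gbm.step is indeed external to gbm: it is provided by the dismo package (it originated in the BRT tutorial script of Elith, Leathwick & Hastie), so dismo has to be loaded, or the script sourced, before the call works. A sketch reusing the placeholder names from the post (model.data, colx, coly, colz):

library(gbm)
library(dismo)   # this is where gbm.step lives

brt_fit <- gbm.step(data = model.data,
                    gbm.x = colx:coly,     # predictor columns
                    gbm.y = colz,          # response column
                    family = "bernoulli",
                    tree.complexity = 5,
                    learning.rate = 0.01,
                    bag.fraction = 0.5)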
2010 Sep 21
1
package gbm, predict.gbm with offset
Dear all, the help file for predict.gbm states that "The predictions from gbm do not include the offset term. The user may add the value of the offset to the predicted value if desired." I am just not sure how exactly, especially for a Poisson model, where I believe the offset is multiplicative? For example: library(MASS) fit1 <- glm(Claims ~ District + Group + Age +
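For a Poisson fit with a log link the offset is additive on the link scale, which is the same as multiplicative on the response scale. A sketch using the MASS Insurance data from the post (the gbm call itself is illustrative, not the poster's original):

library(MASS)   # Insurance data
library(gbm)

data(Insurance)

fit <- gbm(Claims ~ District + Group + Age + offset(log(Holders)),
           data = Insurance, distribution = "poisson",
           n.trees = 1000, shrinkage = 0.01)

## predict.gbm drops the offset, so add it back on the link (log) scale ...
eta <- predict(fit, newdata = Insurance, n.trees = 1000, type = "link")
mu  <- exp(eta + log(Insurance$Holders))

## ... which is the same as multiplying the response-scale prediction
## by the exposure
mu2 <- predict(fit, newdata = Insurance, n.trees = 1000, type = "response") *
       Insurance$Holders
all.equal(mu, mu2)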
2009 Oct 30
1
possible memory leak in predict.gbm(), package gbm ?
Dear gbm users, When running predict.gbm() on a "large" dataset (150,000 rows, 300 columns, 500 trees), I notice that the memory used by R grows beyond reasonable limits. My 14GB of RAM are often not sufficient. I am interpreting this as a memory leak, since there should be no reason to expand memory once the data are loaded and passed to predict.gbm()? Running R version 2.9.2 on
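Whether or not it is a genuine leak, a common workaround is to score the data in chunks so that only a slice of the 150,000 rows is held in the prediction step at any time; a sketch (fit and big_new are placeholders for the model and the new data):

library(gbm)

chunk_size <- 10000
idx <- split(seq_len(nrow(big_new)),
             ceiling(seq_len(nrow(big_new)) / chunk_size))

preds <- unlist(lapply(idx, function(i) {
  p <- predict(fit, newdata = big_new[i, , drop = FALSE],
               n.trees = 500, type = "response")
  gc()   # return memory between chunks
  p
}))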
2009 Jun 17
1
gbm for cost-sensitive binary classification?
I recently used gbm for a binary classification problem. As expected, it gets very good results, based on area under the ROC curve with 7-fold cross validation. However, the application (malware detection) is cost-sensitive: getting a FP (classifying a clean sample as a dirty one) is much worse than getting a FN (missing a dirty sample). I would like to tune the gbm model to be biased toward a very low FP rate. For this
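Two common ways to push a gbm fit toward a low FP rate, sketched on placeholder data (dat with 0/1 response y, 1 = dirty): up-weight the clean class through the weights argument, and/or simply raise the probability threshold used to flag a sample.

library(gbm)

## make false positives cost more than false negatives
w <- ifelse(dat$y == 0, 10, 1)

fit <- gbm(y ~ ., data = dat, weights = w,
           distribution = "bernoulli",
           n.trees = 2000, shrinkage = 0.01, cv.folds = 5)
best_iter <- gbm.perf(fit, method = "cv")

## pick a threshold that keeps the FP rate on clean samples near 1%
p <- predict(fit, newdata = dat, n.trees = best_iter, type = "response")
thr <- quantile(p[dat$y == 0], 0.99)
pred_class <- as.integer(p > thr)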
2012 Apr 25
1
Question about NV18 and GBM library.
Hi, I have a GeForce4 MX 440 AGP 8x, and I'm trying to use the GBM library (as jbarnes does in: http://virtuousgeek.org/blog/index.php/jbarnes/2011/10/ and David Herrmann in KMSCON: https://github.com/dvdhrm/kmscon), without success. When I try to create a gbm_device, I get the following (code below): nouveau_drm_screen_create: unknown chipset nv18 dri_init_screen_helper: failed to create pipe_screen
2012 Apr 16
1
Can't install package gbm, because packageVersion is not an exported object from namespace::Utils
I'm running R 2.11.1 on 64 bit Debian. I've had no problem installing any other CRAN packages, but installing package "gbm" fails due to: *** installing help indices ** building package indices ... ** testing if installed package can be loaded Error : .onAttach failed in attachNamespace() for 'gbm', details: call: NULL error: 'packageVersion' is not an
2010 May 21
1
Question regarding GBM package
Dear R expert, I have come across the GBM package for R and it seemed appropriate for my research. I am trying to predict the number of FPGA resources required by a software function if it were mapped onto hardware. As input I use software metrics (a lot of them). I already use several regression techniques, and the graphs I produce with GBM look promising. Now my question... I see that the
2011 Feb 26
2
Reproducibility issue in gbm (32 vs 64 bit)
Dear List, The gbm package on Win 7 produces different results for the relative importance of input variables in R 32-bit relative to R 64-bit. Any idea why? Any idea which one is correct? Based on this example, it looks like the relative importance of 2 perfectly correlated predictors is "diluted" by half in 32-bit, whereas in 64-bit, one of these predictors gets all the importance
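A small, seeded test case one could run under both builds to compare the relative influence output being discussed (all names here are made up):

library(gbm)

set.seed(123)                 # same seed under 32-bit and 64-bit R
n  <- 1000
x1 <- rnorm(n)
x2 <- x1                      # perfectly correlated copy of x1
y  <- x1 + rnorm(n, sd = 0.5)
dat <- data.frame(y, x1, x2)

fit <- gbm(y ~ x1 + x2, data = dat, distribution = "gaussian",
           n.trees = 500, shrinkage = 0.05, bag.fraction = 1)

## relative influence of each predictor; comparing this table between
## the two builds is the reproducibility check described above
summary(fit, n.trees = 500, plotit = FALSE)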
2010 Feb 28
1
Gradient Boosting Trees with correlated predictors in gbm
Dear R users, I’m trying to understand how correlated predictors affect the relative importance measure in stochastic gradient boosted trees (J. Friedman). As Friedman described: “…with single decision trees (referring to Breiman’s CART algorithm), the relative importance measure is augmented by a strategy involving surrogate splits intended to uncover the masking of influential variables by others
2003 Jul 14
0
package announcement: Generalized Boosted Models (gbm)
Generalized Boosted Models (gbm) This package implements extensions to Y. Freund and R. Schapire's AdaBoost algorithm and J. Friedman's gradient boosting machine (aka multiple additive regression trees, MART). It includes regression methods for least squares, absolute loss, logistic, Poisson, Cox proportional hazards/partial likelihood, and the AdaBoost exponential loss. It handles
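A short illustration of how the loss functions listed above are selected through the distribution argument (toy data, default tuning parameters):

library(gbm)

set.seed(1)
d <- data.frame(x = rnorm(300))
d$y      <- d$x + rnorm(300)             # continuous response
d$y01    <- as.integer(d$y > 0)          # 0/1 response
d$counts <- rpois(300, exp(0.5 * d$x))   # count response

fit_ls   <- gbm(y ~ x,      data = d, distribution = "gaussian")   # least squares
fit_lad  <- gbm(y ~ x,      data = d, distribution = "laplace")    # absolute loss
fit_bin  <- gbm(y01 ~ x,    data = d, distribution = "bernoulli")  # logistic
fit_ada  <- gbm(y01 ~ x,    data = d, distribution = "adaboost")   # AdaBoost loss
fit_pois <- gbm(counts ~ x, data = d, distribution = "poisson")    # Poisson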