similar to: gbm

Displaying 20 results from an estimated 10000 matches similar to: "gbm"

2010 Apr 26
3
R.GBM package
Hi, Dear Greg, I am new to the GBM package. Can boosted decision trees be implemented with the 'gbm' package, or can 'gbm' only be used for regression? If they can, do I need to combine the rpart and gbm commands? Thanks so much! -- Sincerely, Changbin
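A minimal sketch of what this thread is asking about: gbm grows its own trees, so no rpart call is needed for classification. The data frame dat and 0/1 response y below are hypothetical placeholders.

library(gbm)
# hypothetical data frame 'dat' with a 0/1 response column 'y'
fit <- gbm(y ~ ., data = dat, distribution = "bernoulli",
           n.trees = 1000, interaction.depth = 3, shrinkage = 0.01)
# predicted class probabilities
p <- predict(fit, newdata = dat, n.trees = 1000, type = "response")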
2010 Feb 28
1
Gradient Boosting Trees with correlated predictors in gbm
Dear R users, I’m trying to understand how correlated predictors impact the Relative Importance measure in Stochastic Boosting Trees (J. Friedman). As Friedman described, “…with single decision trees (referring to Breiman’s CART algorithm), the relative importance measure is augmented by a strategy involving surrogate splits intended to uncover the masking of influential variables by others
2013 Mar 24
3
Parallelizing GBM
Dear All, I am far from being a guru about parallel programming. Most of the time, I rely on randomForest for data mining large datasets. I would also like to try the gradient boosted methods in GBM, but I need parallelization. I normally rely on gbm.fit for speed reasons, and I usually call it this way: gbm_model <- gbm.fit(trainRF, prices_train, offset = NULL, misc =
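A single gbm.fit call is not parallelized internally; one common workaround is to run independent fits (for example over a small tuning grid, or over CV folds) in parallel workers. A sketch assuming the poster's trainRF (predictors) and prices_train (numeric response); recent gbm() releases also expose an n.cores argument for parallel cross-validation, if that is all that is needed.

library(gbm)
library(parallel)
# hypothetical tuning grid; each fit runs in its own forked worker (Unix-alike)
depths <- c(2, 4, 6)
fits <- mclapply(depths, function(d) {
  gbm.fit(x = trainRF, y = prices_train, distribution = "gaussian",
          n.trees = 2000, interaction.depth = d, shrinkage = 0.01,
          verbose = FALSE)
}, mc.cores = length(depths))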
2007 Apr 03
1
treenet
Hi, Has anybody used treenet here? I downloaded a demo but don't know how to get started with it. Does R have something like treenet? Thanks,
2010 May 01
1
bag.fraction in gbm package
Hi, Dear Greg, Sorry to bother you again. I have several questions about the 'gbm' package. If train.fraction is less than 1 (e.g. 0.5), then the *first* 50% of the data will be used to fit the model, and the other 50% can be used to estimate the performance. If bag.fraction is 0.5, then gbm uses a *random* 50% of the data to fit the model, and the other 50% of the data is used to estimate the
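The distinction described here can be seen directly in a call: train.fraction uses the first fraction of the rows as supplied for fitting and holds out the rest, while bag.fraction randomly subsamples the training rows afresh for each individual tree. A sketch with a hypothetical data frame dat and 0/1 response y:

library(gbm)
fit <- gbm(y ~ ., data = dat, distribution = "bernoulli",
           n.trees = 2000, shrinkage = 0.01,
           train.fraction = 0.5,  # first 50% of rows used for fitting, rest held out
           bag.fraction   = 0.5)  # each tree sees a fresh random 50% of the training rows
best <- gbm.perf(fit, method = "test")   # iteration minimizing held-out deviance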
2011 May 24
1
gbm package: plotting a single tree
Hello, I'm not sure if I'm posting this in the right place; my apologies if not. I'm using the gbm package to generate boosted tree models, and was wondering if there is a simple way of getting a graphical output for a single tree of the sequence. I know the function "pretty.gbm.tree" can be used to print information for a single tree, but I've been unable to find a way to
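pretty.gbm.tree() returns a data frame rather than a plot; turning its LeftNode/RightNode columns into a drawn tree needs an external graph package and is not shown here. A sketch of extracting one tree, assuming an existing fitted gbm object fit:

library(gbm)
tree1 <- pretty.gbm.tree(fit, i.tree = 1)
head(tree1)      # columns include SplitVar, SplitCodePred, LeftNode, RightNode
fit$var.names    # map the integer SplitVar codes back to variable names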
2005 Jul 07
2
randomForest
> From: Weiwei Shi
> it works. thanks,
> but: (just curious) why i tried previously and i got
> > is.vector(sample.size)
> [1] TRUE
Because a list is also a vector:
> a <- c(list(1), list(2))
> a
[[1]]
[1] 1
[[2]]
[1] 2
> is.vector(a)
[1] TRUE
> is.numeric(a)
[1] FALSE
Actually, the way I initialize a list of known length is by
2018 Feb 19
3
gbm.step for non-binary classification
Hello again. I forgot to mention the main reason for using gbm.step from the dismo package. As you know, boosted models do overfit (unlike random forests or any other bootstrap approach), but gbm.step runs cross-validation to determine the optimal number of trees and avoid this. That is essential. The option I have left, Carlos, is to do it with gbm, but many times, and use the
2005 Jan 25
3
multi-class classification using rpart
Hi, I am trying to build a multi-class classification tree using rpart. I used the fgl data from the MASS package to test it, and it works well. However, when I use my small-sample data below, the program seems to take forever. I am not sure if it is due to slowness, or whether there is something wrong with my code or data manipulation. Please advise! The data is described as the output from str()
2005 Feb 18
2
gbm
Hi, there: I keep running into scalability limits with some R packages. This time, I am trying gbm to do AdaBoost-style boosting on my project. Initially I tried to grow trees using rpart on a dataset with 200 variables and 30,000 observations. Now I am wondering whether I can apply AdaBoost to it. I am wondering if anyone here has done something similar before and can provide some sample code. Also any
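gbm exposes AdaBoost-style (exponential-loss) boosting through its distribution argument, so no rpart call is required. A sketch assuming a hypothetical data frame dat of roughly 30,000 rows and 200 predictors with a 0/1 response y; subsampling via bag.fraction and a moderate interaction.depth keep time and memory manageable.

library(gbm)
fit <- gbm(y ~ ., data = dat, distribution = "adaboost",
           n.trees = 1000, interaction.depth = 4,
           shrinkage = 0.05, bag.fraction = 0.5, verbose = FALSE)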
2010 Jun 15
1
output from the gbm package
Hi, Dear Greg and R community, I have one question about the output of the gbm package. The output of boosting should be f(x); from it, how do I calculate the probability for each observation in the data set? Since it is stochastic, how can we guarantee that each observation in the training data is selected at least once? If some observations are not selected, how is the training error calculated? Thanks!
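For distribution = "bernoulli", f(x) is on the log-odds scale, so the probability is 1/(1+exp(-f)); predict() can also do the conversion directly. A sketch assuming a fitted bernoulli gbm object fit and a hypothetical data frame dat:

library(gbm)
f  <- predict(fit, newdata = dat, n.trees = fit$n.trees, type = "link")      # f(x), log-odds
p  <- plogis(f)                                                              # manual inverse-logit
p2 <- predict(fit, newdata = dat, n.trees = fit$n.trees, type = "response")  # same result
all.equal(p, p2)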
2007 Jan 04
3
randomForest and missing data
Does anyone know a reason why, in principle, a call to randomForest cannot accept a data frame with missing predictor values? If each individual tree is built using CART, then it seems like this should be possible. (I understand that one may impute missing values using rfImpute or some other method, but I would like to avoid doing that.) If this functionality were available, then when the trees
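randomForest() itself stops on NAs in the predictors; besides rfImpute, a lightweight alternative is na.roughfix (median/mode fill), shown here as a sketch on a hypothetical data frame dat with factor response y.

library(randomForest)
fit <- randomForest(y ~ ., data = dat, na.action = na.roughfix, ntree = 500)
# or fill the NAs up front and keep the imputed data for inspection
dat_filled <- na.roughfix(dat)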
2003 Jul 23
3
Boosting, bagging and bumping. Questions about R tools and predictions.
I'm interested in further understanding the differences in using many classification trees to improve classification rates. I'm also interested in finding out what I can do in R and which methods will allow prediction. Can anybody point me to a citation or discussion? Specifically, I want to classify remotely sensed imagery where training data is extracted on class membership by the user.
2018 Feb 19
3
gbm.step for non-binary classification
Thanks, Carlos. As far as I understand, yes, there are: the family argument can be "gaussian" (for minimizing squared error), so the response has to be numeric; "bernoulli" (logistic regression for 0-1 outcomes), necessarily binary; "poisson" (count outcomes; requires the response to be a positive integer), so numeric as well. The only one that could be
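gbm.step (dismo) is indeed limited to the families listed above. One fall-back for a multi-class response, with the caveat that multinomial support has been reported as unreliable in newer gbm releases, is plain gbm with distribution = "multinomial" and its built-in cv.folds to pick the number of trees. A sketch with a hypothetical data frame dat and factor response clase:

library(gbm)
fit  <- gbm(clase ~ ., data = dat, distribution = "multinomial",
            n.trees = 2000, interaction.depth = 3, shrinkage = 0.01, cv.folds = 5)
best <- gbm.perf(fit, method = "cv")                                   # CV-optimal tree count
p    <- predict(fit, newdata = dat, n.trees = best, type = "response") # array: obs x classes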
2007 Nov 26
1
anyway to force rpart() to include a specific predictor
If I understand correctly, rpart() will pick the predictor at each node automatically. I am wondering if there is a way to force rpart() to include a specific predictor. The reason I am asking is that I'd like to use rpart() to detect interaction terms for some variables. Thanks.
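rpart() has no hard "must use this variable" switch, but its cost argument scales each variable's split improvement, so giving the variable of interest a low cost strongly tilts splits toward it. A sketch on a hypothetical data frame dat with response y and favoured predictor x1; the cost vector must follow the order of the predictors in the model.

library(rpart)
vars  <- setdiff(names(dat), "y")
costs <- ifelse(vars == "x1", 0.1, 1)   # lower cost => splits on that variable look better
fit <- rpart(y ~ ., data = dat, method = "class", cost = costs)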
2006 Dec 28
3
CV by rpart/mvpart
Dear R-list, I am using the rpart/mvpart package for selecting a right-sized regression tree by 10-fold cross-validation. My question: is there a way to find out, for every observation, which of the ten folds it lies in? I want to use the same folds for validating another regression method (moving averages) in order to choose the better one. Thanks a lot, Pedro
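rpart's internal xval folds are not easy to recover per observation; a common workaround is to build the fold assignments yourself and reuse them for both methods. A sketch assuming a hypothetical data frame dat with response y:

set.seed(1)
k <- 10
fold <- sample(rep(1:k, length.out = nrow(dat)))   # reusable fold labels per observation
library(rpart)
cv_pred <- numeric(nrow(dat))
for (i in 1:k) {
  fit <- rpart(y ~ ., data = dat[fold != i, ])
  cv_pred[fold == i] <- predict(fit, newdata = dat[fold == i, ])
}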
2008 Sep 22
1
gbm error
Good afternoon. Has anyone tried using Dr. Elith's BRT script? I cannot seem to run gbm.step from the installed gbm package. Is it something external to gbm? When I run the script itself: <- gbm.step(data=model.data, gbm.x = colx:coly, gbm.y = colz, family = "bernoulli", tree.complexity = 5, learning.rate = 0.01, bag.fraction = 0.5) ... I
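The likely issue: gbm.step() is not part of the gbm package. It was originally distributed with Elith & Leathwick's BRT tutorial script (which had to be source()d) and now ships with the dismo package, which calls gbm internally. A sketch reusing the poster's own placeholder objects (model.data, colx, coly, colz):

library(dismo)   # provides gbm.step; it is not in the gbm package itself
library(gbm)
m <- gbm.step(data = model.data, gbm.x = colx:coly, gbm.y = colz,
              family = "bernoulli", tree.complexity = 5,
              learning.rate = 0.01, bag.fraction = 0.5)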
2013 Jun 23
1
Which is the final model for a Boosted Regression Trees (GBM)?
Hi R users, I was trying to find the final model in the following example using boosted regression trees (GBM). The program gives the fitted values, but I wanted to calculate the fitted values by hand to understand it in depth. Would you give me some hints on what the final model is for this example? Thanks, KG ------- The following script I used #----------------------- library(dismo)
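The "final model" is additive: an intercept (initF, on the link scale) plus the shrunken contributions of all trees, passed through the inverse link (plogis for bernoulli). A sketch assuming a fitted bernoulli gbm/gbm.step object m and a hypothetical data frame dat:

library(gbm)
f <- predict(m, newdata = dat, n.trees = m$n.trees, type = "link")
p <- plogis(f)                   # fitted probabilities = inverse-logit of the additive score
m$initF                          # the intercept the trees are added to, on the link scale
pretty.gbm.tree(m, i.tree = 1)   # inspect the first tree's split and node predictions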
2007 Apr 23
6
Random Forest
Hi, I am trying to print out my confusion matrix after having created my random forest. I have put in this command: fit <- randomForest(MMS_ENABLED_HANDSET~., data=dat, ntree=500, mtry=14, na.action=na.omit, confusion=TRUE) but I can't get it to give me the confusion matrix. Anyone know how this works? Thanks! Ruben
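randomForest() has no confusion= argument; for a classification forest the out-of-bag confusion matrix is stored on the fitted object, and the response must be a factor for classification to be used at all. A sketch reusing the poster's call:

library(randomForest)
dat$MMS_ENABLED_HANDSET <- as.factor(dat$MMS_ENABLED_HANDSET)  # ensure classification, not regression
fit <- randomForest(MMS_ENABLED_HANDSET ~ ., data = dat,
                    ntree = 500, mtry = 14, na.action = na.omit)
fit$confusion   # OOB confusion matrix
print(fit)      # also prints it, along with the OOB error rate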
2010 Jun 23
1
gbm function
Hello, I have questions about the gbm package. It seems we have to divide the data into two parts (training set and test set) first: 1- training set for running the gbm function, 2- test set for gbm.perf. Is that right? I have 123 samples, which I divided into 100 for training and 23 for testing. So, what is the cv.folds parameter of the gbm function for? Thanks a lot, Azam
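cv.folds lets gbm do the splitting internally via k-fold cross-validation, so a separate 100/23 split is not required just to choose the number of trees; gbm.perf(method = "cv") then reads the optimal iteration off the CV curve. A sketch assuming a hypothetical data frame dat of 123 rows with a 0/1 response y:

library(gbm)
fit  <- gbm(y ~ ., data = dat, distribution = "bernoulli",
            n.trees = 3000, shrinkage = 0.01, cv.folds = 5)
best <- gbm.perf(fit, method = "cv")                                  # trees chosen by 5-fold CV
p    <- predict(fit, newdata = dat, n.trees = best, type = "response")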