similar to: help! Error in using Boosting...

Displaying 20 results from an estimated 1000 matches similar to: "help! Error in using Boosting..."

2006 May 27
2
boosting - second posting
Hi, I am using boosting for a classification and prediction problem. For some reason it is giving me an outcome that doesn't fall between 0 and 1 for the predictions. I have tried type="response" but it made no difference. Can anyone see what I am doing wrong? Screen output shown below: > boost.model <- gbm(as.factor(train$simNuance) ~ ., # formula +
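A likely cause, offered as a sketch rather than taken from the thread: gbm's "bernoulli" distribution expects a numeric 0/1 response, not a factor, and probabilities come from predict() with type = "response" and an explicit n.trees. Object and column names follow the snippet above.

    library(gbm)
    # response must be numeric 0/1 for distribution = "bernoulli", not a factor
    train$simNuance <- as.numeric(as.character(train$simNuance))
    boost.model <- gbm(simNuance ~ ., data = train,
                       distribution = "bernoulli",
                       n.trees = 1000, interaction.depth = 3, shrinkage = 0.01)
    # probabilities in [0, 1]; predict.gbm needs n.trees spelled out
    p <- predict(boost.model, newdata = train, n.trees = 1000, type = "response")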
2009 Jun 17
1
gbm for cost-sensitive binary classification?
I recently used gbm for a binary classification problem. As expected, it gets very good results, based on area under the ROC curve with 7-fold cross validation. However, the application (malware detection) is cost-sensitive: getting an FP (classifying a clean sample as a dirty one) is much worse than getting an FN (missing a dirty sample). I would like to tune the gbm model to be biased toward a very low FP rate. For this
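Two common levers for this, sketched rather than prescribed: observation weights that make false positives expensive, and a stricter probability threshold. The data frame d and the 0/1 column dirty are hypothetical.

    library(gbm)
    w <- ifelse(d$dirty == 1, 1, 5)            # a misclassified clean sample costs 5x
    fit <- gbm(dirty ~ ., data = d, distribution = "bernoulli",
               weights = w, n.trees = 2000, shrinkage = 0.01, cv.folds = 7)
    best <- gbm.perf(fit, method = "cv")       # CV-chosen number of trees
    p <- predict(fit, newdata = d, n.trees = best, type = "response")
    pred <- as.integer(p > 0.9)                # raise the cutoff to trade FPs for FNs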
2010 Apr 26
3
R.GBM package
Hi Greg, I am new to the gbm package. Can boosted decision trees be implemented with the 'gbm' package, or can 'gbm' only be used for regression? If they can, do I need to combine the rpart and gbm commands? Thanks so much! -- Sincerely, Changbin
2013 Jun 23
1
Which is the final model for a Boosted Regression Trees (GBM)?
Hi R users, I was trying to find the final model in the following example using boosted regression trees (GBM). The program gives the fitted values, but I wanted to calculate the fitted values by hand to understand them in depth. Would you give me some hints on what the final model is for this example? Thanks KG ------- The following is the script I used #----------------------- library(dismo)
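For checking fitted values by hand, a sketch assuming a fitted gbm object called fit (dismo's gbm.step returns one) and a hypothetical data frame dat: the boosted model is an additive sum of small trees, each inspectable with pretty.gbm.tree(), starting from the intercept fit$initF on the link scale.

    library(gbm)
    fit$initF                                 # initial value (intercept) on the link scale
    pretty.gbm.tree(fit, i.tree = 1)          # splits and node predictions of the first tree
    # compare a hand calculation against predictions that use only the first k trees
    predict(fit, newdata = dat, n.trees = 50)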
2010 Feb 28
1
Gradient Boosting Trees with correlated predictors in gbm
Dear R users, I’m trying to understand how correlated predictors impact the Relative Importance measure in Stochastic Boosting Trees (J. Friedman). As Friedman described, “ …with single decision trees (referring to Breiman’s CART algorithm), the relative importance measure is augmented by a strategy involving surrogate splits intended to uncover the masking of influential variables by others
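As a sketch of how the masking question can be probed in gbm itself (fit and best are hypothetical: a fitted model and its chosen number of trees), the package reports importance two ways, and comparing them on correlated predictors is often informative:

    library(gbm)
    summary(fit, n.trees = best, method = relative.influence)    # loss-reduction importance
    summary(fit, n.trees = best, method = permutation.test.gbm)  # permutation importance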
2006 May 25
0
boosting
Hi, I am using boosting for a classification and prediction problem. For some reason it is giving me an outcome that doesn't fall between 0 and 1 for the predictions. I have tried type="response" but it made no difference. Can anyone see what I am doing wrong? Screen output shown below: > boost.model <- gbm(as.factor(train$simNuance) ~ ., # formula +
2013 Mar 24
3
Parallelizing GBM
Dear All, I am far from being a guru about parallel programming. Most of the time, I rely on randomForest for data mining large datasets. I would also like to try the gradient boosted methods in GBM, but I need parallelization. I normally rely on gbm.fit for speed reasons, and I usually call it this way gbm_model <- gbm.fit(trainRF,prices_train, offset = NULL, misc =
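gbm.fit itself runs single-threaded, so one workable sketch (not from the thread) is to parallelize across fits, for example a small hyperparameter grid with foreach/doParallel; trainRF and prices_train are the objects named above.

    library(gbm); library(foreach); library(doParallel)
    cl <- makeCluster(4); registerDoParallel(cl)
    grid <- expand.grid(depth = c(2, 4, 6), shrinkage = c(0.01, 0.05))
    models <- foreach(i = seq_len(nrow(grid)), .packages = "gbm") %dopar% {
      gbm.fit(x = trainRF, y = prices_train, distribution = "gaussian",
              n.trees = 1000, interaction.depth = grid$depth[i],
              shrinkage = grid$shrinkage[i], verbose = FALSE)
    }
    stopCluster(cl)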
2014 Jul 02
0
How do I call a C++ function (for k-means) within R?
I am trying to call a C++ k-means function within R and I am struggling. I know that the below code is used to call a C++ function for gbm but how do I do it for k-means? gbm.obj <- .Call("gbm", Y=as.double(y), Offset=as.double(offset), X=as.double(x), X.order=as.integer(x.order),
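A hedged alternative to hand-written .Call glue: Rcpp::cppFunction compiles and binds a C++ function in one step. The function below is a trivial stand-in, not a k-means implementation.

    library(Rcpp)
    cppFunction('
    NumericVector scale_vec(NumericVector x, double k) {
      // stand-in body; a real k-means routine would go here
      return x * k;
    }')
    scale_vec(c(1, 2, 3), 2)   # returns 2 4 6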
2017 Dec 14
0
Distributions for gbm models
On page 409 of "Applied Predictive Modeling" by Max Kuhn, it states that the gbm function can accommodate only two-class problems when referring to the distribution parameter. From the gbm help on the distribution parameter: Currently available options are "gaussian" (squared error), "laplace" (absolute loss), "tdist" (t-distribution
2008 Mar 05
0
Using tune with gbm --grid search for best hyperparameters
Hello LIST, I'd like to use tune from e1071 to do a grid search for hyperparameter values in gbm. However, I cannot get this to work. I note that there is no wrapper for gbm but that it is possible to use non-wrapped functions (like lm) without problem. Here's a snippet of code to illustrate. > data(mtcars) obj <- >
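One route around the missing wrapper, sketched under the assumption of a reasonably recent caret rather than e1071::tune: caret::train has a gbm method and performs the grid search itself (the tuneGrid column names are caret's; n.minobsinnode is required in newer versions).

    library(caret)
    data(mtcars)
    grid <- expand.grid(n.trees = c(100, 500), interaction.depth = c(1, 3),
                        shrinkage = 0.05, n.minobsinnode = 10)
    obj <- train(mpg ~ ., data = mtcars, method = "gbm",
                 tuneGrid = grid, verbose = FALSE,
                 trControl = trainControl(method = "cv", number = 5))
    obj$bestTune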
2008 Sep 18
1
caret package: arguments passed to the classification or regression routine
Hi, I am having problems passing arguments to method="gbm" using the train() function. I would like to train gbm using the laplace distribution or the quantile distribution. here is the code I used and the error: gbm.test <- train(x.enet, y.matrix[,7], method="gbm", distribution=list(name="quantile",alpha=0.5), verbose=FALSE,
2008 Sep 22
1
gbm error
Good afternoon. Has anyone tried using Dr. Elith's BRT script? I cannot seem to run gbm.step from the installed gbm package. Is it something external to gbm? When I run the script itself <- gbm.step(data=model.data, gbm.x = colx:coly, gbm.y = colz, family = "bernoulli", tree.complexity = 5, learning.rate = 0.01, bag.fraction = 0.5) ... I
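A sketch of the usual fix: gbm.step() is not part of gbm but of the dismo package (Elith et al.'s BRT functions), which has to be installed and attached; the arguments below are copied from the call in the post.

    install.packages("dismo")    # once
    library(dismo); library(gbm)
    brt <- gbm.step(data = model.data, gbm.x = colx:coly, gbm.y = colz,
                    family = "bernoulli", tree.complexity = 5,
                    learning.rate = 0.01, bag.fraction = 0.5)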
2005 Apr 25
1
Failed to install gbm_1.4-2 (PR#7814)
Full_Name: The Manager Version: 2.0.1 OS: Solaris 9 Submission from: (NULL) (129.67.80.243) > install.packages("gbm") trying URL `http://cran.uk.r-project.org/src/contrib/PACKAGES' Content type `text/plain; charset=ISO-8859-1' length 52975 bytes opened URL ================================================== downloaded 51Kb trying URL
2009 Jul 14
2
SOS! error in GLM logistic regression...
Hi all, Could anybody tell me what happened to my logistic regression in R? mylog=glm(mytraindata$V1 ~ ., data=mytraindata, family=binomial("logit")) It generated the following error message: Error in model.frame.default(Terms, newdata, na.action = na.action, xlev = object$xlevels) : factor 'state1' has new level(s) AP Thank you!
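A sketch of what this message usually means, assuming the error arose while predicting on a data frame (called newdata here, a hypothetical name) that contains a level of state1 never seen in training: aligning the factor levels turns the unseen level into NA instead of an error.

    newdata$state1 <- factor(newdata$state1, levels = levels(mytraindata$state1))
    # rows whose level was never seen in training become NA and can be handled separately
    pred <- predict(mylog, newdata = newdata, type = "response")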
2018 Feb 19
2
3D graphics
Thanks Carlos, my idea is to build a cone, a cylinder or other geometric solids and then plot them. Any idea of how to start? Many thanks, as always. On Mon, 19 Feb 2018 15:06, <r-help-es-request at r-project.org> wrote: > Send R-help-es mailing list submissions to > r-help-es at r-project.org > > To subscribe or unsubscribe
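One possible starting point, sketched with base graphics: build the solid as a surface over a grid and draw it with persp(); the rgl package offers an interactive equivalent (persp3d).

    # a cone: z = sqrt(x^2 + y^2) over a square grid
    x <- y <- seq(-1, 1, length.out = 50)
    z <- outer(x, y, function(a, b) sqrt(a^2 + b^2))
    persp(x, y, z, theta = 30, phi = 25, col = "lightblue")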
2018 Feb 19
3
gbm.step for non-binary classification
Hello again. I forgot the main reason for using gbm.step from the dismo package. As you know, boosted models do overfit (unlike random forests or any other bootstrap method), but gbm.step runs cross-validation to determine the optimal number of trees and avoid that. It is essential. The option I have left, Carlos, is to do it with gbm, but many times, and use the
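A sketch of the plain-gbm route mentioned at the end (the data frame d and response y are hypothetical; y may be a factor): cv.folds plus gbm.perf() gives the same cross-validated choice of the number of trees that gbm.step automates, and "multinomial" covers the non-binary case.

    library(gbm)
    fit <- gbm(y ~ ., data = d, distribution = "multinomial",
               n.trees = 3000, shrinkage = 0.01, cv.folds = 5)
    best <- gbm.perf(fit, method = "cv")   # CV-optimal number of trees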
2009 Jul 07
2
Question in using e1071 svm routine
Hi all, I've got the following error message in using e1071 svm routine... Could anybody please help me? Thank you! --------------------------------- model <- svm(y=factor(mytraindata[, 1]), x=mytraindata[, -1], probability=T) Error in if (any(co)) { : missing value where TRUE/FALSE needed In addition: Warning message: In FUN(newX[, i], ...) : NAs introduced by coercion
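"NAs introduced by coercion" usually points at a feature column that is character or factor being forced to numeric; a quick check of the training frame (name taken from the call above) often finds the culprit.

    str(mytraindata)              # look for character/factor columns among the features
    colSums(is.na(mytraindata))   # and for NAs already present in the data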
2018 Feb 19
3
gbm.step for non-binary classification
Thanks Carlos. As far as I understand, there indeed are: the family argument can be "gaussian" (for minimizing squared error), so the response has to be numeric; "bernoulli" (logistic regression for 0-1 outcomes), necessarily binary; "poisson" (count outcomes; requires the response to be a positive integer), so numeric as well. The only one that could be
2010 Jul 10
4
eliminating constant variables
Hi all, I have a large data set and want to immediately build a 'blind' model without first examining the data. Now it appears that in the data there are a lot of fields that are constant or all missing values - which prevents the model from being built. Can someone point me in the right direction as to how I can automatically purge my data file of these useless fields? Thanks in advance, pdb
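A sketch of one way to purge such columns automatically (mydata is a hypothetical data frame; caret::nearZeroVar is a packaged alternative):

    keep <- sapply(mydata, function(col) {
      u <- unique(col[!is.na(col)])
      length(u) > 1                     # FALSE for all-NA and constant columns
    })
    mydata <- mydata[, keep, drop = FALSE]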
2012 Jul 23
1
mboost vs gbm
I'm attempting to fit boosted regression trees to a censored response using IPCW weighting. I've implemented this through two libraries, mboost and gbm, which I believe should yield models that would perform comparably. This, however, is not the case - mboost performs much better. This seems odd. This issue is meaningful since the output of this regression needs to be implemented in a