similar to: adaboost more two classes

Displaying 20 results from an estimated 50000 matches similar to: "adaboost more two classes"

2006 May 27
2
boosting - second posting
Hi I am using boosting for a classification and prediction problem. For some reason it is giving me an outcome that doesn't fall between 0 and 1 for the predictions. I have tried type="response" but it made no difference. Can anyone see what I am doing wrong? Screen output shown below: > boost.model <- gbm(as.factor(train$simNuance) ~ ., # formula +
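One likely explanation, sketched below rather than taken from the thread: with a factor response (or distribution = "adaboost"/"gaussian"), gbm predicts on the link scale, so values outside [0, 1] are expected. Fitting with distribution = "bernoulli" on a numeric 0/1 outcome and asking predict() for type = "response" gives probabilities. The names train and simNuance come from the post; the tuning values are placeholders.

library(gbm)

# recode the two-level outcome as 0/1; gbm's "bernoulli" loss wants a numeric response
train$simNuance01 <- as.numeric(factor(train$simNuance)) - 1

boost.model <- gbm(simNuance01 ~ . - simNuance, data = train,
                   distribution = "bernoulli",        # logistic loss
                   n.trees = 1000, interaction.depth = 3, shrinkage = 0.01)

# type = "response" returns probabilities in [0, 1];
# the default type = "link" returns log-odds, which can be any real number
p <- predict(boost.model, newdata = train, n.trees = 1000, type = "response")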
2006 May 25
0
boosting
Hi I am using boosting for a classification and prediction problem. For some reason it is giving me an outcome that doesn't fall between 0 and 1 for the predictions. I have tried type="response" but it made no difference. Can anyone see what I am doing wrong? Screen output shown below: > boost.model <- gbm(as.factor(train$simNuance) ~ ., # formula +
2017 Dec 14
0
Distributions for gbm models
On page 409 of "Applied Predictive Modeling" by Max Kuhn, it states that the gbm function can accommodate only two-class problems when referring to the distribution parameter. From the gbm help re: the distribution parameter: Currently available options are "gaussian" (squared error), "laplace" (absolute loss), "tdist" (t-distribution
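For what it's worth, later releases of gbm also list a "multinomial" option for more than two classes (it has been marked as experimental or broken in some versions, so treat the following as a hedged sketch rather than a guaranteed recipe):

library(gbm)

# multi-class fit on iris; check the help page of your gbm version first
fit <- gbm(Species ~ ., data = iris, distribution = "multinomial",
           n.trees = 500, interaction.depth = 2, shrinkage = 0.05)

# predict() returns an n x nclass x 1 array of class probabilities
pr <- predict(fit, newdata = iris, n.trees = 500, type = "response")
pred.class <- colnames(pr)[apply(pr[, , 1], 1, which.max)]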
2003 Jul 14
0
package announcement: Generalized Boosted Models (gbm)
Generalized Boosted Models (gbm) This package implements extensions to Y. Freund and R. Schapire's AdaBoost algorithm and J. Friedman's gradient boosting machine (aka multiple additive regression trees, MART). It includes regression methods for least squares, absolute loss, logistic, Poisson, Cox proportional hazards/partial likelihood, and the AdaBoost exponential loss. It handles
2003 Jul 14
0
package announcement: Generalized Boosted Models (gbm)
Generalized Boosted Models (gbm) This package implements extensions to Y. Freund and R. Schapire's AdaBoost algorithm and J. Friedman's gradient boosting machine (aka multiple additive regression trees, MART). It includes regression methods for least squares, absolute loss, logistic, Poisson, Cox proportional hazards/partial likelihood, and the AdaBoost exponential loss. It handles
2009 Aug 26
0
Doubt about adaboost
Hello, I performed a boosting analysis with the adabag package to obtain a classification tree with the following set of commands: Tesis.boost <- adaboost.M1(Captura~., data=Tesis2, mfinal=2) > arb<-Tesis.boost$tree[[1]] > post(arb, file ="") > post(arb, file ="",title= "Arbol 1") I would like to know the meaning of the numbers that appeared in the
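Not from the thread, but for reference: adaboost.M1() stores ordinary rpart trees, so the numbers post() draws are the standard rpart node labels (the predicted class at each node, the class frequencies of the observations reaching it, and the split rule). Printing the same tree as text makes the correspondence easy to check. The object names below follow the post; newer adabag versions store the trees in $trees rather than $tree.

library(adabag)

arb <- Tesis.boost$tree[[1]]   # Tesis.boost$trees[[1]] in recent adabag versions
print(arb)                     # node number, split, n, loss, yval (predicted class), yprob
summary(arb)                   # splits plus variable importance for this tree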
2009 Apr 07
0
gbm for multi-class problems
Dear List, I'm working on a classification problem. My response has 60 levels. I'm very interested in boosted trees like AdaBoost or gradient boosting machine as implemented in the package "gbm". Unfortunately gbm is only applicable for 2-class problems. Is anybody out there who can help me? Is there a way to use gbm() for multi-class problems? Maybe there is a way to transform my
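One workaround sketched here (not from the thread) is to reduce the 60-level problem to 60 one-vs-rest Bernoulli fits and pick the class with the highest predicted probability; dat, y and newdat are placeholder names for the poster's data.

library(gbm)

levs <- levels(dat$y)
models <- lapply(levs, function(lv) {
  d <- dat
  d$y01 <- as.numeric(d$y == lv)                    # 1 for the current class, 0 otherwise
  gbm(y01 ~ . - y, data = d, distribution = "bernoulli",
      n.trees = 500, interaction.depth = 3, shrinkage = 0.05)
})

# score new data with every one-vs-rest model and take the most probable class
scores <- sapply(models, predict, newdata = newdat, n.trees = 500, type = "response")
pred <- levs[max.col(scores)]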
2005 May 18
3
How to convert array to c()
Dear R-helper, Is it possible to turn this array:
> a <- array(1:12, c(4, 3))
> a
     [,1] [,2] [,3]
[1,]    1    5    9
[2,]    2    6   10
[3,]    3    7   11
[4,]    4    8   12
>
into something like:
c(1,5,9) c(2,6,10) c(3,7,11) c(4,8,12)
Thank you very much in advance. Regards, Muhammad Subianto
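One way to get that (a sketch, assuming a list of row vectors is what is wanted):

a <- array(1:12, c(4, 3))

# split the matrix into its rows: c(1,5,9), c(2,6,10), c(3,7,11), c(4,8,12)
rows <- lapply(seq_len(nrow(a)), function(i) a[i, ])

# equivalently, split() with row() as the grouping factor
rows2 <- split(a, row(a))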
2012 Sep 16
2
Where is the R configuration file or how to override R compilers
How can one modify or override the compilers that R uses for package installations? Or is this configuration in some editable file somewhere? I initially built R 2.15.1 on Solaris SPARC (a virtual T4), but found out the build was done as 32-bit. After some research, I found that the pre-compiled GCC version I had only allowed for 32-bit. I wanted
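For reference (a sketch, not from the thread): the compiler settings R uses when installing packages come from $(R_HOME)/etc/Makeconf, which is generated when R itself is configured, and they can be overridden per user in ~/.R/Makevars. The paths below are purely illustrative; point them at whatever 64-bit-capable toolchain is actually installed. Building a 64-bit R itself means rerunning configure with 64-bit compiler settings (e.g. CC="gcc -m64").

# ~/.R/Makevars -- hypothetical example for a 64-bit GCC toolchain on Solaris
CC  = /opt/gcc64/bin/gcc -m64
CXX = /opt/gcc64/bin/g++ -m64
F77 = /opt/gcc64/bin/gfortran -m64
FC  = /opt/gcc64/bin/gfortran -m64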
2006 Aug 24
3
How to compare rows of two matrices
Dear all, I have a dataset
train <- cbind(c(0,2,2,1,0), c(8,9,4,0,2), 6:10, c(-1, 1, 1, -1, 1))
test <- cbind(1:5, c(0,1,5,1,3), c(1,1,2,0,3), c(1, 1, -1, 1, 1))
I want to find which rows of train and test differ in the last column (column 4). The solution must be something like
train
     [,1] [,2] [,3] [,4]
[1,]    0    8    6   -1
[3,]    2    4    8    1
[4,]    1    0    9   -1
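Since the two matrices have the same number of rows, a direct element-wise comparison of the fourth columns does this (sketch):

train <- cbind(c(0,2,2,1,0), c(8,9,4,0,2), 6:10, c(-1, 1, 1, -1, 1))
test  <- cbind(1:5, c(0,1,5,1,3), c(1,1,2,0,3), c(1, 1, -1, 1, 1))

# rows of train whose last column differs from the last column of test
train[train[, 4] != test[, 4], ]

# the corresponding row numbers, if those are needed instead
which(train[, 4] != test[, 4])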
2006 Apr 11
2
About list to list - thanks
Thank you very much for your useful suggestions. These are exactly what I was looking for. foo <- list(foo1, foo2, foo3) lapply(foo, function(x) matrix(unlist(x), nrow = length(x), byrow = TRUE)) or lapply(foo, function(x) do.call('rbind', x)) Best, Muhammad Subianto On 4/11/06, Muhammad Subianto <msubianto at gmail.com> wrote: > Dear all, > I have a result my experiment
2005 Jul 12
1
SOS Boosting
Hi, I am trying to implement the AdaBoost.M1 algorithm as described in "The Elements of Statistical Learning" p.301 I don't use Dettling's library "boost" because : - I don't understand the difference between LogitBoost and L2Boost - I'd like to use larger trees than stumps. By using option weights set to (1/n, 1/n, ..., 1/n) in rpart or tree
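A minimal sketch of AdaBoost.M1 as given in that chapter (Algorithm 10.1), using rpart trees of configurable depth as the weak learner. The function names adaboost.m1 and predict_adaboost.m1 are invented here, and the response is assumed to be a factor with levels "-1" and "1".

library(rpart)

adaboost.m1 <- function(formula, data, M = 50, maxdepth = 3) {
  # make the local weight vector visible to rpart's model.frame machinery
  environment(formula) <- environment()
  n <- nrow(data)
  yvar <- all.vars(formula)[1]
  y <- as.numeric(as.character(data[[yvar]]))        # expects labels coded -1 / +1
  w <- rep(1 / n, n)                                 # start with uniform case weights
  trees <- list(); alpha <- numeric(0)
  for (m in seq_len(M)) {
    fit <- rpart(formula, data = data, weights = w, method = "class",
                 control = rpart.control(maxdepth = maxdepth, cp = 0, xval = 0))
    pred <- as.numeric(as.character(predict(fit, data, type = "class")))
    err <- max(sum(w * (pred != y)) / sum(w), 1e-10) # weighted error, guarded against 0
    if (err >= 0.5) break                            # weak learner no better than chance
    a <- log((1 - err) / err)                        # classifier weight
    trees[[length(trees) + 1]] <- fit; alpha <- c(alpha, a)
    w <- w * exp(a * (pred != y))                    # up-weight the misclassified cases
    w <- w / sum(w)
  }
  list(trees = trees, alpha = alpha)
}

predict_adaboost.m1 <- function(model, newdata) {
  # weighted majority vote: sign of the alpha-weighted sum of +/-1 tree predictions
  votes <- mapply(function(tr, a)
    a * as.numeric(as.character(predict(tr, newdata, type = "class"))),
    model$trees, model$alpha)
  sign(rowSums(as.matrix(votes)))
}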
2003 Sep 30
2
Remove comma (,) in data set
Dear R-helper, I am new to learning R. Now, I have a data set like:
24,2,3,3,1,1,2,3,0,1
45,1,3,10,1,1,3,4,0,1
43,2,3,7,1,1,3,4,0,1
42,3,2,9,1,1,3,3,0,1
36,3,3,8,1,1,3,2,0,1
19,4,4,0,1,1,3,3,0,1
38,2,3,6,1,1,3,2,0,1
21,3,3,1,1,0,3,2,0,1
27,2,3,3,1,1,3,4,0,1
45,1,1,8,1,1,2,2,1,1
... with 3730 rows. I want to remove the commas (,) in the data set. The result should look like:
24 2 3 3 1 1 2 3 0 1
45 1 3 10 1 1 3 4 0 1
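The commas are just field separators, so one way (a sketch; the file names are placeholders) is to read the file as comma-separated and write it back space-separated rather than editing the text:

dat <- read.table("mydata.txt", sep = ",", header = FALSE)

# write.table's default separator is a space; drop row/column names and quotes
write.table(dat, "mydata_spaces.txt", row.names = FALSE, col.names = FALSE, quote = FALSE)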
2002 Jun 23
2
AdaBoost for R
I'm going to implement the AdaBoost algorithm in R. Just wanted to ensure that there is no implementation of any boosting algorithm in R... don't want to reinvent the wheel...
2009 Apr 14
3
Problem cross-compiling on Ubuntu
I'm using Ubuntu 8.10 (Intrepid Ibex) and R 2.7.1. I've built a package from source (a modified version of gbm) and it contains some C++ code. I now want to cross-compile it to get a Windows version. I installed R using sudo apt-get update sudo apt-get install r-base sudo apt-get install r-base-dev So far as I can tell, I've also followed all the instructions in the guide
2005 Sep 16
1
How to make two figures in one plot - package vcd
Dear all, I have a problem making a figure with two columns in package vcd. Here is example code I took from "\library\vcd\html\plot.loglm.html". What I need is to put two figures in one plot. How can I do that? I have tried layout(rbind(c(1, 1, 2, 2))) but got the same result: two separate plots. Best wishes, Muhammad Subianto
library(vcd)
oldpar <- par(mfrow=c(1, 2))
## mosaic
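The catch is that vcd's strucplot displays (mosaic, assoc, plot.loglm) are drawn with grid graphics, so par(mfrow=...) and layout() are silently ignored. A sketch of the grid-based equivalent, using the Titanic data purely as an illustration:

library(vcd)
library(grid)

grid.newpage()
pushViewport(viewport(layout = grid.layout(1, 2)))   # one row, two columns

pushViewport(viewport(layout.pos.col = 1))
mosaic(~ Sex + Survived, data = Titanic, newpage = FALSE)
popViewport()

pushViewport(viewport(layout.pos.col = 2))
mosaic(~ Class + Survived, data = Titanic, newpage = FALSE)
popViewport(2)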
2005 Apr 25
1
Failed to install gbm_1.4-2 (PR#7814)
Full_Name: The Manager Version: 2.0.1 OS: Solaris 9 Submission from: (NULL) (129.67.80.243) > install.packages("gbm") trying URL `http://cran.uk.r-project.org/src/contrib/PACKAGES' Content type `text/plain; charset=ISO-8859-1' length 52975 bytes opened URL ================================================== downloaded 51Kb trying URL
2006 Sep 03
3
Merge list to list - as list
Dear all,
#Last week, I asked about merging x and y as a list.
#Now I have a dataset with a list of lists like:
x <- list(list(matrix(1:20, 5, 4), matrix(1:20, 5, 4)), list(matrix(1:20, 5, 4), matrix(1:20, 5, 4)))
y <- list(list(c(1, -1, -1, 1, 1), c(1, 1, -1, -1, -1)), list(c(1, 1, 1, 1, 1), c(1, 1, -1, 1, -1)))
x
y
#I need to merge x and y; I have tried with list.uni <-
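The target structure is cut off above, so the following is only a guess at what "merge" means here: pairing each matrix in x with the matching vector in y, either by binding the vector on as an extra column or by keeping the pair together in a sub-list.

# append each length-5 vector of y as a fifth column of the matching 5 x 4 matrix of x
merged <- Map(function(xi, yi) Map(cbind, xi, yi), x, y)

# or keep each (matrix, vector) pair side by side without binding them
paired <- Map(function(xi, yi) Map(list, xi, yi), x, y)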
2006 Aug 06
1
Take random sample from class variable
Dear all, Suppose I have a dataset like the one below. I want to take, for example, a random sample of 100 rows in which the "class" variable contains "yes" and "no" in the proportions 70% and 30% respectively. I need a new random sample of 100 rows from the mydat dataset, but I can't get the result. Thank you very much for any help. Best, Muhammad Subianto mydat <- data.frame(size=c(30,12,15,10,12,12,25,30,20,14),
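One way (a sketch; it assumes the grouping column is literally named class, and with only 10 rows in mydat the sampling has to be done with replacement to reach 100):

yes.idx <- which(mydat$class == "yes")
no.idx  <- which(mydat$class == "no")

# 70 "yes" rows and 30 "no" rows, drawn with replacement
newdat <- mydat[c(sample(yes.idx, 70, replace = TRUE),
                  sample(no.idx,  30, replace = TRUE)), ]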
2012 Oct 14
1
Is there any R package that contains Rusboost based on Adaboost.m2?
Hi, I have been searching everywhere for an implementation of those algorithms, but I have only seen them in Matlab and in the literature. I noticed a package called 'ada' on CRAN, but it is not for multi-class problems. I would be happy with just AdaBoost.M2, SMOTEBoost over AdaBoost.M2, or any other combination that could account for imbalanced multi-class classification problems. Thanks!