search for: svms

Displaying 20 results from an estimated 50 matches for "svms".

2004 May 14
0
variable selection methods for SVMs
Hi, Has anyone implemented techniques for model selection in support vector machines in R? I have seen recursive feature wrappers and other methods in the literature, but I can't find implementations. Thanks in advance, Max Kuhn BD Diagnostic Systems
2012 Nov 16
1
polynomial svms and in-sample error
Hi again! This might be more of a statistical question, but anyway: if I train several support vector machines with different polynomial degrees and, as a result, find that higher degrees have not only a higher test error but also a higher in-sample error, why is that? I would assume the in-sample error should be lower than, or at least equal to, the linear case. I'm worried I did
2006 Jan 27
1
Classifying Intertwined Spirals
...al basis: exp(-gamma*|u-v|^2). You should be able to see a PNG of the resulting plot here: http://www.flickr.com/photos/60118409 at N00/91835679/ The problem is that that's not good enough. I want a better fit. I think I can get one, I just don't know how. There's a paper on Proximal SVMs that claims a better result. To the best of my knowledge, PSVMs should not outperform SVMs, they are merely faster to compute. You can find the paper (with the picture of their SVM) on citeseer: http://citeseer.ifi.unizh.ch/cachedpage/515368/5 @misc{ fung-proximal, author = "G. Fung and O. M...
2012 Aug 19
1
e1071 - tuning is not giving the best within the range
Hi everybody, I am new to e1071 and to SVMs. I am trying to understand the performance of SVMs, but I ran into a situation that does not seem meaningful. I added the R code so you can see what I have done. set.seed(1234) data <- data.frame( rbind(matrix(rnorm(1500, mean = 10, sd = 5), ncol = 10), matrix(rnorm(1500, mean = 5, sd = 5...
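The truncated snippet above can be turned into a runnable sketch; a minimal illustration of an e1071 `tune.svm` grid search, with parameter ranges that are illustrative rather than the poster's:

```r
library(e1071)

set.seed(1234)
# Two 10-dimensional Gaussian clusters, mirroring the post's setup
x <- rbind(matrix(rnorm(1500, mean = 10, sd = 5), ncol = 10),
           matrix(rnorm(1500, mean = 5,  sd = 5), ncol = 10))
y <- factor(rep(c("A", "B"), each = 150))

# Cross-validated grid search over gamma and cost
tuned <- tune.svm(x, y, gamma = 10^(-3:0), cost = 10^(0:2))
tuned$best.parameters  # combination with the lowest CV error
```

Note that the returned `tune` object also carries `best.performance`; when the optimum lands on the edge of the grid, widening the range is the usual next step.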
2010 Oct 25
1
online course: SVM in R with Lutz Hamel at statistics.com
Support vector machines (SVMs) have established themselves as one of the preeminent machine learning models for classification and regression over the past decade or so, frequently outperforming artificial neural networks in tasks such as text mining and bioinformatics. Dr. Lutz Hamel, author of "Knowledge Discovery with S...
2005 Jun 28
2
svm and scaling input
Dear All, I have a question about scaling the input variables for an analysis with svm (package e1071). Most of my variables are factors with 4 to 6 levels, but there are also some numeric variables. I'm not familiar with the math behind SVMs, so my assumptions may be completely wrong ... or obvious. Will the svm automatically expand the factors into a binary matrix? If I add numeric variables outside the range of 0 to 1, do I have to scale them to the 0 to 1 range? Thanks a lot for the help, kind regards, Arne
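For reference, a small sketch of what e1071's `svm` does with mixed inputs, assuming its documented defaults: the formula interface expands factors into dummy columns, and `scale = TRUE` (the default) centers and scales each numeric column, so manual rescaling to [0, 1] should not be needed:

```r
library(e1071)

set.seed(1)
d <- data.frame(
  f   = factor(sample(letters[1:4], 100, replace = TRUE)),  # 4-level factor
  num = rnorm(100, mean = 50, sd = 10),                     # far outside [0, 1]
  y   = factor(sample(c("yes", "no"), 100, replace = TRUE))
)

# Factors are expanded via the usual model.matrix machinery;
# numeric columns are centered/scaled because scale = TRUE by default.
fit <- svm(y ~ ., data = d, kernel = "radial")
fit$scaled  # logical vector marking which expanded columns were scaled
```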
2003 Jan 31
1
svm regression in R
Hello, I have a question concerning SVM regression in R. I intend to use SVMs for feature selection (and knowledge discovery). For this purpose I will need to extract the weights that are associated with my features. I understand from a previous thread on SVM classification that predictive models can be derived from the SVs, coefficients, and rhos, but it is unclear to me how t...
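For the linear kernel, the feature weights the poster wants can be reconstructed from the stored support vectors; a hedged sketch with e1071 (fitting with `scale = FALSE` so the weights live in the original feature space):

```r
library(e1071)

set.seed(1)
x <- matrix(rnorm(400), ncol = 4)
y <- x %*% c(2, -1, 0.5, 0) + rnorm(100, sd = 0.1)

fit <- svm(x, y, kernel = "linear", scale = FALSE)

# fit$coefs holds alpha_i * y_i for the support vectors in fit$SV,
# so the primal weights are their weighted sum; rho is the negative intercept.
w <- t(fit$coefs) %*% fit$SV
b <- -fit$rho
w  # should be close to the generating coefficients (2, -1, 0.5, 0)
```

This identity only makes sense for `kernel = "linear"`; for nonlinear kernels there is no finite weight vector to extract, which is why the fitted object stores support vectors and coefficients instead.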
2012 Nov 02
1
An idea: Extend mclapply's mc.set.seed with an initial seed value?
...uld be keyed off depending on whether mc.set.seed is logical, preserving the current behaviour, or numerical, using the value in a call to set.seed. Does this make sense? If you wonder how I came up with the idea: I spent a couple of hours debugging "unstable" results from parallel tuning of SVMs, which were caused by the parallel execution. In my case I can simply do the set.seed in the FUN argument function, but that may not always be the case. Ivan [[alternative HTML version deleted]]
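The workaround mentioned at the end (seeding inside `FUN`) can be sketched like this with the `parallel` package; the per-task seed offset is illustrative:

```r
library(parallel)

# Derive each task's seed from its index inside FUN, so the result is
# reproducible regardless of how tasks are scheduled across workers.
run <- function(i) {
  set.seed(1000 + i)  # illustrative fixed offset
  mean(rnorm(1e4))
}

res1 <- mclapply(1:4, run, mc.cores = 2)  # use mc.cores = 1 on Windows
res2 <- mclapply(1:4, run, mc.cores = 2)
identical(res1, res2)  # TRUE: same seeds, same results
```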
2005 Apr 04
3
Error in save.image(): image could not be renamed
Hello, I am doing intensive tests on SVM parameter selection. Once in a while I get the error: Error in save.image(): image could not be renamed and is left in .RDataTmp1 I cannot use the information saved in .RDataTmp1. When that happens I lose several hours of tests. It usually happens when the computer is locked, i.e., there is not oth...
2015 Dec 10
3
SVM hadoop
...hand, the usual tricks for doing something "similar" to SVM or "based" on SVM that is not actually SVM, if that is good enough for you. You can try doing it with mllib (on Spark), as here <http://spark.apache.org/docs/latest/mllib-linear-methods.html#linear-support-vector-machines-svms>. But I have never tried it! Regards, Carlos J. Gil Bellosta http://www.datanalytics.com On 9 December 2015 at 13:15, MªLuz Morales <mlzmrls en gmail.com> wrote: > Good morning, > > does anyone know if there is any way to implement a support > vector...
2015 Dec 09
2
SVM hadoop
Good morning, does anyone know if there is any way to implement a support vector machine (svm) with R-hadoop?? My interest is in doing big-data processing with svm. I know that in R there are the packages {RtextTools} and {e1071} that allow svm. But I am not sure the algorithm is parallelizable, that is, that it can run in parallel on the R-hadoop platform. Many
2011 Jan 07
2
Stepwise SVM Variable selection
...curacy isn't great. I used a grid search over the C and G parameters with an RBF kernel to find the best settings. I remember that for least squares, R has a nice stepwise function that will try combining subsets of variables to find the optimal result. Clearly, this doesn't exist for SVMs as a built-in function. As an experiment, I simply grabbed the first 50 variables and repeated the training/grid-search procedure. The results were significantly better. Since the data is VERY noisy, my guess is that eliminating some of the variables eliminated some noise that resulted in bet...
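For what it's worth, caret's `rfe` implements the recursive-elimination idea the poster is approximating by hand; a hedged sketch (the subset sizes and resampling settings are illustrative, and `caretFuncs` lets `rfe` wrap an SVM via `method = "svmRadial"`):

```r
library(caret)

set.seed(42)
x <- data.frame(matrix(rnorm(100 * 20), ncol = 20))
y <- factor(rep(c("a", "b"), each = 50))

# Recursive feature elimination: refit an RBF-kernel SVM on shrinking
# variable subsets and keep the best-performing subset size.
ctrl <- rfeControl(functions = caretFuncs, method = "cv", number = 5)
prof <- rfe(x, y, sizes = c(5, 10, 15), rfeControl = ctrl,
            method = "svmRadial")
predictors(prof)  # the retained variable names
```

On noisy data like the poster's, the cross-validated profile in `prof` is what shows whether dropping variables genuinely helps or the improvement was luck.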
2007 Apr 09
1
Could not fit correct values in discriminant analysis by bruto.
...ur data are linearly separable. To see this (if you didn't already know), try library(lattice) splom(~x, groups = y) and look at the first row. If you are trying to do classification, there are a few methods that would choke on this (logistic regression) and a few that won't (trees, SVMs, etc.). I would guess that bruto is in the latter group. However, if you are trying to do classification, try using bruto via fda: > tmp <- cbind(x, factor(y)) > > fdaFit <- fda(y2~., tmp) > fdaFit Call: fda(formula = y2 ~ ., data = tmp) Dimension: 1...
2015 Dec 10
2
SVM hadoop
...le. You can try doing it with mllib (on Spark), as here <http://spark.apache.org/docs/latest/mllib-linear-methods.html#linear-support-vector-machines-svms>. But I have never tried it! Regards, Carlos J. Gil Bellosta ...
2005 Jan 14
0
2nd Workshop "Ensemble Methods", Tuebingen (Germany)
...forests or support vector machines. The workshop tries to bring together the machine learning view and the statistical view on ensemble methods. So far, the following talks have been scheduled: * Gilles Blanchard: Consistency results for Boosting * Peter Buehlmann: tba * Gunnar Raetsch: Boosting SVMs: a new way for multiple kernel learning * Koji Tsuda: Matrix exponentiated Gradients Participants can attend the workshop free of charge; the number of participants is restricted to 30 persons. For registration and submission of abstracts, please send an email message to Gunnar.Raetsch at tuebing...
2008 Apr 09
0
How do I get the parameters out of e1071's svm?
...ave 4 independent variables, I'd expect to have four coefficients plus a threshold, with 4 total degrees of freedom. But the only numeric vectors of length 4 in the result are the scaling and center, and those are computed before the fitting, so each variable has zero mean and unit variance. I know SVMs don't need to put every point through the kernel function, and can even handle infinite-dimensional kernels. But don't they need to compute the coefficients? Best, Martin
2009 Dec 04
0
Problems while plotting with ROCR
...<- prediction(hiv.svm$predictions, hiv.svm$labels) perf.svm <- performance(pred.svm, 'tpr', 'fpr') pred.nn <- prediction(hiv.nn$predictions, hiv.svm$labels) perf.nn <- performance(pred.nn, 'tpr', 'fpr') plot(perf.svm, lty=3, col="red", main="SVMs and NNs for prediction of HIV-1 coreceptor usage") Now I see some dotted red ROC curves. But after the next command: plot(perf.nn, lty=3, col="blue", add=TRUE) exactly the same thing happens as if I had called: plot(perf.nn, lty=3, col="blue") I now see only some dotted b...
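For reference, the standard ROCR overlay pattern looks like this, using the `ROCR.simple` example data bundled with the package (whether `add = TRUE` overlays correctly can depend on the ROCR version installed):

```r
library(ROCR)

data(ROCR.simple)  # example scores and 0/1 labels shipped with ROCR
pred <- prediction(ROCR.simple$predictions, ROCR.simple$labels)
perf <- performance(pred, "tpr", "fpr")

# The first call opens the plot; the second draws on the same axes.
plot(perf, col = "red", lty = 3, main = "ROC overlay")
plot(perf, col = "blue", lty = 1, add = TRUE)
```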
2006 Jan 14
0
Data Mining Course
...ier courses are not a prerequisite for this new course. In this course we emphasize the tools useful for tackling modern-day data analysis problems. We focus on both "tall" data ( N>p where N=#cases, p=#features) and "wide" data (p>N). The tools include gradient boosting, SVMs and kernel methods, random forests, lasso and LARS, ridge regression and GAMs, supervised principal components, and cross-validation. We also present some interesting case studies in a variety of application areas. All our examples are developed using the S language, and most of the procedures we...
2006 Mar 07
0
Statistical Learning and Datamining Course
...ier courses are not a prerequisite for this new course. In this course we emphasize the tools useful for tackling modern-day data analysis problems. We focus on both "tall" data ( N>p where N=#cases, p=#features) and "wide" data (p>N). The tools include gradient boosting, SVMs and kernel methods, random forests, lasso and LARS, ridge regression and GAMs, supervised principal components, and cross-validation. We also present some interesting case studies in a variety of application areas. All our examples are developed using the S language, and most of the procedures we...
2009 May 04
1
Caret package: coeffcients for regression
Dear All, I am using the "caret" package for SVM regression and elastic net regression. I can get the final fitted vs. observed values. How can I get the coefficients? Any ideas? Thanks Alex