similar to: tune()

Displaying 20 results from an estimated 10000 matches similar to: "tune()"

2006 Jul 07
1
Polynomial kernel in SVM in e1071 package
Dear list, In some places (for example, http://en.wikipedia.org/wiki/Support_vector_machine), the polynomial kernel in SVM is written as (u'*v + 1)^d, while in the documentation of svm() in the e1071 package, the polynomial kernel is written as (gamma*u'*v + coef0)^d. I am a little confused here: when doing parameter optimization (grid search or similar) for the polynomial kernel, does it need to tune
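(Editorial note: the textbook form (u'*v + 1)^d is the e1071 form with gamma = 1 and coef0 = 1. A minimal sketch of searching all three kernel parameters plus cost with tune.svm() follows; iris and the ranges are only illustrative, not taken from the original post.)
library(e1071)
data(iris)
## gamma and coef0 both enter the kernel (gamma*u'*v + coef0)^degree,
## so they are usually tuned together with degree and cost
obj <- tune.svm(Species ~ ., data = iris, kernel = "polynomial",
                degree = 2:3, gamma = 2^(-1:1), coef0 = 0:1, cost = 2^(0:2))
obj$best.parameters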
2003 Nov 03
1
svm in e1071 package: polynomial vs linear kernel
I am trying to understand what the difference is between the linear and polynomial kernels: linear: u'*v; polynomial: (gamma*u'*v + coef0)^degree. It would seem that a polynomial kernel with gamma = 1, coef0 = 0 and degree = 1 should be identical to the linear kernel, however it gives me significantly different results for a very simple data set, with the linear kernel
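(Editorial note: in principle the two kernels coincide under those settings; note, though, that e1071's default gamma is 1/(data dimension), not 1, so the parameters must be set explicitly. A sketch of a direct comparison, using iris as a stand-in data set:)
library(e1071)
data(iris)
m.lin  <- svm(Species ~ ., data = iris, kernel = "linear", cost = 1)
m.poly <- svm(Species ~ ., data = iris, kernel = "polynomial",
              degree = 1, gamma = 1, coef0 = 0, cost = 1)
## a degree-1 polynomial kernel with gamma = 1, coef0 = 0 is u'*v, i.e. linear
table(linear = predict(m.lin, iris), poly1 = predict(m.poly, iris))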
2004 Dec 01
1
tuning SVM's
Hi I am doing this sort of thing:
POLY:
> obj = best.tune(svm, similarity ~ ., data = training, kernel = "polynomial")
> summary(obj)
Call:
best.tune(svm, similarity ~ ., data = training, kernel = "polynomial")
Parameters:
  SVM-Type: eps-regression
  SVM-Kernel: polynomial
  cost: 1
  degree: 3
  gamma: 0.04545455
  coef.0: 0
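(Editorial note: called without a ranges argument, best.tune() has nothing to search over and simply returns an svm fitted with the defaults shown above (degree 3, cost 1, gamma = 1/ncol). A sketch of turning it into an actual grid search; the ranges are illustrative, only "similarity" and "training" come from the post:)
library(e1071)
obj <- best.tune(svm, similarity ~ ., data = training, kernel = "polynomial",
                 ranges = list(degree = 2:3, gamma = 10^(-2:0),
                               coef0 = c(0, 1), cost = 2^(0:3)))
summary(obj)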
2008 Jan 02
1
Plot.svm error
Hi all, Sorry to bother you again with what is probably an easy error to fix, but I've been trying to solve the problem and haven't yet been able to. So I'm doing this: > dados<-read.table("b.txt",sep="",nrows=30000) >
2009 Mar 26
1
Extreme AIC in glm(), perfect separation, svm() tuning
Dear List, With regard to the question I previously raised, here is the result I have obtained so far: brglm() does help, but there are two situations. 1) Classifiers with extremely high AIC (over 200), no perfect separation, coefficients converge. In this case, using brglm() does help! It stabilizes the AIC, and the classification power is better. Code and output: (need to install package:
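(Editorial note: a minimal sketch of the brglm comparison being described, assuming the brglm package's glm-like interface; y, x1, x2 and dat are placeholders, not names from the post. brglm() keeps glm()'s formula interface but uses bias-reduced estimation, which keeps coefficients finite under (quasi-)separation.)
library(brglm)
## placeholders: y is a 0/1 response, x1/x2 predictors in data frame dat
fit.glm   <- glm(y ~ x1 + x2, family = binomial, data = dat)
fit.brglm <- brglm(y ~ x1 + x2, family = binomial, data = dat)
AIC(fit.glm)
AIC(fit.brglm)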
2007 Dec 27
1
(package e1071) SVM tune for best parameters: why they are different everytime i run?
Hi, I run the following tuning function for svm. It's very strange that every time I run this function, best.parameters gives different values. [A] > svm.tune <- tune(svm, train.x, train.y, validation.x=train.x, validation.y=train.y, ranges = list(gamma = 2^(-1:2), cost = 2^(-3:2))) # where train.x and train.y are matrices
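(Editorial note: tune() estimates performance by 10-fold cross-validation with random splits by default; as far as the docs indicate, validation.x/validation.y are only used with tune.control(sampling = "fix"). So best.parameters can legitimately change between runs; fixing the RNG seed makes the result reproducible. A sketch, reusing the ranges from the post:)
set.seed(42)   # fixes the random CV splits, so repeated runs agree
svm.tune <- tune(svm, train.x, train.y,
                 ranges = list(gamma = 2^(-1:2), cost = 2^(-3:2)),
                 tunecontrol = tune.control(sampling = "cross", cross = 10))
svm.tune$best.parameters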
2003 Dec 10
3
e1071:svm - default epsilon = 0.1 (NOT 0.5) (PR#5671)
In the e1071 package, svm's default epsilon value is set to 0.1, not 0.5 as the documentation says. R
2005 May 19
2
tune.svm in {e1071}
Dear All, 1- I'm trying to access the values of fitted(model) after model <- tune.svm( ), but seemingly it is not possible. How can I access the values of fitted()? It is possible only after model <- svm( ). 2- How can I access the other values, such as the number of support vectors, gamma, cost, nu and epsilon, after model <- tune.svm( )? Are these not possible? I
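(Editorial note: tune.svm() returns a "tune" object rather than an svm object; the fitted svm is stored in $best.model (kept by default), and the usual accessors work on that. A sketch with placeholder data, y and dat being hypothetical names:)
## placeholders: response y and predictors in data frame dat
model <- tune.svm(y ~ ., data = dat, gamma = 10^(-2:0), cost = 10^(0:2))
best  <- model$best.model      # an ordinary svm object
fitted(best)                   # fitted values of the best model
best$tot.nSV                   # total number of support vectors
best$gamma; best$cost; best$nu; best$epsilon
summary(best)                  # prints the same information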
2010 May 13
1
tune svm
Hello, I hope you can help me! I'm trying to tune the svm parameters cost and gamma for a Landsat image classification, but I get an error and I can't understand it. I write this: > tune(svm, Class~., data = mdt01bis, ranges = list(gamma = 2^(-15:3), cost = 2^(-5:15))) and R gives: Error in predict.svm(model, if (!is.null(validation.x)) validation.x else if (useFormula)
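(Editorial note: the truncated message does not show the actual cause, but a common first check, offered here only as a sketch and not as a diagnosis, is to make sure Class is a factor and to start from a grid much coarser than the 19 x 21 combinations above; mdt01bis and Class come from the post, the ranges are illustrative.)
mdt01bis$Class <- as.factor(mdt01bis$Class)   # classification needs a factor response
obj <- tune.svm(Class ~ ., data = mdt01bis,
                gamma = 2^(-8:0), cost = 2^(-2:8))  # coarser grid first
summary(obj)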
2005 Apr 26
3
Error using e1071 svm: NA/NaN/Inf in foreign function call
Hello, As far as I saw in the mailing list archive, I am not the first person with this problem. Anyway, I was not able to get past this error, since the information I found in the archive is not very conclusive for this case. I have used linear, radial and sigmoid kernels on the same data under the same conditions and everything is OK. This problem only happens with the polynomial kernel. I send the
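(Editorial note: the cause is not visible in the truncated post, but with the polynomial kernel this error may trace back to non-finite values in the data or to (gamma*u'*v + coef0)^degree overflowing on unscaled inputs. A hedged sketch of first checks; x and y are placeholders for a numeric predictor matrix and the response.)
any(!is.finite(as.matrix(x)))        # NA/NaN/Inf hiding in the numeric predictors?
m <- svm(x, y, kernel = "polynomial",
         degree = 3, gamma = 0.1, coef0 = 0,
         scale = TRUE)               # keep scaling on; large unscaled values
                                     # can overflow the polynomial kernel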
2009 Jul 18
1
svm works but tune.svm give error
Hello, I'm using the e1071 library for SVM functions. I can quickly train an SVM with: svm(formula = label ~ ., data = testdata) That works well. I want to tune the parameters, so I tried: tune.svm(label ~ ., data=testdata[1:2000, ], gamma=10^(-6:3), cost=10^(1:2)) THIS FAILS WITH AN ERROR: 'names' attribute [199] must be the same length as the vector [184] I don't
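(Editorial note: the mismatch between [199] and [184] suggests rows being dropped somewhere between the data and the cross-validated predictions; two common first checks, offered as a sketch rather than a diagnosis, are NAs and unused factor levels. testdata, label and the ranges come from the post.)
sum(!complete.cases(testdata[1:2000, ]))                  # any rows with NAs?
testdata$label <- droplevels(as.factor(testdata$label))   # drop empty factor levels
tobj <- tune.svm(label ~ ., data = na.omit(testdata[1:2000, ]),
                 gamma = 10^(-6:3), cost = 10^(1:2))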
2011 Apr 04
1
Problem using svm.tune
Dear Sir, I am stuck with a nagging problem using R for SVM regression. My data has 5 dimensions and 400 observations. The independent variables are Peb, Ksub, Sub, and Xtt. The dependent variable is Rexp. I tried using the svm.tune function as well as <- tune(svm, ...), to tune the hyperparameters gamma, epsilon and C. Since I am new to R, I am probably not using the svm.tune
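(Editorial note: the e1071 function is tune.svm(), not svm.tune(); for eps-regression, gamma, cost and epsilon can all be supplied as vectors and searched over. A sketch using the variable names from the post; the data frame name mydata and the ranges are placeholders.)
library(e1071)
obj <- tune.svm(Rexp ~ Peb + Ksub + Sub + Xtt, data = mydata,
                gamma = 10^(-3:0), cost = 10^(0:2),
                epsilon = c(0.01, 0.1, 0.5))
obj$best.parameters   # chosen gamma, cost, epsilon
obj$best.model        # svm refitted on all the data with those values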
2007 Mar 14
1
tune.svm
I use tune.svm to tune gamma and cost for my training dataset. I use a PC, and it runs very slowly. Does anyone know how to make it faster? Aimin
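(Editorial note: tune.svm() refits an SVM for every parameter combination times every cross-validation fold, so runtime is roughly grid size x folds x one fit. A sketch of the usual speedups: a coarser grid first, fewer folds, or a single fixed split; dat and y are placeholder names.)
ctrl <- tune.control(sampling = "fix", fix = 2/3)      # one train/validation split
obj  <- tune.svm(y ~ ., data = dat,
                 gamma = 10^(-3:1), cost = 10^(-1:2),  # coarse grid, refine later
                 tunecontrol = ctrl)
obj$best.parameters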
2011 May 25
1
help with tune.svm() e1071
Hi, I am trying to use tune.svm in the e1071 package. The command I use is tobj <- tune.svm(labels, data = data, cost = 10^(1:2)) Should the last column of 'data' contain the labels as well? I want to use the linear kernel. But it gives me the error "Error in model.frame.default(formula, data) : 'data' must be a data.frame, not a matrix or an array" Do you know why
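(Editorial note: two things seem mixed here: the formula interface needs a data.frame that actually contains the label column, while passing the labels as the first argument alongside data= mixes the formula and x/y interfaces. A sketch of both ways, assuming 'data' is the feature matrix and 'labels' the class vector from the post:)
## formula interface: build a data.frame holding the features plus the labels
df   <- data.frame(data, label = as.factor(labels))
tobj <- tune.svm(label ~ ., data = df, kernel = "linear", cost = 10^(1:2))
## or the x/y interface, with no formula and no data= argument
tobj <- tune.svm(x = data, y = as.factor(labels),
                 kernel = "linear", cost = 10^(1:2))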
2004 Dec 21
2
Rgui.exe - Error while tuning svm
Hello, if I try to tune my svm with the code: Tune <- tune.svm(Data.Train, Class.Train, type="C-classification", kernel="radial", gamma = 2^(-1:1), cost = 2^(2:4)) I get a Windows message box reporting an error in the application "Rgui.exe" with the message: "The instruction at 0x6c48174d referenced memory at 0x00000000. The operation "read" could not be performed on
2010 Oct 25
1
online course: SVM in R with Lutz Hamel at statistics.com
Support vector machines (SVMs) have established themselves as one of the preeminent machine learning models for classification and regression over the past decade or so, frequently outperforming artificial neural networks in tasks such as text mining and bioinformatics. Dr. Lutz Hamel, author of "Knowledge Discovery with Support Vector Machines" from Wiley, will present his online course
2009 Jul 28
1
Watching tune parameters for SVM?
Hi, I'm switching over from RapidMiner to R. (The learning curve is steep, but there is so much more I can do with R, and it runs much faster overall.) In RapidMiner, I can "tune" a parameter of my svm in a nice cross-validation loop. The process prints out its progress as it goes. So for a 5x cross-validation tuning of the value of C with AUC as my performance measure, I see XV C
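(Editorial note: e1071's tune() has no built-in progress output, and its default performance measure is error rate rather than AUC; a custom measure can be supplied via tune.control(error.fun = ...). One simple way to watch progress is a manual loop over the grid, as in this sketch; dat and y are placeholder names.)
costs <- 2^(-3:3)
for (C in costs) {
  obj <- tune(svm, y ~ ., data = dat,
              ranges = list(cost = C),
              tunecontrol = tune.control(sampling = "cross", cross = 5))
  cat("C =", C, "  5-fold CV error =", obj$best.performance, "\n")
}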
2008 Mar 02
2
listing components of an object
Is there a method to list the components of an object, instead of looking at the help for that method? Let me be more clear with an example data(iris) ## tune `svm' for classification with RBF-kernel (default in svm), ## using one split for training/validation set obj <- tune(svm, Species~., data = iris, ranges = list(gamma = 2^(-1:1), cost = 2^(2:4)),
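(Editorial note: the usual base R introspection tools answer this directly: names() and str() list an object's components without consulting the help page. A sketch on the tune object from the docs example above:)
library(e1071)
data(iris)
obj <- tune(svm, Species ~ ., data = iris,
            ranges = list(gamma = 2^(-1:1), cost = 2^(2:4)))
names(obj)                # e.g. "best.parameters", "best.performance", "best.model"
str(obj, max.level = 1)   # one-line overview of each component
class(obj)                # "tune"; see also methods(class = "tune")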
2010 Jun 24
1
help in SVM
Hi guys, I used the following code to run SVM and get predictions on a new data set hh.
dim(all_h)
[1] 2034 24
dim(hh)   # it contains all the variables besides the variables in the all_h data set
[1] 640 415
require(e1071)
svm.tune <- tune(svm, as.factor(out) ~ ., data = all_h,
                 ranges = list(gamma = 2^(-5:5), cost = 2^(-5:5)))  # find the best parameters
bestg <- svm.tune$best.parameters[[1]]
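(Editorial note: rather than pulling the best gamma/cost out of best.parameters and refitting by hand, the refitted model is already available as $best.model and can be used for prediction, provided hh contains the predictor columns the formula used from all_h. A sketch:)
best <- svm.tune$best.model          # svm refitted with the best gamma/cost
## hh must contain the predictor columns used in all_h (extra columns are
## ignored by the formula; missing ones raise an error)
pred <- predict(best, newdata = hh)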
2008 Jan 03
0
Svm formula
Hi all, I don't know how to choose the formula to use when plotting an svm model; I think I'm using the wrong one, and that is why I'm having trouble. I would be very grateful if someone could help me with this. > dados<-read.table("b.txt",sep="",nrows=30000) >
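(Editorial note: in plot.svm() the formula argument only picks which two predictors form the plot axes; every remaining predictor has to be held at a fixed value via slice. A sketch on iris, which is illustrative and not the poster's data:)
library(e1071)
data(iris)
m <- svm(Species ~ ., data = iris)
## formula picks the two plotted variables; slice fixes the other two
plot(m, iris, Petal.Width ~ Petal.Length,
     slice = list(Sepal.Width = 3, Sepal.Length = 4))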