Displaying 20 results from an estimated 30000 matches similar to: "Cross Validation with SVM"
2010 Nov 23
5
cross validation using e1071:SVM
Hi everyone
I am trying to do cross-validation (10-fold CV) using the e1071::svm method. I
know that there is an option ('cross') for cross-validation, but I still
wanted to write a function that generates cross-validation indices using the
pls::cvsegments method.
#####################################################################
The code (at the end) is working fine, but sometimes caret::confusionMatrix
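For reference, a minimal sketch of that approach, with fold indices from pls::cvsegments driving a manual e1071::svm loop; iris stands in here for the poster's data:

library(e1071)
library(pls)

k <- 10
# cvsegments() returns a list of k index vectors that partition 1..N
folds <- cvsegments(nrow(iris), k)

acc <- numeric(k)
for (i in seq_len(k)) {
  test.idx <- folds[[i]]
  fit <- svm(Species ~ ., data = iris[-test.idx, ], kernel = "linear")
  acc[i] <- mean(predict(fit, iris[test.idx, ]) == iris$Species[test.idx])
}
mean(acc)  # overall 10-fold CV accuracy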
2009 Jul 08
1
SVM cross validation in e1071
Hi list,
Could someone help me understand why the leave-one-out cross-validation results I got from svm using the internal option "cross" differ from those I computed manually? When "cross" is used for the cross-validation, the results always seem better. Please see the code below; I also include lda as a comparison.
I'm using WinXP, R-2.9.0, and e1071_1.5-19.
Many
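The code itself is not reproduced in the snippet; a sketch of the comparison being described (internal cross = n leave-one-out versus a manual loop, on iris) could look like this:

library(e1071)
n <- nrow(iris)

# Internal leave-one-out CV: svm() estimates the accuracy itself
fit <- svm(Species ~ ., data = iris, cross = n)
fit$tot.accuracy                 # reported in percent

# Manual leave-one-out CV
pred <- character(n)
for (i in seq_len(n)) {
  m <- svm(Species ~ ., data = iris[-i, ])
  pred[i] <- as.character(predict(m, iris[i, , drop = FALSE]))
}
100 * mean(pred == as.character(iris$Species))   # manual LOO accuracy, percent

One thing worth checking when the two disagree: by default svm() scales the data once on the full sample before the internal cross-validation runs, whereas a manual loop rescales within each training fold.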
2012 Dec 02
1
e1071 SVM: Cross-validation error confusion matrix
Hi,
I ran two svm models using the e1071 package in R: the first without cross-validation
and the second with 10-fold cross-validation.
I used the following syntax:
#Model 1: Without cross-validation:
> svm.model <- svm(Response ~ ., data=data.df, type="C-classification",
> kernel="linear", cost=1)
> predict <- fitted(svm.model)
> cm <- table(predict,
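The second model is cut off above; a sketch of the comparison being set up (data.df and Response are the poster's names, rebuilt here from iris so the lines run):

library(e1071)
data.df <- data.frame(Response = iris$Species, iris[, 1:4])

# Model 1: no cross-validation; this confusion matrix is computed on the
# training data itself, so it tends to look optimistic.
svm.model <- svm(Response ~ ., data = data.df, type = "C-classification",
                 kernel = "linear", cost = 1)
cm <- table(predicted = fitted(svm.model), actual = data.df$Response)

# Model 2: cross = 10 reports per-fold accuracies, but the held-out
# predictions themselves are not kept, so no CV confusion matrix is returned.
svm.cv <- svm(Response ~ ., data = data.df, type = "C-classification",
              kernel = "linear", cost = 1, cross = 10)
svm.cv$accuracies      # per-fold CV accuracies (percent)
svm.cv$tot.accuracy    # overall CV accuracy (percent)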
2013 Jan 15
0
e1071 SVM, cross-validation and overfitting
I am accustomed to the LIBSVM package, which provides cross-validation
on training with the -v option
% svm-train -v 5 ...
This does 5 fold cross validation while building the model and avoids
over-fitting.
But I don't see how to accomplish that in the e1071 package. (I
learned that svm(... cross=5 ...) only _tests_ using cross-validation
-- it doesn't affect the training.) Can
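One way to get the spirit of LIBSVM's -v behaviour in e1071 is to let tune() do the cross-validated model selection and then take the winning model; a minimal sketch on iris:

library(e1071)

# 5-fold CV over a small cost/gamma grid, roughly analogous to running
# svm-train -v 5 for each candidate parameter setting.
tuned <- tune(svm, Species ~ ., data = iris,
              ranges = list(cost = 2^(-2:4), gamma = 2^(-4:0)),
              tunecontrol = tune.control(sampling = "cross", cross = 5))
summary(tuned)            # CV error for each parameter combination
best <- tuned$best.model  # refit on the full data with the best parameters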
2006 Feb 02
0
crossvalidation in svm regression in e1071 gives incorrect results (PR#8554)
1. This is _not_ a bug in R itself. Please don't use R's bug reporting
system for contributed packages.
2. This is _not_ a bug in svm() in `e1071'. I believe you forgot to take
sqrt.
3. You really should use the `tot.MSE' component rather than the mean of
the `MSE' component, but this is only a very small difference.
So, instead of spread[i] <- mean(mysvm$MSE), you
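In code, the suggested correction looks roughly like this (a regression fit on mtcars standing in for the report's mysvm):

library(e1071)
mysvm <- svm(mpg ~ ., data = mtcars, cross = 10)
sqrt(mysvm$tot.MSE)     # preferred: total cross-validated MSE, then the root
sqrt(mean(mysvm$MSE))   # mean of the per-fold MSEs first; very close in practice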
2006 Feb 02
0
crossvalidation in svm regression in e1071 gives incorrect results (PR#8554)
Full_Name: Noel O'Boyle
Version: 2.1.0
OS: Debian GNU/Linux Sarge
Submission from: (NULL) (131.111.8.96)
(1) Description of error
The 10-fold CV option for the svm function in e1071 appears to give incorrect
results for the rmse.
The example code in (3) uses the example regression data in the svm
documentation. The rmse for internal prediction is 0.24. It is expected the
10-fold CV rmse
2007 Oct 27
1
problems in cross validation of SVM in package "e1071"
Hi:
I am new to using R for data mining, and I find the "e1071" package an excellent tool for data mining work!
What frustrated me recently is that when I use the function "svm" with the "cross=10" parameter, I get all the "accuracies" of the model greater than 1. Shouldn't the accuracy be smaller than 1? So I wonder how the
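For what it's worth, the accuracies that svm() reports for cross = 10 are on a 0-100 percentage scale rather than 0-1, which would explain values greater than 1; a quick check on iris:

library(e1071)
fit <- svm(Species ~ ., data = iris, cross = 10)
fit$accuracies          # per-fold values on a 0-100 scale
fit$tot.accuracy / 100  # divide by 100 for a proportion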
2010 Sep 11
0
[Q] How to extract cross validation results from e1071's svm model
Dear all,
Is it possible to extract cross-validation results from e1071's svm model?
For example, the following R code shows the result from the 10 fold cross-validation.
model = svm(spam ~ ., data = spam, cross = 10)
summary(model)
But I could not figure out how to access the accuracy values from the cross-validation. I looked at the svm method but did not find any return values.
Any
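Judging from the "Value" section of ?svm, the figures printed by summary() are also stored on the fitted object itself; a sketch (iris standing in for the spam data of the post):

library(e1071)
model <- svm(Species ~ ., data = iris, cross = 10)
model$accuracies     # one accuracy per fold
model$tot.accuracy   # overall 10-fold CV accuracy
# For regression fits the analogous components are MSE, tot.MSE and scorrcoeff.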
2003 Nov 03
1
svm in e1071 package: polynomial vs linear kernel
I am trying to understand the difference between the linear and
polynomial kernels:
linear: u'*v
polynomial: (gamma*u'*v + coef0)^degree
It would seem that the polynomial kernel with gamma = 1, coef0 = 0 and degree = 1
should be identical to the linear kernel; however, it gives me significantly
different results for a very simple
data set, with the linear kernel
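A direct way to probe this is to fit both kernels on the same data and compare the fitted classes; with degree = 1 and coef0 = 0 the polynomial kernel reduces to gamma * u'v, so with gamma = 1 the two should coincide up to numerical details. A sketch on iris:

library(e1071)
m.lin  <- svm(Species ~ ., data = iris, kernel = "linear", cost = 1)
m.poly <- svm(Species ~ ., data = iris, kernel = "polynomial",
              degree = 1, gamma = 1, coef0 = 0, cost = 1)
table(linear = fitted(m.lin), polynomial = fitted(m.poly))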
2010 Jun 15
1
cross validation of SVM
Hi,
could you please tell me what kind of cross-validation the svm function of e1071 uses?
Cheers,
Amy
2011 Aug 05
1
e1071 ver 1.5-27 and older - SVM bug report
Dear All:
I found a problem with the SVM internal cross-validation (CV) accuracy
estimation in the e1071 package.
File: Rsvm.c
Line: 120
Today, it is:
int j = rand()%(prob->l-i);
Should be:
int j = i + rand()%(prob->l-i);
The erroneous code doesn't shuffle the objects. Instead, it "randomly"
moves objects from the beginning to the end.
In hope for a prompt response from the
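The difference is easier to see outside C; in the R sketch below (standing in for the C snippet above), the corrected step draws j from positions i..n, which is the usual Fisher-Yates update, while the pre-fix version keeps drawing from the front of the vector:

set.seed(42)
n <- 10
perm <- 1:n
for (i in 1:(n - 1)) {
  j <- sample(i:n, 1)             # corrected: j uniform over i..n, then swap
  # the pre-fix equivalent would be: j <- sample(1:(n - i), 1)
  perm[c(i, j)] <- perm[c(j, i)]
}
perm  # a uniformly random permutation of 1:10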
2007 Jan 22
0
Recursive-SVM (R-SVM)
I am trying to implement a simple R-SVM example using the iris data (only two of the classes are taken, and the data is within the code). I am running into some errors. I am not an expert on SVMs. If anyone has used it, I would appreciate their help. I am appending the code below.
Thanks../Murli
#######################################################
### R-code for R-SVM
### use leave-one-out
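The appended code did not survive in the snippet; the core of R-SVM (rank features by the linear SVM weight vector, drop the weakest, refit) can be sketched roughly as below on two iris classes, leaving out the leave-one-out error estimation:

library(e1071)
dat <- droplevels(subset(iris, Species != "setosa"))  # two classes only
x <- as.matrix(dat[, 1:4])
y <- dat$Species

keep <- colnames(x)
while (length(keep) > 1) {
  fit <- svm(x[, keep, drop = FALSE], y, kernel = "linear")
  w <- t(fit$coefs) %*% fit$SV      # linear SVM weight vector (on scaled features)
  worst <- keep[which.min(abs(w))]  # weakest feature by |weight|
  cat("dropping", worst, "\n")
  keep <- setdiff(keep, worst)
}
keep  # last surviving feature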
2013 Apr 04
1
Extract the accuracy of 10-CV
Hello guys!
I am working with some classifiers (SVM, C4.5, RNA, etc.) using 10-fold CV.
Once I have each model, I validate these models on
one dataset. Then, with my model and the dataset, I extract a confusion
matrix to assess the model's predictive ability. Finally, I
extract the accuracy of this prediction from the diagonal of the
confusion matrix.
The
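That last step is just the diagonal over the total; as a small sketch, with cm the confusion matrix from table(predicted, actual):

# cm <- table(predicted = pred, actual = truth)
accuracy <- sum(diag(cm)) / sum(cm)
# caret::confusionMatrix(pred, truth) reports the same accuracy plus
# kappa, sensitivity, specificity, etc., if those are also needed.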
2006 Feb 16
1
reg cross validation in svm
Hi
My name is Karthikeyan.
I am using svm in R for my data set.
My data set contains 60 financial ratios as variables, and I want to classify
the observations into groups of good and bad.
I want to know how to do cross-validation for the svm.
First I build the model, then I predict and calculate the
probabilities.
How can I do the cross-validation, and how can I plot the svm for these
variables
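A rough sketch of that workflow; the data frame ratios with a two-level factor grade is a hypothetical stand-in for the 60 financial ratios and the good/bad label:

library(e1071)
# ratios: data frame with 60 ratio columns plus a factor column 'grade'
# (levels "good"/"bad") -- hypothetical names for the poster's data.
fit <- svm(grade ~ ., data = ratios, kernel = "radial",
           probability = TRUE, cross = 10)
fit$tot.accuracy                              # 10-fold CV accuracy (percent)
pred <- predict(fit, ratios, probability = TRUE)
head(attr(pred, "probabilities"))             # class probabilities
# plot.svm draws the decision region over two chosen predictors, e.g.:
# plot(fit, ratios, ratio1 ~ ratio2)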
2010 Mar 23
1
caret package, how can I deal with RFE+SVM wrong message?
Hello,
I am learning the caret package, and I want to use RFE to reduce the
features. I want to use RFE coupled with random forest (RFE+RF) to complete this
task. As we know, there are a number of pre-defined sets of functions, like
random forest (rfFuncs); however, I want to tune the parameter (mtry) during
RFE, so I wrote the code below, but there is an error message. How
can I deal with it?
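For comparison, a bare-bones rfe() call with the pre-defined random-forest functions (rfFuncs needs the randomForest package installed) usually looks like the sketch below; tuning mtry inside the RFE loop requires customizing those functions, which is beyond this sketch. iris stands in for the real data:

library(caret)
set.seed(123)
x <- iris[, 1:4]
y <- iris$Species
ctrl <- rfeControl(functions = rfFuncs, method = "cv", number = 10)
rfe.fit <- rfe(x, y, sizes = c(2, 3), rfeControl = ctrl)
rfe.fit$optVariables  # predictors retained at the best subset size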
2009 Oct 19
1
Best SVM Performance measure?
Hi,
This is probably going to be one of those "it depends what you want"
kinds of answers, but I'm very curious to see if the group has an opinion
or some general suggestions.
The actual experiment is too complicated for a quick e-mail, but I'll
summarize well enough (hopefully) to get the concepts across.
Binary classification problem
Using an SVM (e1071) to train a model
2007 Sep 25
1
10- fold cross validation for naive bayes(e1071)
Hello!
I need code for 10-fold cross-validation for the classifiers Naive Bayes and svm from the e1071 package. Has something like that already been done?
I tried to do it myself by applying the tune function first:
library(e1071)
tune.control <- tune.control(random = F, nrepeat = 1, repeat.aggregate = min, sampling = c("cross"), sampling.aggregate = mean, cross = 10, best.model = T,
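If tune.control() proves awkward, a plain manual 10-fold loop for naiveBayes is short enough to write directly; a sketch on iris:

library(e1071)
set.seed(1)
k <- 10
fold <- sample(rep(1:k, length.out = nrow(iris)))

acc <- numeric(k)
for (i in 1:k) {
  train <- iris[fold != i, ]
  test  <- iris[fold == i, ]
  nb    <- naiveBayes(Species ~ ., data = train)
  acc[i] <- mean(predict(nb, test) == test$Species)
}
mean(acc)  # 10-fold CV accuracy for naive Bayes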
2010 Nov 25
0
[libsvm] predict function error
Dear R users,
There is an error message when I run the following code. It is used to load
microarray data and use the top 1000 genes to train an svm to classify the test
set.
> library(e1071)
Loading required package: class
> f=read.table("F:\\lab\\microarray analysis\\VEH LPS\\exprs.txt",
2009 Oct 14
0
Confusion matrix from cross validation in R:
Hey!
How do I get the confusion matrix after performing 10-fold cross validation
from SVM in R?
When I try to print it, I get the confusion matrix without cross validation.
I need to compute PPV. Should I report PPV without CV and total accuracy
with CV?
I am confused.
> svmtrain <- svm(xtrain,ytrain,kernel="sigmoid",cross=10)
> pred <- predict(svmtrain, xtrain)
>
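Since cross = 10 does not keep the held-out predictions, a cross-validated confusion matrix (and a PPV computed from it) needs a manual fold loop over xtrain/ytrain; a sketch, with a binary stand-in built from iris so it runs:

library(e1071)
# Stand-in for the poster's xtrain / ytrain
dat <- droplevels(subset(iris, Species != "setosa"))
xtrain <- as.matrix(dat[, 1:4])
ytrain <- dat$Species

set.seed(1)
k <- 10
fold <- sample(rep(1:k, length.out = nrow(xtrain)))

cv.pred <- rep(NA_character_, nrow(xtrain))
for (i in 1:k) {
  fit <- svm(xtrain[fold != i, ], ytrain[fold != i], kernel = "sigmoid")
  cv.pred[fold == i] <- as.character(predict(fit, xtrain[fold == i, ]))
}

cm <- table(predicted = factor(cv.pred, levels = levels(ytrain)), actual = ytrain)
cm
# PPV = TP / (TP + FP) for a chosen positive class, here "versicolor":
cm["versicolor", "versicolor"] / sum(cm["versicolor", ])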