similar to: SVM: stratified cross-validation

Displaying 20 results from an estimated 90000 matches similar to: "SVM: stratified cross-validation"

2010 Nov 23
5
cross validation using e1071:SVM
Hi everyone, I am trying to do cross-validation (10-fold CV) using the e1071::svm method. I know that there is an option ("cross") for cross-validation, but I still wanted to make a function to generate cross-validation indices using the pls::cvsegments method. The code (at the end) is working fine, but sometimes caret::confusionMatrix
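A minimal sketch of such a manual loop, with fold indices generated by pls::cvsegments; the data frame mydata and its factor response Class are hypothetical names, not from the post.
library(e1071)
library(pls)
set.seed(1)
folds <- cvsegments(nrow(mydata), k = 10)   # list of 10 held-out index sets
acc <- numeric(length(folds))
for (i in seq_along(folds)) {
  test <- folds[[i]]
  fit  <- svm(Class ~ ., data = mydata[-test, ],
              type = "C-classification", kernel = "linear")
  acc[i] <- mean(predict(fit, mydata[test, ]) == mydata$Class[test])
}
mean(acc)   # average held-out accuracy over the 10 folds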
2012 Dec 02
1
e1071 SVM: Cross-validation error confusion matrix
Hi, I ran two svm models with R's e1071 package: the first without cross-validation and the second with 10-fold cross-validation. I used the following syntax: #Model 1: Without cross-validation: > svm.model <- svm(Response ~ ., data=data.df, type="C-classification", kernel="linear", cost=1) > predict <- fitted(svm.model) > cm <- table(predict,
2012 Mar 02
1
e1071 SVM: Cross-validation error confusion matrix
Hi, I ran two svm models with R's e1071 package: the first without cross-validation and the second with 10-fold cross-validation. I used the following syntax: #Model 1: Without cross-validation: > svm.model <- svm(Response ~ ., data=data.df, type="C-classification", kernel="linear", cost=1) > predict <- fitted(svm.model) > cm <- table(predict,
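Because svm(..., cross = 10) only reports per-fold accuracies, a cross-validated confusion matrix has to be assembled from explicit folds. A hedged sketch, reusing the data.df / Response names from the posts above:
library(e1071)
set.seed(1)
k <- 10
fold <- sample(rep(1:k, length.out = nrow(data.df)))   # random fold labels
cv.pred <- character(nrow(data.df))
for (i in 1:k) {
  fit <- svm(Response ~ ., data = data.df[fold != i, ],
             type = "C-classification", kernel = "linear", cost = 1)
  cv.pred[fold == i] <- as.character(predict(fit, data.df[fold == i, ]))
}
table(predicted = cv.pred, actual = data.df$Response)  # cross-validated confusion matrix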
2009 Jul 08
1
SVM cross validation in e1071
Hi list, could someone help me explain why the leave-one-out cross-validation results I got from svm using the internal option "cross" are different from those I got manually? It seems that when using "cross" to do cross-validation, the results are always better. Please see the code below. I also include lda as a comparison. I'm using WinXP, R-2.9.0, and e1071_1.5-19. Many
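A sketch of the comparison being described, assuming a data frame dat with factor response y (hypothetical names); the built-in option with cross = n corresponds to leave-one-out.
library(e1071)
n <- nrow(dat)
hit <- logical(n)
for (i in 1:n) {
  fit    <- svm(y ~ ., data = dat[-i, ], kernel = "linear")
  hit[i] <- predict(fit, dat[i, , drop = FALSE]) == dat$y[i]
}
mean(hit)                                                                # manual LOO accuracy
svm(y ~ ., data = dat, kernel = "linear", cross = n)$tot.accuracy / 100  # built-in LOO accuracy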
2013 Jan 15
0
e1071 SVM, cross-validation and overfitting
I am accustomed to the LIBSVM package, which provides cross-validation on training with the -v option: % svm-train -v 5 ... This does 5-fold cross-validation while building the model and avoids over-fitting. But I don't see how to accomplish that in the e1071 package. (I learned that svm(... cross=5 ...) only _tests_ using cross-validation -- it doesn't affect the training.) Can
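A sketch of how cross-validation can be made to drive model selection in e1071: tune.svm() evaluates a parameter grid by k-fold CV and refits the winner on the full data. The names train.df and y are hypothetical.
library(e1071)
tuned <- tune.svm(y ~ ., data = train.df,
                  cost = 2^(-2:6), gamma = 2^(-6:0),
                  tunecontrol = tune.control(sampling = "cross", cross = 5))
summary(tuned)            # 5-fold CV error for each (cost, gamma) pair
best <- tuned$best.model  # svm refit on all of train.df with the winning parameters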
2005 Jan 20
2
Cross-validation accuracy in SVM
Hi all - I am trying to tune an SVM model by optimizing the cross-validation accuracy. Maximizing this value doesn't necessarily seem to minimize the number of misclassifications. Can anyone tell me how the cross-validation accuracy is defined? In the output below, for example, cross-validation accuracy is 92.2%, while the number of correctly classified samples is (1476+170)/(1476+170+4) =
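The two numbers come from different predictions: the reported cross-validation accuracy is computed on held-out folds, while the confusion-matrix count above uses the model's fitted values on the training data, so they need not agree. A short sketch of the distinction (d and Class are hypothetical names):
library(e1071)
m <- svm(Class ~ ., data = d, type = "C-classification", cross = 10)
m$tot.accuracy                    # cross-validated accuracy on held-out folds, in percent
100 * mean(fitted(m) == d$Class)  # resubstitution accuracy on the training data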
2010 Sep 11
0
[Q] How to extract cross validation results from e1071's svm model
Dear all, is it possible to extract cross-validation results from e1071's svm model? For example, the following R code shows the result from the 10-fold cross-validation: model = svm(spam ~ ., data = spam, cross = 10) summary(model) But I could not figure out how to get the accuracy values from the cross-validation. I looked at the svm method, but did not find any return values. Any
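The per-fold and total cross-validation accuracies are stored on the fitted object (see the Value section of ?svm). A short sketch reusing the spam data frame from the post:
library(e1071)
model <- svm(spam ~ ., data = spam, cross = 10)
model$accuracies     # accuracy of each of the 10 folds, in percent
model$tot.accuracy   # total cross-validated accuracy, in percent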
2007 Oct 27
1
problems in cross validation of SVM in package "e1071"
Hi: I am new to using R for data mining, and find the "e1071" package an excellent tool for doing data mining work! What frustrated me recently is that when I use the function "svm" with the "cross=10" parameter, I get all the "accuracies" of the model greater than 1. Shouldn't the accuracy be smaller than 1? So I wonder how, the
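This is expected: e1071 reports cross-validation accuracies as percentages (0-100), not proportions. A tiny runnable illustration with the built-in iris data:
library(e1071)
m <- svm(Species ~ ., data = iris, cross = 10)
m$accuracies         # per-fold values on a 0-100 scale
m$accuracies / 100   # the same values as proportions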
2006 Feb 16
1
reg cross validation in svm
Hi, my name is Karthikeyan. I am using svm in R for my data set. My data set contains 60 financial ratios as variables, and I want to classify the observations into groups of good and bad. I want to know how to do the cross-validation for the svm. First I do the modelling, then I predict and calculate the probabilities. How can I do the cross-validation, and how can I plot the svm for these variables
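A hedged sketch of one way to combine cross-validation, class probabilities, and plotting, assuming a data frame ratios with 60 numeric columns and a factor grade with levels good/bad; all names, including the two plotted columns, are hypothetical.
library(e1071)
fit <- svm(grade ~ ., data = ratios, type = "C-classification",
           probability = TRUE, cross = 10)
fit$tot.accuracy                      # 10-fold cross-validated accuracy (%)
pr <- predict(fit, ratios, probability = TRUE)
head(attr(pr, "probabilities"))       # per-class probabilities
plot(fit, ratios, ratio1 ~ ratio2)    # plot.svm draws two variables at a time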
2010 Jun 15
1
cross validation of SVM
Hi, could you please tell me what kind of cross-validation the SVM of e1071 uses? Cheers, Amy
2013 Apr 05
0
Cross Validation with SVM
Good morning. I am using package e1071 to develop an SVM model. My code is: x <- subset(dataset, select = -Score) y <- dataset$Score model <- svm(x, y, cross=10) print(model) summary(model) As 10-CV produces 10 models, I need two things: 1) To have access to each model from the 10-CV. 2) To predict new instances with each model, to know which one gives the best performance.
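svm(..., cross = 10) does not return the ten fold models, so they have to be fitted explicitly if each one is needed. A sketch reusing dataset/Score from the post; newdata is a hypothetical frame of new instances.
library(e1071)
set.seed(1)
k <- 10
fold <- sample(rep(1:k, length.out = nrow(dataset)))
models <- vector("list", k)
for (i in 1:k) {
  models[[i]] <- svm(Score ~ ., data = dataset[fold != i, ])
}
preds <- lapply(models, predict, newdata = newdata)  # one set of predictions per fold model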
2004 Jun 08
0
bootstrap: stratified resampling
Dear All, I was writing a small wrapper to bootstrap a classification algorithm, but if we generate the indices in the "usual way" as: bootindex <- sample(index, N, replace = TRUE) there is a non-zero probability that all the samples belong to only one class, thus leading to problems in the fitting (or that some classes will end up with only one sample, which will be a problem
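A sketch of stratified bootstrap indices that resample within each class, so every class keeps its original size (illustrated with the built-in iris labels):
strat.boot.index <- function(y) {
  idx <- split(seq_along(y), y)   # indices grouped by class
  unlist(lapply(idx, function(i) sample(i, length(i), replace = TRUE)),
         use.names = FALSE)
}
set.seed(1)
bootindex <- strat.boot.index(iris$Species)
table(iris$Species[bootindex])   # 50 / 50 / 50: class proportions preserved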
2007 Jul 23
4
nnet 10-fold cross-validation
Hi, it is clear that to do a classification with svm under 10-fold cross-validation one uses svm(Xm, newlabs, type = "C-classification", kernel = "linear", cross = 10). What is the equivalent for nnet? nnet(....., cross=10)? Regards
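nnet() itself has no cross= argument; one option is e1071's tune.nnet() wrapper, which runs k-fold cross-validation over a grid of network sizes. A sketch reusing Xm and newlabs from the post:
library(e1071)
library(nnet)
tuned <- tune.nnet(Xm, newlabs, size = c(2, 4, 8), trace = FALSE,
                   tunecontrol = tune.control(sampling = "cross", cross = 10))
summary(tuned)      # 10-fold CV error for each network size
tuned$best.model    # nnet refit on all of the data with the best size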
2009 Oct 14
0
Confusion matrix from cross validation in R:
Hey! How do I get the confusion matrix after performing 10-fold cross validation from SVM in R? When I try to print it, I get the confusion matrix without cross validation. I need to compute PPV. Should I report PPV without CV and total accuracy with CV? I am confused. > svmtrain <- svm(xtrain,ytrain,kernel="sigmoid",cross=10) > pred <- predict(svmtrain, xtrain) >
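Note that predict(svmtrain, xtrain) is scored on the same data the model was trained on, so the table built from it is not cross-validated; a cross-validated matrix needs held-out predictions collected fold by fold, as sketched earlier in this listing. Given any confusion matrix laid out as table(predicted, actual), PPV can then be read off directly:
cm  <- table(predicted = pred, actual = ytrain)
ppv <- diag(cm) / rowSums(cm)   # PPV per class: TP / (TP + FP)
ppv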
2007 Sep 25
1
10- fold cross validation for naive bayes(e1071)
Hello! I need code for 10-fold cross-validation for the classifiers Naive Bayes and svm from the (e1071) package. Has something like that already been done? I tried to do it myself by applying the tune function first: library(e1071) tune.control <- tune.control(random=F, nrepeat=1, repeat.aggregate=min, sampling=c("cross"), sampling.aggregate=mean, cross=10, best.model=T,
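A sketch of 10-fold cross-validation for both classifiers through e1071's generic tune(), with the built-in iris data standing in for the real set:
library(e1071)
ctrl <- tune.control(sampling = "cross", cross = 10)
nb <- tune(naiveBayes, Species ~ ., data = iris,
           ranges = list(laplace = 0), tunecontrol = ctrl)
sv <- tune(svm, Species ~ ., data = iris,
           ranges = list(cost = 2^(0:3)), tunecontrol = ctrl)
nb$best.performance   # cross-validated error rate, naive Bayes
sv$best.performance   # cross-validated error rate, best-cost svm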
2008 Feb 27
7
Cross Validation
Hello, How can I do a cross validation in R? Thank You!
2009 Mar 28
1
stratified variables in a cox regression
Hello, I am hoping for assistance with regard to examining the contribution of stratified variables in a Cox regression. A previous post by Terry Therneau noted that "That is the point of a strata; you are declaring a variable to NOT be proportional hazards, and thus there is no single "hazard ratio" that describes it". Given this purpose of stratification, in the
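A runnable sketch of the construct under discussion, using the survival package's built-in lung data: a stratified Cox model fits a separate baseline hazard per stratum, so no coefficient (and no hazard ratio) is reported for the stratifying variable.
library(survival)
fit <- coxph(Surv(time, status) ~ age + strata(sex), data = lung)
summary(fit)        # a hazard ratio for age, none for sex
head(basehaz(fit))  # separate baseline (cumulative) hazards, one per sex stratum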
2008 Nov 14
0
Cross-validation
Hi, I was trying to do cross-validation using the crossval function (bootstrap package), with the following code: theta.fit <- function(x,y){ model <- svm(x,y,kernel = "linear") } theta.predict <- function(fit,x){ prediction <- predict(fit,x)
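A sketch of how these two helpers plug into bootstrap::crossval, shown for a numeric response so the cross-validated predictions can be scored directly (x a predictor matrix, y the response, both hypothetical):
library(bootstrap)
library(e1071)
theta.fit     <- function(x, y) svm(x, y, kernel = "linear")
theta.predict <- function(fit, x) predict(fit, x)
cv <- crossval(x, y, theta.fit, theta.predict, ngroup = 10)
mean((cv$cv.fit - y)^2)   # 10-fold cross-validated mean squared error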
2005 Jan 21
2
cross validation
How do I select the training data set and test data set from the original data for performing cross-validation?
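A minimal sketch of a random 70/30 split, with mydata as a hypothetical data frame:
set.seed(1)
train.idx <- sample(nrow(mydata), size = round(0.7 * nrow(mydata)))
train <- mydata[train.idx, ]
test  <- mydata[-train.idx, ]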
2009 Jul 12
1
Splitting dataset for Tuning Parameter with Cross Validation
Hi, my question might be a little general. I have a number of values to select for the complexity parameters in some classifier, e.g. the C and gamma in an SVM with an RBF kernel. The selection is based on which values give the smallest cross-validation error. I wonder if the randomized splitting of the available dataset into folds is done only once for all those choices of the parameter values, or
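A sketch of the first of the two options being asked about: fix the fold assignment once and reuse it for every (cost, gamma) candidate, so all parameter values are compared on identical splits (dat with factor response y is hypothetical).
library(e1071)
set.seed(1)
k <- 10
fold <- sample(rep(1:k, length.out = nrow(dat)))     # one fixed fold assignment
grid <- expand.grid(cost = 2^(-1:3), gamma = 2^(-3:0))
grid$cv.err <- apply(grid, 1, function(p) {
  mean(sapply(1:k, function(i) {
    fit <- svm(y ~ ., data = dat[fold != i, ], kernel = "radial",
               cost = p["cost"], gamma = p["gamma"])
    mean(predict(fit, dat[fold == i, ]) != dat$y[fold == i])
  }))
})
grid[which.min(grid$cv.err), ]   # (cost, gamma) with the smallest CV error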