Jokel Meyer
2011-Sep-24 11:12 UTC
[R] Assessing prediction performance of SVM using e1071 package
Dear R-Users!

I am using the svm function (e1071 package) to classify two groups using a set of 180 indicator variables. Now I am confused about the cross-validation procedure.

(A) On the one hand, I use the setting cross=10 in the svm function to run 10-fold cross-validation and obtain an estimate of the SVM's predictive performance.

(B) On the other hand, most tutorials I found recommend splitting the data into two sets, using one set to train the SVM and the other to test the predictive power of the trained SVM.

My understanding is that (A) and (B) are alternative ways to estimate the performance of the SVM. Or would I have to implement both?

Many thanks for your help!
Jokel
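For reference, a minimal sketch of the two approaches, assuming a data frame dat with a factor response group and the 180 indicator columns (dat and group are placeholder names, not from the original post):

  ## load the package providing svm()
  library(e1071)

  set.seed(1)

  ## (A) built-in k-fold cross-validation: cross = 10 performs 10-fold CV
  ## and stores per-fold and overall accuracy on the fitted object
  fit_cv <- svm(group ~ ., data = dat, kernel = "radial", cross = 10)
  fit_cv$accuracies    # accuracy in each of the 10 folds
  fit_cv$tot.accuracy  # overall cross-validated accuracy (%)

  ## (B) hold-out split: train on one part, test on the other (70/30 here)
  idx   <- sample(nrow(dat), size = round(0.7 * nrow(dat)))
  train <- dat[idx, ]
  test  <- dat[-idx, ]

  fit  <- svm(group ~ ., data = train, kernel = "radial")
  pred <- predict(fit, newdata = test)

  table(predicted = pred, actual = test$group)  # confusion matrix
  mean(pred == test$group)                      # hold-out accuracy

Both give an estimate of out-of-sample performance; (A) averages over 10 folds of the same data, while (B) uses a single held-out test set.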