Dear all,

I have set up a model for SVM analysis using the laplacedot kernel with the kernlab package. I checked the classification error with a 10-fold cross-validation approach, that is, I analyzed 1/10 of the data (held out as a test set) ten times and averaged the error, (FalseNeg + FalsePos) / Total. I tested different levels of the cost C and the results are:

       C      error
    0.01   0.106566
    0.10   0.070798
    0.50   0.000985
    1.00   0.000556
    2.50   0.000198
    5.00   0.000079
    7.50   0.000040
    8.00   0.000040
    8.50   0.000016
    9.00   0.000008
    9.50   0.000000
   10.00   0.000000
   10.50   0.000000
  100.00   0.000000

Given that the purpose of the optimization is to minimize the error, C >= 9 would therefore be what I am looking for. But if the model is too stringent, I will have problems with future data sets. So what level should I set? My feeling is that C = 1 is enough.

Is there a method within kernlab to optimize C (and the kernel width sigma, perhaps) automatically?

--
Best regards,
Luigi
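
P.S. For completeness, here is a minimal sketch of the kind of loop I am running to get the 10-fold error for each C. The data frame "dat", the response column "y", and the fixed sigma taken from sigest() are placeholders/assumptions, not my exact setup:

    library(kernlab)

    ## Heuristic estimate of the kernel width (the same idea ksvm uses
    ## when kpar = "automatic"); element [2] is the median of the three
    ## values returned by sigest(). This is an assumption on my side.
    sig <- sigest(y ~ ., data = dat)[2]

    Cs <- c(0.01, 0.1, 0.5, 1, 2.5, 5, 7.5, 8, 8.5, 9, 9.5, 10, 10.5, 100)

    cv_err <- sapply(Cs, function(C) {
      fit <- ksvm(y ~ ., data = dat,
                  kernel = "laplacedot",
                  kpar   = list(sigma = sig),
                  C      = C,
                  cross  = 10)   # built-in 10-fold cross-validation
      cross(fit)                 # averaged cross-validation error
    })

    data.frame(C = Cs, error = cv_err)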