similar to: ksvm question -- help! line search failed...

Displaying 20 results from an estimated 200 matches similar to: "ksvm question -- help! line search failed..."

2009 Jul 07
1
ksvm question -- help! cannot get program to run...
What's wrong? Very sad about this... model <- ksvm(x=mytraindata[, -1], y=factor(mytraindata[, 1]), prob.model=T) Error in .local(x, ...) : x and y don't match.
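
A minimal sketch of one common fix, assuming a hypothetical data frame mytraindata whose first column is the class label: ksvm's matrix interface expects a numeric x and a y of matching length, so the "x and y don't match" error usually means the two disagree in number of rows, or x is not a plain numeric matrix.

library(kernlab)
## hypothetical data frame `mytraindata`; column 1 is the class label
x <- as.matrix(mytraindata[, -1])   # predictors as a numeric matrix
y <- factor(mytraindata[, 1])       # response as a factor
stopifnot(nrow(x) == length(y))     # ksvm requires matching lengths
model <- ksvm(x = x, y = y, prob.model = TRUE)
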
2009 Jul 10
1
help! Error in using Boosting...
Here is my code: mygbm<-gbm.fit(y=mytraindata[, 1], x=mytraindata[, -1], interaction.depth=4, shrinkage=0.001, n.trees=20000, bag.fraction=1, distribution="bernoulli") Here is the error: Error in gbm.fit(y = mytraindata[, 1], x = mytraindata[, -1], interaction.depth = 4, : The dataset size is too small or subsampling rate is too large: cRows*train.fraction*bag.fraction <=
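
A minimal sketch of the check behind this error, assuming a hypothetical data frame mytraindata whose first column holds a 0/1 response: gbm.fit() refuses to run when nrow(x) * train.fraction * bag.fraction leaves too few rows to grow a single tree, so verify the effective sample size (and that the response really is numeric 0/1 for "bernoulli") before the call.

library(gbm)
n_eff <- nrow(mytraindata) * 1 * 1   # train.fraction = 1, bag.fraction = 1 here
n_eff                                # must comfortably exceed n.minobsinnode (default 10)
mygbm <- gbm.fit(x = mytraindata[, -1],
                 y = mytraindata[, 1],          # numeric 0/1 for "bernoulli"
                 distribution = "bernoulli",
                 interaction.depth = 4,
                 shrinkage = 0.001,
                 n.trees = 20000,
                 bag.fraction = 1)
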
2009 Jul 14
2
SOS! error in GLM logistic regression...
Hi all, Could anybody tell me what happened to my logistic regression in R? mylog=glm(mytraindata$V1 ~ ., data=mytraindata, family=binomial("logit")) It generated the following error message: Error in model.frame.default(Terms, newdata, na.action = na.action, xlev = object$xlevels) : factor 'state1' has new level(s) AP Thank you!
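
A minimal sketch of the usual cause, assuming hypothetical data frames mytraindata and mytestdata that both contain a factor column state1: the "new level(s)" error comes from model.frame at prediction time, when the new data contains a factor level (here "AP") that was never seen during the fit. One common workaround is to force the test factor onto the training levels (unseen levels become NA). Note also that the response is better written as V1 ~ . rather than mytraindata$V1 ~ ., so that the dot does not pick up V1 again as a predictor.

mylog <- glm(V1 ~ ., data = mytraindata, family = binomial("logit"))
mytestdata$state1 <- factor(mytestdata$state1,
                            levels = levels(mytraindata$state1))  # unseen levels -> NA
pred <- predict(mylog, newdata = mytestdata, type = "response")
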
2009 Jul 07
2
Question in using e1071 svm routine
Hi all, I've got the following error message when using the e1071 svm routine... Could anybody please help me? Thank you! --------------------------------- model <- svm(y=factor(mytraindata[, 1]), x=mytraindata[, -1], probability=T) Error in if (any(co)) { : missing value where TRUE/FALSE needed In addition: Warning message: In FUN(newX[, i], ...) : NAs introduced by coercion
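
A minimal sketch, assuming a hypothetical data frame mytraindata whose first column is the class label (here assumed to be called V1): the "NAs introduced by coercion" warning usually means some predictor columns are not numeric, so coercing the data frame to a matrix turns everything into character strings. Either inspect and convert those columns, or let svm's formula interface handle the factors.

library(e1071)
str(mytraindata[, -1])                 # look for character/factor predictor columns
mytraindata[, 1] <- factor(mytraindata[, 1])
model <- svm(V1 ~ ., data = mytraindata, probability = TRUE)
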
2009 Oct 23
1
Data format for KSVM
Hi, I have a process using svm from the e1071 library. It works. I want to try using the KSVM library instead. The same data used with e1071 gives me an error with KSVM. My data is a data.frame. Sample code: svm_formula <- formula(y ~ a + B + C) svm_model <- ksvm(formula, data=train_data, type="C-svc", kernel="rbfdot", C=1) I get the following error:
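
A minimal sketch, assuming a hypothetical data frame train_data with columns y, a, B and C: the snippet above builds svm_formula but then passes the base function formula to ksvm(), so make sure the formula object itself is what gets passed (ksvm does accept data frames through its formula interface).

library(kernlab)
svm_formula <- y ~ a + B + C
svm_model <- ksvm(svm_formula, data = train_data,
                  type = "C-svc", kernel = "rbfdot", C = 1)
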
2009 Nov 29
2
kernlab's ksvm method freeze
Hello, I am using kernlab to do some binary classification on amino acid strings. I am using a custom kernel, so I use the kernel="matrix" option of the ksvm method. My (normalized) kernel matrix is of size 1309*1309, and my results vector has the same length. I am using C-svc. My kernlab call is something similar to this: ksvm(kernel="matrix", kernelMatrix, trainingDataYs,
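
A minimal sketch of the precomputed-kernel interface, using hypothetical objects K (the 1309 x 1309 normalized kernel matrix) and y (the matching label vector): ksvm dispatches on a kernelMatrix object, so wrap the plain matrix with as.kernelMatrix() before the call. A fit on a matrix of this size can also simply take a long time, which is easy to mistake for a freeze.

library(kernlab)
K   <- as.kernelMatrix(K)                # mark the precomputed matrix as a kernelMatrix
fit <- ksvm(K, y, type = "C-svc", C = 1)
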
2011 Aug 26
1
kernlab: ksvm() bug?
Hello all, I'm trying to run a grid parameter search for an SVM. Therefore I'm using the ksvm function from the kernlab package. ---- svp <- ksvm(Ktrain,ytrain,type="nu-svc",nu=C) ---- The problem is that the optimization algorithm does not return for certain parameters. I tried to use setTimeLimit() but that doesn't seem to help. I suspect that ksvm() calls C code that
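
A minimal sketch of one workaround (Unix-only, and assuming hypothetical objects Ktrain, ytrain and a candidate value C): setTimeLimit() cannot interrupt compiled code, so each fit can instead be forked into a child process that is killed if it does not finish within a time budget.

library(parallel)
library(kernlab)
job <- mcparallel(ksvm(Ktrain, ytrain, type = "nu-svc", nu = C))
res <- mccollect(job, wait = FALSE, timeout = 60)   # wait at most 60 seconds
if (is.null(res)) {
  tools::pskill(job$pid)   # give up on this parameter setting
  mccollect(job)           # reap the killed child process
}
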
2012 Aug 19
1
kernlab | ksvm error
Dear list, I am using the ksvm function from kernlab as follows: (1) learning > svm.pol4 <- ksvm(class.labs ~ ., data = train.data, prob.model = T, scale = T, kernel = "polydot") (2) prediction > svm.pol.prd4 <- predict(svm.pol4, train.data, type = "probabilities")[,2] But unfortunately, when calling the prediction, once in every 10s of times (using the exact
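
A minimal sketch, assuming a hypothetical data frame train.data with a factor column class.labs: with prob.model = TRUE, kernlab fits the probability model on an internal random split, so fixing the seed at least makes the intermittent prediction failure reproducible and easier to narrow down (note that the scaling argument of ksvm is scaled, not scale).

library(kernlab)
set.seed(1)
svm.pol4 <- ksvm(class.labs ~ ., data = train.data,
                 prob.model = TRUE, scaled = TRUE, kernel = "polydot")
svm.pol.prd4 <- predict(svm.pol4, train.data, type = "probabilities")[, 2]
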
2010 Jun 11
1
Decision values from KSVM
Hi, I'm working on a project using the kernlab library. For one phase, I want the "decision values" from the SVM prediction, not the class label. The e1071 library has this function, but I can't find the equivalent in ksvm. In general, when an SVM is used for classification, the label of an unknown test-case is decided by the "sign" of its resulting value as
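
A minimal sketch with the built-in spam data: kernlab does expose raw decision values; predict() on a ksvm model accepts type = "decision", which returns the signed value relative to the separating hyperplane instead of the class label.

library(kernlab)
data(spam)
fit <- ksvm(type ~ ., data = spam[1:300, ], kernel = "rbfdot")
dec <- predict(fit, spam[301:320, ], type = "decision")   # decision values
lab <- predict(fit, spam[301:320, ], type = "response")   # class labels
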
2010 Sep 30
1
Can this code be written more efficiently?
Dear users, I'm working on binary classification problem using Support Vector Machines (SVM). My objective is to train a series of SVM models on a grid of hyperparameters and then select those that maximize the AUC based on an independent validation sample. My attempted code is shown below. It runs well on "small" data sets but when I use it on a slightly larger sample (e.g., my
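
A minimal sketch of one way to structure the search, assuming hypothetical data frames train and valid whose first column is a 0/1 response: fit one ksvm per (C, sigma) pair, score the validation sample, and keep the AUC, computed here with the rank-based (Mann-Whitney) formula to avoid an extra dependency. Each ksvm fit dominates the cost, so on larger samples the grid itself is usually what needs pruning rather than the surrounding R code.

library(kernlab)
auc <- function(score, y) {
  r  <- rank(score)
  n1 <- sum(y == 1); n0 <- sum(y == 0)
  (sum(r[y == 1]) - n1 * (n1 + 1) / 2) / (n1 * n0)
}
grid <- expand.grid(C = c(0.1, 1, 10), sigma = c(0.01, 0.1, 1))
grid$auc <- apply(grid, 1, function(p) {
  fit  <- ksvm(as.matrix(train[, -1]), factor(train[, 1]), type = "C-svc",
               kernel = "rbfdot", kpar = list(sigma = unname(p["sigma"])),
               C = unname(p["C"]), prob.model = TRUE)
  prob <- predict(fit, as.matrix(valid[, -1]), type = "probabilities")[, "1"]
  auc(prob, valid[, 1])
})
grid[which.max(grid$auc), ]   # best (C, sigma) on the validation sample
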
2009 Sep 06
2
Regarding SVM using R
Hi Abbas, Before I try to give you answers, I just want to mention that you should send R-related requests to the R-help list, and not to me personally, because (i) there's a greater likelihood that it will get answered in a timely manner, and (ii) people who might have a similar problem down the road might benefit from any answer via searching the list archives ... anyway: On Sep 5, 2009, at
2008 Sep 14
0
ksvm accessing the slots of S4 object
I am using kernlab to build svm models. I am not sure how to access the different slots of the object. For instance, I want to get the number of support vectors for each model I am building and store it in a vector. >ksvm.model <- ksvm(Class ~ ., data = somedata, kernel = "vanilladot", cross = 10, type ="C-svc") >names(attributes(ksvm.model)) [1] "param"
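
A minimal sketch with the built-in spam data: slots of the S4 ksvm object are read with @ or, more conveniently, with the accessor functions kernlab provides, so the number of support vectors of each model can be collected with nSV().

library(kernlab)
data(spam)
ksvm.model <- ksvm(type ~ ., data = spam[1:300, ],
                   kernel = "vanilladot", type = "C-svc", cross = 10)
slotNames(ksvm.model)   # list all slots of the S4 object
nSV(ksvm.model)         # number of support vectors (accessor)
ksvm.model@nSV          # same value via direct slot access
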
2007 Oct 30
0
kernlab/ ksvm: class.weights & prob.model in binary classification
Hello list, I am faced with a two-class classification problem with highly asymmetric class sizes (class one: 99%, class two: 1%). I'd like to obtain a class probability model, also introducing available information on the class prior. Calling kernlab/ksvm with the line > ksvm_model1<-ksvm(as.matrix(slides), as.factor(Class), class.weights= c("0" =99, "1" =1),
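
A minimal sketch, assuming a hypothetical matrix slides and label vector Class with levels "0" (99%) and "1" (1%): the names of class.weights must match the factor levels exactly, and the weights effectively act as per-class multipliers of the cost, so the rare class is the one that normally receives the large weight; prob.model = TRUE then adds a Platt-scaling step on top of the weighted fit.

library(kernlab)
w <- c("0" = 1, "1" = 99)   # up-weight the rare class; names must match levels(Class)
ksvm_model1 <- ksvm(as.matrix(slides), as.factor(Class),
                    type = "C-svc", class.weights = w, prob.model = TRUE)
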
2007 Aug 14
0
kernlab ksvm() cross-validation prediction response vector
Hello, I would like to know whether, for the support vector classification function ksvm(), the response values stored in object@ymatrix are cross-validated outputs/predictions. Example code from package kernlab, function ksvm: library(kernlab) ## train a support vector machine filter <- ksvm(type~.,data=spam,kernel="rbfdot",kpar=list(sigma=0.05),C=5,cross=3) filter filter at
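
A minimal sketch of the relevant accessors (an illustration of how to inspect the object, not a definitive answer on how the slots are populated): ymatrix() returns the stored response, while the cross-validation result is exposed separately through cross(), and error() gives the training error.

library(kernlab)
data(spam)
filter <- ksvm(type ~ ., data = spam, kernel = "rbfdot",
               kpar = list(sigma = 0.05), C = 5, cross = 3)
head(ymatrix(filter))   # stored response values
cross(filter)           # 3-fold cross-validation error
error(filter)           # training error
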
2011 May 28
0
how to train ksvm with spectral kernel (kernlab) in caret?
Hello all, I would like to use the train function from the caret package to train an SVM with a spectral kernel from the kernlab package. Sadly, an SVM with a spectral kernel is not among the many methods in caret... using caret to train svmRadial: ------------------ library(caret) library(kernlab) data(iris) TrainData<- iris[,1:4] TrainClasses<- iris[,5] set.seed(2)
2007 Aug 08
0
ksvm-kernel
Hi, I am new to R. I have a problem with the predict function of kernlab. I want to use ksvm and predict with a kernelMatrix (S4 method for signature 'kernelMatrix'). # executing the following statements library(kernlab) # identity kernel k <- function(x,y) { n<-length(x) cont<-0 for(i in 1:n){ if(x[i]==y[i]){ cont<-cont+1 } } cont } class(k) <-
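
A minimal sketch of the kernelMatrix predict workflow, with hypothetical numeric matrices xtrain, xtest and a label vector ytrain: train on the full train-vs-train kernel matrix, then predict with the test-vs-support-vector block only, i.e. the columns restricted to SVindex() of the fitted model (this follows the pattern in the ksvm help page).

library(kernlab)
rbf    <- rbfdot(sigma = 0.1)
Ktrain <- kernelMatrix(rbf, xtrain)                   # n_train x n_train
fit    <- ksvm(Ktrain, ytrain, type = "C-svc")
Ktest  <- kernelMatrix(rbf, xtest,
                       xtrain[SVindex(fit), , drop = FALSE])   # n_test x n_SV
pred   <- predict(fit, Ktest)
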
2009 Oct 06
0
Kernlab: multidimensional targets in rvm(), ksvm(), gausspr()
Hi there, I'm trying to do a regression experiment on a multidimensional dataset where both x and y in the model are multidimensional vectors. I'm using R version 2.9.2, updated packages, on a Linux box. I've tried gausspr(), ksvm() and rvm(), and the models are computed fine, but I'm always getting the same error message when I try to use predict(): "Error in
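
A minimal sketch of one workaround, assuming hypothetical numeric matrices X (inputs) and Y (multi-column targets): kernlab's regression models expect a single response vector, so a simple way forward is to fit one model per output dimension and column-bind the predictions.

library(kernlab)
fits  <- lapply(seq_len(ncol(Y)), function(j) gausspr(X, Y[, j], kernel = "rbfdot"))
preds <- sapply(fits, function(f) predict(f, X))   # one prediction column per target
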
2012 Aug 27
0
kernlab`s custom kernel of ksvm freeze
Hello together, I'm trying to use a user-defined kernel. I know that kernlab offers user-defined kernels (custom kernel functions) in R. I used the spam data included in the kernlab package (number of variables = 58, number of examples = 4061). My user-defined kernel has the form kp=function(d,e){ as=v*d bs=v*e cs=as-bs cs=as.matrix(cs) exp(-(norm(cs,"F")^2)/2) }
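
A minimal sketch of how a kernel of this form can be registered (the weight vector v and the subset size are illustrative assumptions): the function must take two numeric vectors, return a scalar, and carry the class "kernel" so that ksvm() and kernelMatrix() dispatch on it. User-defined kernels are evaluated in R for every pair of points, so a fit on the full spam data is extremely slow and can look like a freeze.

library(kernlab)
data(spam)
v  <- rep(1, 57)                     # illustrative weights, one per spam predictor
kp <- function(d, e) {
  cs <- v * d - v * e
  exp(-sum(cs^2) / 2)                # same value as exp(-norm(as.matrix(cs), "F")^2 / 2)
}
class(kp) <- "kernel"
idx <- sample(nrow(spam), 200)       # small subset; custom kernels are evaluated in R
fit <- ksvm(type ~ ., data = spam[idx, ], kernel = kp, type = "C-svc")
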
2012 May 05
2
Pasting with Quotes
Hello useRs! So, I have a random question. I'm trying to build a character string, then evaluate it. I think an example would be the easiest way to explain: kern.vec = c("rbfdot","polydot") for( j in 1:length( kern.vec ) ) { formula = paste("ksvm( ind ~ . , data=d.temp[,c(ind_col,dep_cols)], kernel =",kern.vec[j],", prob.model=T
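
A minimal sketch of a paste/eval-free alternative, assuming a hypothetical data frame d.temp with a factor column ind plus the index vectors ind_col and dep_cols from the snippet: ksvm accepts the kernel name as a character string, so the loop can pass kern.vec[j] directly instead of assembling the call as text and evaluating it.

library(kernlab)
kern.vec <- c("rbfdot", "polydot")
fits <- lapply(kern.vec, function(k)
  ksvm(ind ~ ., data = d.temp[, c(ind_col, dep_cols)],
       kernel = k, prob.model = TRUE))
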
2009 Apr 28
1
kernlab - custom kernel
Hi, I am using R's "kernlab" package; specifically, I am doing classification using ksvm() and predict.ksvm(). I want to use a custom kernel, and I am getting an error. # The following R code works (with the promotergene dataset): library("kernlab") s <- function(x, y) { sum((x*y)^1.25) } class(s) <- "kernel" data("promotergene") gene <- ksvm(Class ~ .,