Displaying 20 results from an estimated 30000 matches similar to: "do.call and double-colon access"
2011 Sep 08
1
error in knn: too many ties in knn
Hello.
I found the behavior of the knn() function
(http://stat.ethz.ch/R-manual/R-devel/library/class/html/knn.html)
very strange. Consider this toy example.
> library(class)
> train <- matrix(nrow=5000,ncol=2,data=rnorm(10000,0,1))
> test <- matrix(nrow=10,ncol=2,data=rnorm(20,0,1))
> cl <- rep(c(0,1),2500)
> knn(train,test,cl,1)
[1] 1 1 0 0 1 0 1 1 0 1
Levels: 0 1
It
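For reference, a guess at how that error arises: "too many ties in knn" is raised when more nearest neighbours are tied at the same distance than knn() can handle internally, e.g. with thousands of identical training rows. A minimal sketch with hypothetical data, not the poster's:

library(class)
# Many identical training rows means the nearest distance is tied across
# thousands of neighbours, which can trigger "too many ties in knn".
train <- matrix(0, nrow = 2000, ncol = 2)
cl    <- factor(rep(c(0, 1), 1000))
test  <- matrix(0, nrow = 1, ncol = 2)
knn(train, test, cl, k = 1)    # expected to stop with the ties error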
2004 Feb 23
2
outputs of KNN prediction
Hello there:
I have 13 variables in my training/target set; the first 12 are a
mixture of numerical and categorical variables. The last one, which I
need to predict, is numerical.
>train<-read.table("train.txt")
>test<-read.table("test.txt")
>cl<-factor(train[,13])
>pred<-knn(train, test, cl, k=3, prob=TRUE)
>pred
I got
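Two things worth checking in the call above: knn() expects cl and k as separate arguments, and it only accepts numeric predictors, so factor columns have to be expanded first. A minimal sketch using model.matrix() (column positions as described in the post):

library(class)
train <- read.table("train.txt")
test  <- read.table("test.txt")
cl    <- factor(train[, 13])
# knn() works on numeric matrices only, so expand factor columns into
# dummy variables (the same encoding must be applied to both sets,
# which assumes the factor levels match between train and test).
xtr <- model.matrix(~ . - 1, data = train[, 1:12])
xte <- model.matrix(~ . - 1, data = test[, 1:12])
pred <- knn(xtr, xte, cl, k = 3, prob = TRUE)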
2005 Mar 15
1
KNN one factor predicting problem
Could anybody help me out please?
> cl<-as.factor(traindata[,13])
> knn(traindata[1:295,2], newdata[1:32,2], cl,k=2,
prob=TRUE)
Error in knn(traindata[1:295, 2], newdata[1:32, 2],
cl, k = m, prob = TRUE) :
Dims of test and train differ
Both traindata and newdata have 13 columns. Only one
of the first 12 columns is needed to predict the 13th
column.
What's the problem of
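The "Dims of test and train differ" error usually comes from the fact that newdata[1:32, 2] drops to a plain vector, which knn() then reads as a single row with 32 columns; keeping the single column as a one-column matrix with drop = FALSE avoids it. A minimal sketch:

library(class)
cl   <- as.factor(traindata[1:295, 13])
# drop = FALSE keeps the single predictor as a one-column matrix,
# so train and test both have one column.
pred <- knn(traindata[1:295, 2, drop = FALSE],
            newdata[1:32, 2, drop = FALSE],
            cl, k = 2, prob = TRUE)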
2004 May 05
1
Segfault from knn.cv in class package (PR#6856)
The function knn.cv in the class package doesn't have error checking to
ensure that the length of the classlabel argument is equal to the number
of rows in the test set. If the classlabel is short, the result is often
a segfault.
> library(class)
> dat <- matrix(rnorm(1000), nrow=10)
> cl <- c(rep(1,5), rep(2,5))
> cl2 <- c(rep(1,5), rep(2,4))
> knn.cv(dat, cl)
[1] 2
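Until such a check exists in the package, a one-line guard in user code turns the crash into an ordinary R error; a minimal sketch:

library(class)
dat <- matrix(rnorm(1000), nrow = 10)
cl2 <- c(rep(1, 5), rep(2, 4))          # one label short
# Fail with a clear R error instead of letting the C code read past
# the end of the classlabel vector.
stopifnot(length(cl2) == nrow(dat))
knn.cv(dat, cl2)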
2006 Jun 07
1
knn - 10 fold cross validation
Hi,
I was trying to get the optimal 'k' for knn. To do this I was using the following function:
knn.cvk <- function(datmat, cl, k = 2:9) {
  datmatT <- (datmat)
  cv.err <- cl.pred <- c()
  for (i in k) {
    newpre <- as.vector(knn.cv(datmatT, cl, k = i))
    cl.pred <- cbind(cl.pred, newpre)
    cv.err <- c(cv.err, sum(cl != newpre))
  }
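The excerpt stops before the function returns anything; a minimal complete sketch of the same idea, returning the leave-one-out error rate for each k (the return value is my assumption about what was intended):

library(class)

knn.cvk <- function(datmat, cl, k = 2:9) {
  cv.err <- sapply(k, function(i) {
    # knn.cv() does leave-one-out prediction on the training data
    pred <- knn.cv(datmat, cl, k = i)
    mean(pred != cl)                    # misclassification rate for this k
  })
  names(cv.err) <- k
  list(k.best = k[which.min(cv.err)], cv.err = cv.err)
}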
2004 Mar 29
1
Interpreting knn Results
Maybe you should show your colleague how to access help pages in R? Right
in ?knn, it says:
prob: If this is true, the proportion of the votes for the winning
class are returned as attribute 'prob'.
so 1.0 means all three NNs are of the `winning' (i.e., predicted) class, and
0.66667 means 2 out of the 3 NNs are of the winning class, etc.
Andy
> From: Ko-Kang
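In code, that vote proportion is read with attr(); a minimal sketch on the built-in iris data:

library(class)
set.seed(1)
tr   <- sample(nrow(iris), 100)
pred <- knn(iris[tr, 1:4], iris[-tr, 1:4], iris$Species[tr],
            k = 3, prob = TRUE)
# proportion of the k = 3 neighbours that voted for the winning class:
# 1 when all three agree, about 0.667 when the vote was 2 out of 3
head(attr(pred, "prob"))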
2005 Jul 06
1
Error message NA/NaN/Inf in foreign function call (arg 6) when using knn()
I am trying to use knn to do a nearest neighbor classification. I tried it on my dataset and got an error message, so I used a simple example to try to understand what I was doing wrong, and got the same message. Here is what I typed into R:
try
  [,1] [,2] [,3] [,4]
r "A"  "A"  "T"  "G"
r "A"  "A"  "T"  "G"
f
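knn() hands its arguments to numeric C code, so a character matrix like the one above produces the NA/NaN/Inf error; the letters have to be recoded as numbers (or dummy variables) first. A minimal sketch with hypothetical sequence data:

library(class)
# Hypothetical character matrices of nucleotides, as in the post
train_chr <- matrix(c("A", "A", "T", "G",
                      "A", "C", "T", "G"), nrow = 2, byrow = TRUE)
test_chr  <- matrix(c("A", "A", "T", "C"), nrow = 1)
cl        <- factor(c("x", "y"))

# Recode each letter as an integer so knn() receives numeric input
# (one-hot coding via model.matrix() is another common choice).
code  <- c(A = 1, C = 2, G = 3, T = 4)
train <- matrix(code[train_chr], nrow = nrow(train_chr))
test  <- matrix(code[test_chr],  nrow = nrow(test_chr))
knn(train, test, cl, k = 1)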
2005 Jul 05
1
Getting runtime error in stepclass
Hi!
I got the following runtime error when I tried to use the svm method with
stepclass.
Error in "colnames<-"(`*tmp*`, value = c("0", "1")) :
attempt to set colnames on object with less than two dimensions
I repeated the same sequence of statements, but this time I used the
classification function used in the example, i.e., "lda", and it worked
fine
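For comparison, a minimal sketch of the lda route the poster says does work, assuming klaR's stepclass() formula interface and the iris data:

library(klaR)
library(MASS)
# Stepwise variable selection with lda as the classifier, scored by
# cross-validated correctness rate.
sc <- stepclass(Species ~ ., data = iris, method = "lda",
                improvement = 0.01)
sc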
2005 Oct 06
1
how to use tune.knn() for dataset with missing values
Hi everybody,
I again have the problem of using tune.knn(): it is giving an error saying
missing values are not allowed. Again, here is the script for the
BreastCancer data:
library(e1071)
library(mda)
trdata<-data.frame(train,row.names=NULL)
attach(trdata)
xtr <- subset(trdata, select = -Class)
ytr <- Class
bestpara <-tune.knn(xtr,ytr, k = 1:25, tunecontrol = tune.control(sampling
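knn (and therefore tune.knn) does not accept missing values, so they have to be removed or imputed first; a minimal sketch using mlbench's BreastCancer data and na.omit() (imputation would be the alternative that keeps the incomplete rows):

library(e1071)
library(mlbench)
data(BreastCancer)

# BreastCancer has NAs in Bare.nuclei; drop the incomplete rows, and
# convert the factor-coded predictors (columns 2:10) to numeric.
bc  <- na.omit(BreastCancer)
xtr <- data.frame(lapply(bc[, 2:10], function(z) as.numeric(as.character(z))))
ytr <- bc$Class

best <- tune.knn(xtr, ytr, k = 1:25,
                 tunecontrol = tune.control(sampling = "cross", cross = 10))
summary(best)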
2011 Aug 31
8
!!!function to do the knn!!!
Hi R users,
I have a problem with KNN.
I have 2 datasets, X0 and X1.
> dim(X0)
[1] 1471   13
> dim(X1)
[1] 5221   13
For every instance in the dataset X1, I want to find the nearest
neighbour (1-NN) in the dataset X0.
I don't have the true classifications of dataset X1,
but the function knn() needs true classifications (cl) to do prediction.
I am just curious if there is some other function
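If the goal is just to find which row of X0 is closest to each row of X1, one trick is to use the row indices of X0 as the class labels; knn1() then returns the nearest row's index, and no real classifications are needed. A minimal sketch with small stand-in matrices:

library(class)
set.seed(1)
X0 <- matrix(rnorm(20 * 13), ncol = 13)   # stand-ins for the real data
X1 <- matrix(rnorm(30 * 13), ncol = 13)

# Use the row index of X0 as the "class"; the predicted class for each
# row of X1 is then the index of its nearest neighbour in X0.
nn.index <- as.integer(as.character(knn1(X0, X1, factor(seq_len(nrow(X0))))))
head(nn.index)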
2013 Feb 07
1
Saving model and other objects from caret
Say I train a model in caret, e.g.:
RFmodel <- train(X,Y,method='rf',trControl=myCtrl,tuneLength=1)
How can I save this to disk and load it later in R?
How about an object of the class "resamples"?
resamps <- resamples(
  list(RF  = RFmodel,
       SVM = SVMmodel,
       KNN = KNNmodel,
       NN  = NNmodel))
Thanks,
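A fitted caret model and a resamples object are ordinary R objects, so saveRDS()/readRDS() (or save()/load()) handle both; a minimal sketch:

# Write the fitted model and the resamples object to disk ...
saveRDS(RFmodel, file = "RFmodel.rds")
saveRDS(resamps, file = "resamps.rds")

# ... and restore them in a later session
RFmodel <- readRDS("RFmodel.rds")
resamps <- readRDS("resamps.rds")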
2004 Jan 09
3
ipred and lda
Dear all,
Can anybody help me with the program below? The function predict.lda
seems to be defined but cannot be used by errorest.
The R version is 1.7.1.
Thanks in advance,
Stefan
----------------
library("MASS");
library("ipred");
data(iris3);
tr <- sample(1:50, 25);
train <- rbind(iris3[tr,,1], iris3[tr,,2], iris3[tr,,3]);
test <- rbind(iris3[-tr,,1],
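The usual pattern with ipred's errorest() is a small wrapper so that predict.lda returns only the class labels; a minimal sketch on the full iris data frame (details here are an assumption, not the poster's exact setup):

library(MASS)
library(ipred)

# predict.lda returns a list; errorest() needs just the predicted classes,
# so wrap it before passing it as the predict argument.
mypredict.lda <- function(object, newdata)
  predict(object, newdata = newdata)$class

errorest(Species ~ ., data = iris, model = lda,
         estimator = "cv", predict = mypredict.lda)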
2010 Jun 30
1
how to tabulate the prediction value using the table function for naive Bayes in R
Hi,
I have written code in R for classifying microarray data using naive
Bayes; the code is given below:
library(e1071)
train<-read.table("Z:/Documents/train.txt",header=T);
test<-read.table("Z:/Documents/test.txt",header=T);
cl <- c(c(rep("ALL",10), rep("AML",10)));
cl <- factor(cl)
model <- naiveBayes(train,cl);
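Once the model is fitted, predict() gives the class for each test sample and table() tabulates it; a minimal sketch continuing from the code above (the test-set labels used for the confusion matrix are my assumption):

pred <- predict(model, test)          # predicted class for each test sample
table(pred)                           # counts per predicted class

# if the true labels of the test set are known, a confusion matrix:
cl.test <- factor(c(rep("ALL", 10), rep("AML", 10)))
table(predicted = pred, truth = cl.test)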
2008 Sep 19
3
How to do knn regression?
Hello,
I want to do regression or missing-value imputation by knn. I searched the
r-help mailing list; this question was asked in 2005, and ksmooth and loess
were recommended. But my case is different: I have many predictors (p > 20)
and I really want to try knn with a given k. ksmooth and loess use a bandwidth to define
the neighborhood size. This contrasts with knn's variable bandwidth via fixing
a
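For a fixed k with many predictors, the averaging step can also be written directly in base R (packages such as kknn, or FNN's knn.reg, provide the same thing ready-made); a minimal sketch:

# k-nearest-neighbour regression: predict each test row as the mean of the
# outcomes of its k nearest training rows (Euclidean distance).
knn.reg.simple <- function(xtrain, ytrain, xtest, k = 5) {
  xtrain <- as.matrix(xtrain)
  xtest  <- as.matrix(xtest)
  apply(xtest, 1, function(z) {
    d <- sqrt(colSums((t(xtrain) - z)^2))   # distances to all training rows
    mean(ytrain[order(d)[seq_len(k)]])      # average outcome of the k closest
  })
}

# tiny usage example with made-up data
set.seed(1)
x <- matrix(rnorm(200 * 25), ncol = 25)
y <- rowSums(x[, 1:3]) + rnorm(200)
knn.reg.simple(x[1:150, ], y[1:150], x[151:200, ], k = 7)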
2010 Jun 30
1
help on naivebayes function in R
Hi,
I have written code in R for classifying microarray data using naive
Bayes; the code is given below:
library(e1071)
train<-read.table("Z:/Documents/train.txt",header=T);
test<-read.table("Z:/Documents/test.txt",header=T);
cl <- c(c(rep("ALL",10), rep("AML",10)));
cl <- factor(cl)
model <- NaiveBayes(train,cl);
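One detail worth noting: with library(e1071) the function is naiveBayes() (lower-case n); NaiveBayes() with a capital N lives in the klaR package and has a different interface, so the last line above will fail unless klaR is attached. A minimal sketch of the e1071 form:

library(e1071)
cl    <- factor(c(rep("ALL", 10), rep("AML", 10)))
model <- naiveBayes(train, cl)        # e1071 form: naiveBayes(x, y)
pred  <- predict(model, test)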
2010 Jun 29
2
Need help for SVM code for microarray classification
Hi, I am Aadhithya. I am trying to write code to classify microarray data
(AML and ALL) using SVM in R.
My code goes like this:
library(e1071)
train<-read.table("Z:/Documents/train.txt",header=T);
test<-read.table("Z:/Documents/test.txt",header=T);
cl <- c(c(rep("ALL",10), rep("AML",10)));
model<- svm(train,cl);
pred <-
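One likely problem in the excerpt: cl is a character vector, so svm() will not treat the task as classification; a minimal sketch with cl converted to a factor and predictions on the test set (file paths as in the post):

library(e1071)
train <- read.table("Z:/Documents/train.txt", header = TRUE)
test  <- read.table("Z:/Documents/test.txt",  header = TRUE)

# a factor response makes svm() do C-classification rather than regression
cl    <- factor(c(rep("ALL", 10), rep("AML", 10)))
model <- svm(train, cl)
pred  <- predict(model, test)
table(pred)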
2010 Aug 30
2
Regarding the naive Bayesian classifier in R
Hi,
I have a small doubt regarding naive Bayes. I am able to classify the
data properly, but I am stuck on how to get the probability
values for naive Bayes. In the case of SVM we have the "attr" function, which
helps in displaying the probability values. Is there any function similar to
"attr" in naive Bayes that can be used for displaying the probability values?
My
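For e1071's naiveBayes there is no "prob" attribute; the posterior probabilities come from predict() with type = "raw". A minimal sketch (the object names follow the earlier microarray posts, so they are assumptions here):

library(e1071)
model <- naiveBayes(train, cl)

# type = "raw" returns the posterior probability of each class for every
# test sample, instead of just the predicted label
post <- predict(model, test, type = "raw")
head(post)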
2005 Mar 21
1
How to do knn regression
How can I do a simple k-nearest-neighbor regression in R? My training
data have 1 predictor and 1 outcome, both numeric. I also need to
use FPE and SC to find the optimal model. I know there is knn() in the
class package, but it is for knn classification. I also found the kknn
package. What function should I use?
Thanks in advance!
Menghui
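Since the kknn package is mentioned: kknn() does k-nearest-neighbour regression directly when the response is numeric; a minimal sketch with made-up one-predictor data (variable names are hypothetical):

library(kknn)

set.seed(1)
train <- data.frame(x = runif(100))
train$y <- sin(2 * pi * train$x) + rnorm(100, sd = 0.2)
test  <- data.frame(x = seq(0, 1, length.out = 25))

# kknn() fits k-NN regression for a numeric response;
# kernel = "rectangular" gives plain (unweighted) averaging of the k neighbours.
fit <- kknn(y ~ x, train, test, k = 5, kernel = "rectangular")
head(fitted(fit))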
2012 Nov 23
1
caret train and trainControl
I am used to packages like e1071, where you have a tune step and then pass your tunings to train.
It seems that with caret, tuning and training are both handled by train.
I am using train and trainControl to find my hyperparameters like so:
MyTrainControl <- trainControl(
  method       = "cv",
  number       = 5,
  returnResamp = "all",
  classProbs   = TRUE
)
rbfSVM <- train(label~., data =
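With caret, the candidate hyperparameter values are handed to train() itself via tuneLength or tuneGrid, while trainControl() only describes the resampling; a minimal sketch (the iris data and the "svmRadial" method are stand-ins, since the original call is cut off):

library(caret)

MyTrainControl <- trainControl(
  method       = "cv",
  number       = 5,
  returnResamp = "all",
  classProbs   = TRUE
)

# tuneLength (or an explicit tuneGrid) tells train() which hyperparameter
# values to try; cross-validation then picks the best combination.
rbfSVM <- train(Species ~ ., data = iris,
                method     = "svmRadial",
                trControl  = MyTrainControl,
                tuneLength = 5)
rbfSVM$bestTune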
2005 Apr 21
1
lda (MASS)
Hi!
This is a question about lda (MASS) in R on a particular dataset.
I'm not a specialist in any of this, but:
First, with the well-known "iris" dataset, I tried using lda to discriminate
versicolor from the other two classes, and I got approx. 70% accuracy
testing on the training set. In iris, versicolor stands "between" the other 2, so
one can expect lda not to perform well
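For reference, a minimal sketch of that kind of two-class fit (versicolor vs. the rest, with resubstitution accuracy on the training data):

library(MASS)

ir <- iris
ir$grp <- factor(ifelse(ir$Species == "versicolor", "versicolor", "other"))

# two-class lda on the four measurements, then accuracy on the training set
fit  <- lda(grp ~ Sepal.Length + Sepal.Width + Petal.Length + Petal.Width,
            data = ir)
pred <- predict(fit)$class
mean(pred == ir$grp)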