similar to: ipred and lda

Displaying 20 results from an estimated 3000 matches similar to: "ipred and lda"

2009 Nov 02
1
modifying predict.nnet() to function with errorest()
Greetings, I am having trouble calculating artificial neural network misclassification errors using errorest() from the ipred package. I have had no problems estimating the values with randomForest() or svm(), but can't seem to get it to work with nnet(). I believe this is due to the output of the predict.nnet() function within cv.factor(). Below is a quick example of the problem I'm
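A hedged sketch of one common fix (iris used purely for illustration): wrap predict() so it returns a factor of class labels, which is what errorest()'s cross-validation code expects, and pass nnet's own arguments through errorest()'s "..." .

library(ipred)
library(nnet)
data(iris)

# coerce predict()'s character output to a factor with the response's levels
# so errorest() can tabulate misclassifications
mypredict.nnet <- function(object, newdata)
  factor(predict(object, newdata, type = "class"),
         levels = levels(iris$Species))

set.seed(1)
errorest(Species ~ ., data = iris, model = nnet,
         predict = mypredict.nnet,
         size = 4, trace = FALSE, maxit = 200)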
2005 Jun 23
1
errorest
Hi, I am using the errorest function from the ipred package. I am hoping to perform "bootstrap 0.632+" and "bootstrap leave one out". According to the manual page for errorest, I use the following command: ce632[i]<-errorest(ytrain ~., data=mydata, model=lda, estimator=c("boot","632plus"), predict=mypredict.lda)$error It didn't work. I then tried the
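A sketch of what replies on such threads usually boil down to (shown on iris only for illustration): errorest() takes a single estimator per call, so "632plus" and "boot" are requested in separate runs.

library(ipred)
library(MASS)
data(iris)

mypredict.lda <- function(object, newdata)
  predict(object, newdata = newdata)$class

e632plus <- errorest(Species ~ ., data = iris, model = lda,
                     estimator = "632plus", predict = mypredict.lda)$error
eboot    <- errorest(Species ~ ., data = iris, model = lda,
                     estimator = "boot", predict = mypredict.lda)$error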
2009 Apr 25
1
Overlapping parameters "k" in different functions in "ipred"
Dear List, I have a question regarding the "ipred" package. Under 10-fold CV, for different knn (k = 1, 3, ..., 25), I am getting the same misclassification errors: ############################################# library(ipred) data(iris) cv.k = 10 ## 10-fold cross-validation bwpredict.knn <- function(object, newdata) predict.ipredknn(object, newdata, type="class") for (i in
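One hedged workaround for the name clash (the neighbour count and control.errorest()'s fold count are both called k) is to fix the neighbour count inside a small wrapper model, so only the fold k reaches errorest(); sketched on iris:

library(ipred)
data(iris)

mypredict.knn <- function(object, newdata)
  predict.ipredknn(object, newdata, type = "class")

for (i in seq(1, 25, by = 2)) {
  # the wrapper fixes the number of neighbours, avoiding any clash with the
  # k used by control.errorest() for the CV folds
  model.i <- function(formula, data) ipredknn(formula, data = data, k = i)
  err <- errorest(Species ~ ., data = iris, model = model.i,
                  predict = mypredict.knn,
                  est.para = control.errorest(k = 10))$error
  cat("neighbours =", i, " 10-fold CV error =", err, "\n")
}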
2005 Jan 06
1
different result from the same errorest() in library( ipred)
Dear all, Can anybody explain this: I get different results even though exactly the same parameters are used in errorest() from library ipred, as follows? errorest(Species ~ ., data=iris, model=randomForest, estimator = "cv", est.para=control.errorest(k=3), mtry=2)$err [1] 0.03333333 > errorest(Species ~ ., data=iris, model=randomForest, estimator = "cv",
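A short sketch of the usual explanation: both the random 3-fold partitioning and randomForest itself draw random numbers, so identical calls differ unless the seed is fixed first.

library(ipred)
library(randomForest)
data(iris)

set.seed(290875)
errorest(Species ~ ., data = iris, model = randomForest,
         estimator = "cv", est.para = control.errorest(k = 3),
         mtry = 2)$error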
2003 Jun 24
1
errorest: Error in cv.numeric()
Hi, I am trying to get an error estimate for a classification done using lda. The examples work fine; however, I can't get my own code to work. The data is in object d > d class hydrophobicity charge geometry 1 2 6490.0400 1434.9700 610.99902 2 2 1602.0601 400.6030 -5824.00000 3 2 969.0060 260.1360 -415.00000 4 1
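A hedged sketch of the usual cause: errorest() dispatches on the class of the response, so a numeric class column is routed to cv.numeric() (regression) instead of cv.factor() (classification). Coercing the response to a factor avoids it. Not runnable on its own; d stands for the data frame shown in the excerpt.

library(ipred)
library(MASS)

# d as in the excerpt, with columns class, hydrophobicity, charge, geometry
d$class <- factor(d$class)

mypredict.lda <- function(object, newdata)
  predict(object, newdata = newdata)$class

errorest(class ~ ., data = d, model = lda,
         estimator = "cv", predict = mypredict.lda)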
2005 Jun 24
1
mypredict.
Hi, I am wondering what "mypredict.lda<-function(object, newdata) predict(object, newdata=newdata)$class" actually does. I run a few errorest commands in the same function on the same dataset using the same classifier, lda. The only difference is that some use "cv", others use "boot" and "632plus". They all share one mypredict.lda. Will it cause any
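For reference, a short annotated version of what that wrapper does (nothing more than extracting the predicted labels):

library(MASS)

# predict() on an lda fit returns a list with $class, $posterior and $x;
# errorest() only needs the predicted labels, so the wrapper strips the rest.
mypredict.lda <- function(object, newdata)
  predict(object, newdata = newdata)$class

# The wrapper is stateless, so sharing one mypredict.lda across "cv", "boot"
# and "632plus" calls to errorest() causes no interference between them.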
2009 Nov 17
1
Error running lda example: Session Info
> > library(MASS) > Iris <- data.frame(rbind(iris3[,,1], iris3[,,2], iris3[,,3]), + Sp = rep(c("s","c","v"), rep(50,3))) > train <- sample(1:150, 75) > table(Iris$Sp[train]) c s v 22 23 30 > z <- lda(Sp ~ ., Iris, prior = c(1,1,1)/3, subset = train) Error in if (targetlist[i] == stringname) { : argument is of length
2002 Mar 17
3
apply problem
> data(iris) # iris3 is first 3 rows of iris > iris3 <- iris[1:3,] # z compares row 1 to each row of iris3 and is correctly computed > z <- c(F,F,F) > for(i in seq(z)) z[i] <- identical(iris3[1,],iris3[i,]) > z [1] TRUE FALSE FALSE # this should do the same but is incorrect > apply(iris3,1,function(x)identical(x,iris3[1,])) 1 2 3 FALSE FALSE FALSE
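A sketch of the usual explanation: apply() coerces its input to a matrix first, so each row reaches the function as an atomic vector, which can never be identical() to a one-row data frame. Comparing over row indices sidesteps the coercion (a different object name is used here to avoid clashing with the built-in iris3 array):

data(iris)
first3 <- iris[1:3, ]

# apply() would coerce first3 to a matrix; indexing rows directly does not
sapply(seq_len(nrow(first3)),
       function(i) identical(first3[1, ], first3[i, ]))
# [1]  TRUE FALSE FALSE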
2000 Mar 08
3
Reading data for discriminant analysis
Dear R users, I want to do discriminant analysis on my data. I have successfully followed the discriminant analysis in V & R on the iris data: > ir <- rbind (iris3[,,1],iris3[,,2],iris3[,,3]) > ir.species <- c(rep("s",50),rep("c",50),rep("v",50)) > a <- lda(log(ir),ir.species) > a$svd^2/sum(a$svd^2) [1] 0.996498601 0.003501399 > a.x <-
2003 Feb 27
2
PRESS again
Sorry for the repeat. The PRESS statistic is defined as sum_i (y_i - yhat_(i))^2, where yhat_(i) denotes the ith predicted value computed using all the data except the ith case (as typically used in linear models). Thanks again, Jacob (Jacob L van Wyk, Department of Mathematics and Statistics, Rand Afrikaans University)
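For a linear model the leave-one-out residuals need no refitting: e_(i) = e_i / (1 - h_ii), so PRESS can be computed directly from an lm fit. A minimal sketch on the built-in cars data:

fit <- lm(dist ~ speed, data = cars)

# PRESS = sum over i of (y_i - yhat_(i))^2 = sum((e_i / (1 - h_ii))^2)
press <- sum((residuals(fit) / (1 - hatvalues(fit)))^2)
press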
2004 Nov 02
2
lda
Hi! I am trying to analyze some of my data using linear discriminant analysis. I worked out the following example code from Venables and Ripley, but R does not seem to be happy with it. ============================ library(MASS) library(stats) data(iris3) ir<-rbind(iris3[,,1],iris3[,,2],iris3[,,3]) ir.species<-factor(c(rep("s",50),rep("c",50),rep("v",50)))
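A complete version of the V&R iris example that runs under a current MASS, for comparison (library(stats) is not needed; it is attached by default):

library(MASS)
ir <- rbind(iris3[, , 1], iris3[, , 2], iris3[, , 3])
ir.species <- factor(c(rep("s", 50), rep("c", 50), rep("v", 50)))

fit <- lda(ir, ir.species)            # default method: matrix plus grouping factor
table(ir.species, predict(fit, ir)$class)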
2005 Jan 21
2
cross validation
How do I select a training data set and a test data set from the original data for performing cross-validation?
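A minimal sketch of the two common patterns, using only base R's sample() (iris stands in for the original data):

set.seed(1)
n <- nrow(iris)

# single hold-out split: roughly 70% training, 30% test
train <- sample(n, size = round(0.7 * n))
test  <- setdiff(seq_len(n), train)

# k-fold cross-validation: assign each row to one of k folds at random
k <- 10
fold <- sample(rep(seq_len(k), length.out = n))
# rows with fold == j form the test set in the j-th round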
2009 Apr 11
2
leave-one-out in R
Hi everyone, I am new to R and I was wondering if anybody knows how to do leave-one-out cross-validation in R. Thanks, Charles
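A minimal sketch of a manual leave-one-out loop (lda on iris chosen only as a stand-in; any fit/predict pair slots in the same way):

library(MASS)
data(iris)

n <- nrow(iris)
pred <- factor(rep(NA, n), levels = levels(iris$Species))
for (i in seq_len(n)) {
  fit <- lda(Species ~ ., data = iris[-i, ])             # fit without case i
  pred[i] <- predict(fit, iris[i, , drop = FALSE])$class # predict case i
}
mean(pred != iris$Species)   # leave-one-out misclassification error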
2005 Jul 27
1
how to get actual value from predict in nnet?
Dear All, After following the nnet help, I was able to train the network and, excitedly, get predictions for other samples. It is a two-class data set; I used "N" and "P" to label the two classes. My question is: how do I get the predicted numerical value for each sample, rather than just the label (either "N" or "P")? Thanks! FYI: The nnet example I
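A sketch of the usual answer: predict() on an nnet fit returns the numeric network output with type = "raw" (the default) and only collapses to hard labels with type = "class". The two-class toy data labelled "N"/"P" below is made up to mirror the question.

library(nnet)
set.seed(1)

# hypothetical two-class toy data, labels "N" and "P" as in the question
d <- data.frame(x1 = rnorm(100), x2 = rnorm(100))
d$y <- factor(ifelse(d$x1 + d$x2 > 0, "P", "N"))

fit <- nnet(y ~ x1 + x2, data = d, size = 2, trace = FALSE)

head(predict(fit, d, type = "raw"))    # numeric output in [0, 1] per sample
head(predict(fit, d, type = "class"))  # hard labels "N" / "P"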
2009 Dec 31
1
cross validation for species distribution
Dear all, I want to do cross-validation for the species data of species distribution models. Please suggest any package containing cross-validation suited to this purpose. Thank you. Elaine
2006 Oct 08
0
Problem in getting 632plus error using randomForest by ipred!
Hello! I'm Taeho, a graduate student in South Korea. In order to get the .632+ bootstrap error using random forest, I have tried to use the 'ipred' package; more specifically, the function 'errorest'. Following the guidelines, I put together a simple call like the one below: error<-errorest(class ~ ., data=data, model=randomForest, estimator = "632plus")$err
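A hedged sketch of the same call on a stock data set (iris); the response must be a factor for a classification error, and set.seed() just makes the run reproducible.

library(ipred)
library(randomForest)
data(iris)

set.seed(7)
errorest(Species ~ ., data = iris, model = randomForest,
         estimator = "632plus")$error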
2009 Jan 22
4
dimnames in pkg "ipred"
Hello List, I'm trying to make predictions using a bagged tree with the package ipred. I tried to follow the manual, but I'm getting an error message. Browsing through the list archive, I didn't find any hint either. Maybe someone can help me? selbag <- bagging(SOIL_UNIT ~., data=traindat.bin, coob=TRUE) Error in dimnames(X) <- list(dn[[1L]], unlist(collabs, use.names = FALSE)) :
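For comparison, a bagging() call of the same shape that runs on a stock data set; differences between traindat.bin and this setup (factor response, complete cases, plain column names) are a reasonable place to start looking. SOIL_UNIT and traindat.bin are the poster's objects and are not reproduced here.

library(ipred)
data(iris)

selbag <- bagging(Species ~ ., data = iris, coob = TRUE)
print(selbag)                          # includes the out-of-bag error estimate
predict(selbag, newdata = iris[1:5, ])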
2003 Apr 16
2
Jackknife and rpart
Hi, First, thanks to those who helped me see my gross misunderstanding of randomForest. I worked through a bagging tutorial and now understand the "many tree" approach. However, it is not what I want to do! My bagged errors are acceptable, but I need to use the actual tree and need a single-tree application. I am using rpart for a classification tree but am interested in a more unbiased
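A sketch of one way to jackknife a single rpart tree without writing the loop by hand: leave-one-out is just k-fold CV with k = n, which errorest() can drive (the kyphosis data from rpart used only as a stand-in):

library(ipred)
library(rpart)
data(kyphosis, package = "rpart")

mypredict.rpart <- function(object, newdata)
  predict(object, newdata = newdata, type = "class")

errorest(Kyphosis ~ ., data = kyphosis, model = rpart,
         predict = mypredict.rpart,
         est.para = control.errorest(k = nrow(kyphosis)))$error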
2007 Apr 20
3
Hi
Please add me to the mailing list. Regards, Astha
2002 Apr 10
1
New Package: ipred - Improved predictors
The package ipred has been uploaded to CRAN. The main focus of the package is the calculation of improved predictors in classification tasks. Misclassification error can be reduced by bootstrap-aggregated classification trees and/or the framework of indirect classification. Furthermore, a unified interface for the estimation of misclassification error completes the features of ipred. We try to make
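A short, hedged taste of the two headline features on iris (bagged classification trees and the unified errorest() interface); indirect classification needs problem-specific intermediate variables and is not sketched here.

library(ipred)
data(iris)

# bootstrap-aggregated classification trees with an out-of-bag error estimate
bag <- bagging(Species ~ ., data = iris, coob = TRUE)
print(bag)

# the unified interface for estimating misclassification error
errorest(Species ~ ., data = iris, model = bagging, estimator = "cv")$error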