Hi, I'm using nnet to work on a 2-class classification problem. The result of my code is a data.frame of true class, predicted class and associated probability.

One way of summarizing the data is with a confusion matrix. However, are there any graphical ways I could represent the data? Specifically, I'd like to show the probabilities associated with each member of my prediction set.

(I would rather not simply list the probabilities in a table.)

Thanks,

-------------------------------------------------------------------
Rajarshi Guha <rxg218 at psu.edu> <http://jijo.cjb.net>
GPG Fingerprint: 0CCA 8EE2 2EEB 25E2 AB04 06F7 1BB9 E634 9B87 56EE
-------------------------------------------------------------------
Entropy isn't what it used to be.
Not exactly sure what you want, but you might want to look at the margin() function and the associated plot() method for margin objects in the randomForest package. (You won't be able to use them directly on an nnet object, but the code should help.)

Cheers,
Andy

> From: Rajarshi Guha
>
> One way of summarizing the data is with a confusion matrix. However,
> are there any graphical ways I could represent the data? Specifically,
> I'd like to show the probabilities associated with each member of my
> prediction set.
>
> (I would rather not simply list the probabilities in a table.)
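For a two-class problem you can also draw a margin-style plot directly from your data.frame of true class, predicted class and probability, without fitting a randomForest. Below is a minimal sketch; the column names (truth, pred, prob) and the assumption that prob is the probability of the predicted class are placeholders, so adjust them to your own data.frame. The small simulated data.frame is only there to make the example self-contained.

## hypothetical data.frame: truth, pred (class labels) and
## prob = probability assigned to the *predicted* class
set.seed(1)
d <- data.frame(truth = sample(c("A", "B"), 50, replace = TRUE))
d$prob <- runif(50, 0.5, 1)
d$pred <- ifelse(runif(50) < 0.8, d$truth,
                 ifelse(d$truth == "A", "B", "A"))

## margin = P(true class) - P(other class) = 2 * P(true class) - 1
p.true <- ifelse(d$pred == d$truth, d$prob, 1 - d$prob)
marg   <- 2 * p.true - 1

## sorted margin plot: bars below zero are the misclassified cases
plot(sort(marg), type = "h", ylim = c(-1, 1),
     xlab = "sorted observations", ylab = "margin")
abline(h = 0, lty = 2)

The height of each bar shows how confidently the network assigned that observation to its true class, which may be the kind of per-observation display you are after.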
If the input variables to your network are continuous, you can visualize the relationship between two input variables and the resulting output (class probability) with image() or persp(). Here is an example (you need the mlbench package from CRAN to run this):

library(mlbench)
library(nnet)

## simulate a 2-class spiral problem and fit a network
x <- as.data.frame(mlbench.spirals(400, cycles = 1.5, sd = 0.1))
plot(x$x.1, x$x.2, col = unclass(x$classes))
nn1 <- nnet(classes ~ x.1 + x.2, data = x, size = 20)

## evaluate the predicted class probability on a regular grid
xval <- seq(-1.5, 1.5, length = 100)
map <- outer(xval, xval,
             FUN = function(x, y) predict(nn1, data.frame(x.1 = x, x.2 = y)))

image(map)
par(usr = c(-1.5, 1.5, -1.5, 1.5))   # match user coordinates to the data range
points(x$x.1, x$x.2, pch = as.numeric(x$classes) + 15,
       col = as.numeric(x$classes) + 4)

### or use:
persp(z = map, expand = .3, shade = .7, col = "orange", phi = 45, theta = 180)

If you have more than two input variables you can keep the other ones at fixed levels and see what happens (a rough sketch of this follows below).

hth,
Martin Keller-Ressel
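To illustrate that last point, here is a rough sketch of the "hold the other inputs fixed" idea. The simulated data set, the model object nn2 and the variable names x.1, x.2, x.3 are only placeholders for your own fit; the point is simply to vary two inputs over a grid while fixing the remaining one at its median.

library(nnet)

## hypothetical 3-input problem; the class depends mainly on x.1 and x.2
set.seed(2)
dat <- data.frame(x.1 = runif(300, -1, 1),
                  x.2 = runif(300, -1, 1),
                  x.3 = runif(300, -1, 1))
dat$classes <- factor(ifelse(dat$x.1^2 + dat$x.2^2 + 0.2 * rnorm(300) < 0.5, 1, 2))
nn2 <- nnet(classes ~ x.1 + x.2 + x.3, data = dat, size = 10)

## vary x.1 and x.2 over a grid, hold x.3 at its median
xval <- seq(-1, 1, length = 50)
map2 <- outer(xval, xval,
              FUN = function(a, b)
                  predict(nn2, data.frame(x.1 = a, x.2 = b, x.3 = median(dat$x.3))))
image(xval, xval, map2, xlab = "x.1", ylab = "x.2")

Repeating this for a few fixed levels of the held-out variable gives a small series of image() plots that shows how the probability surface shifts.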