similar to: r-help

Displaying 20 results from an estimated 10000 matches similar to: "r-help"

2004 Nov 15
0
how to obtain predicted labels for test data using "kerne lpls"
You need to do some extra work if you want to do classification with a regression method. One simple way to do classification with PLS is to code the classes as 0s and 1s (assuming there are only two classes) or -1s and 1s, fit the model, then threshold the prediction; e.g., those with predicted values < 0.5 (in the 0/1 coding) get labeled as 0s. There's a predict() method for mvr
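A minimal sketch of that recipe with the current pls package (the original thread used pls.pcr; the data and names below are made up for illustration):

  library(pls)
  set.seed(1)
  ## toy two-class problem, classes coded 0/1
  X <- matrix(rnorm(100 * 10), 100, 10)
  y <- as.numeric(X[, 1] + rnorm(100) > 0)
  dat <- data.frame(y = y, X)
  ## fit PLS regression to the 0/1 response
  fit <- plsr(y ~ ., ncomp = 3, data = dat)
  ## threshold predictions at 0.5 to recover class labels
  pred <- drop(predict(fit, newdata = dat, ncomp = 3))
  labels <- ifelse(pred < 0.5, 0, 1)
  table(labels, y)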
2004 Nov 15
0
how to obtain predicted labels for test data using "kernelpls"
Dear members, My name is Seungho Huh. I am a statistician trying to use the Kernel PLS method in a classification problem. I am writing to ask about the "kernelpls" function in R (pls.pcr package). I would like to obtain the predicted Y values for test data using the Kernel PLS method. Let's take the example from the R help: > data(NIR) >
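With the current pls package (pls.pcr has since been superseded), getting predictions for held-out data is a single predict() call; a sketch using the gasoline data that ships with pls rather than the old NIR example:

  library(pls)
  data(gasoline)
  gasTrain <- gasoline[1:50, ]
  gasTest  <- gasoline[51:60, ]
  ## kernel PLS fit on the training set
  fit <- plsr(octane ~ NIR, ncomp = 5, data = gasTrain, method = "kernelpls")
  ## predicted Y values for the test data
  yhat <- predict(fit, newdata = gasTest, ncomp = 5)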
2010 Apr 20
1
Help with Partial dependence bar graph
Hello, I need to draw a partial dependence bar graph. My predictor vectors are continuous and so is the response variable. I am using the partialPlot function of the randomForest package. I get a line graph. How can I edit it to get a bar graph instead? (partialPlot(randomForest object, data-matrix, number of predictor vectors, "Temp")) -- Daudi Jjingo
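partialPlot() returns its x/y grid invisibly, so one option (a sketch, with airquality standing in for the poster's data) is to suppress the default line plot and hand the result to barplot():

  library(randomForest)
  aq <- na.omit(airquality)
  rf.fit <- randomForest(Ozone ~ ., data = aq)
  ## compute the partial dependence without drawing the default line plot
  pd <- partialPlot(rf.fit, aq, x.var = "Temp", plot = FALSE)
  ## draw it as bars instead
  barplot(pd$y, names.arg = round(pd$x), xlab = "Temp",
          ylab = "Partial dependence on Ozone")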
2007 Oct 23
1
Compute R2 and Q2 in PLS with pls.pcr package
Dear list, I am using the mvr function of the package pls.pcr to compute a PLS regression using an X matrix of gene expression variables and a Y matrix of medical variables. I would like to obtain the R2 (sum of squares captured by the model) and Q2 (proportion of total sum of squares captured in leave-one-out cross-validation) of the model. I am not sure if there are specific slots in the
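In the current pls package both quantities come out of R2(): estimate = "train" gives the calibration R2 and estimate = "CV" on a LOO-validated fit gives Q2. A sketch with simulated stand-ins for the gene-expression X and medical Y matrices:

  library(pls)
  set.seed(1)
  X <- matrix(rnorm(40 * 20), 40, 20)
  Y <- X[, 1:2] %*% matrix(rnorm(2 * 3), 2, 3) + rnorm(40 * 3)
  dat <- data.frame(Y = I(Y), X = I(X))
  fit <- plsr(Y ~ X, ncomp = 5, data = dat, validation = "LOO")
  R2(fit, estimate = "train")   # R2: variance captured by the fitted model
  R2(fit, estimate = "CV")      # Q2: leave-one-out cross-validated R2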
2004 Aug 27
1
predict.mvr error message
What version of R, what version of pls.pcr, and on what OS? Have you checked whether your versions of software are up to date? I get: > n <- 1350 > p <- 180 > y <- rnorm(n) > x <- matrix(sample(0:1, n*p, replace=TRUE), n, p) > fit <- mvr(x, y, method="SIMPLS", validat="none", ncomp=2) > xt <- matrix(sample(0:1, 312*p, replace=TRUE), 312,
2011 Oct 18
1
problem in exceuting PLS
Hi, I'm performing a PLS. This is my data, stored in a file: Year Y X2 X3 X4 X5 X6 1960 27.8 397.5 42.2 50.7 78.3 65.8 1960 29.9 413.3 38.1 52 79.2 66.9 1961 29.8 439.2 40.3 54 79.2 67.8 1961 30.8 459.7 39.5 55.3 79.2 69.6 1962 31.2 492.9 37.3 54.7 77.4 68.7 My R code: Data <- read.csv("C:/TestData.csv") variable=names(Data)[4:8] dataset=NULL dataset$X=NULL len=length(variable)
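Assuming the file really has the columns shown (Year, Y, X2..X6), the formula interface of the pls package makes that manual dataset construction unnecessary; a sketch:

  library(pls)
  Data <- read.csv("C:/TestData.csv")
  ## PLS of Y on the predictor columns, with leave-one-out validation
  fit <- plsr(Y ~ X2 + X3 + X4 + X5 + X6, ncomp = 2, data = Data,
              validation = "LOO")
  summary(fit)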
2007 Nov 26
1
mvr error in PLS package
All, I have been using a data set to build pls models for three different soil properties. Two of the three models run fine; however I receive the following error for the final model. > libs.IC.cal <- mvr(libs.IC.fmla, data = libsdata.cond.cal, ncomp=20,validation = "LOO", method = "oscorespls") Error in colMeans(x, n, prod(dn), na.rm) : 'x' must
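The data aren't shown, but that colMeans() error usually means a non-numeric (or all-NA) column reached the model matrix; a quick diagnostic on the poster's objects (libs.IC.fmla, libsdata.cond.cal) might look like:

  ## list the class of every variable named in the formula and count NAs;
  ## factor/character columns or all-NA columns are common causes of
  ## "'x' must be numeric"
  vars <- all.vars(libs.IC.fmla)
  sapply(libsdata.cond.cal[vars], class)
  colSums(is.na(libsdata.cond.cal[vars]))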
2011 Oct 21
1
use of segments in PLS
How do I use segments in PLS? fit1 <- mvr(formula=Y~X1+X2+X3+X4+x5+....+x27, data=Dataset, comp=5, segment=7) Here, when I use segments, I get this error: Error in mvrCv(X, Y, ncomp, method = method, scale = sdscale, ...) : argument 7 matches multiple formal arguments. Please help.
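The message comes from partial argument matching: the abbreviation segment matches both segments and segment.type in mvrCv(), so R cannot choose between them. Spelling the arguments out in full (and using ncomp rather than comp) avoids it; a sketch with the poster's names, formula shortened:

  library(pls)
  fit1 <- mvr(Y ~ X1 + X2 + X3 + X4, data = Dataset,
              ncomp = 5, validation = "CV", segments = 7)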
2011 Nov 30
1
Invalid number of components, ncomp
Error in mvr(Kd_nM ~ qsar, ncomp = 6, data = my, validation = "CV", method = "kernelpls") : Invalid number of components, ncomp. How can I fix this?
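mvr() raises this error when ncomp exceeds what the data can support (roughly min(n - 1, number of predictors), and less again under cross-validation). A hedged guard using the poster's names, assuming qsar is a matrix column of the data frame my:

  library(pls)
  max_nc <- min(nrow(my) - 1, ncol(my$qsar))
  fit <- mvr(Kd_nM ~ qsar, ncomp = min(6, max_nc), data = my,
             validation = "CV", method = "kernelpls")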
2010 Jul 07
2
R2 function from PLS to use a model on test data
Hello, I am having some trouble using a model I created with plsr (on the training data) to analyze each individual R^2 of the 10 components against the test data. For example: mice1 <- plsr(response ~ factors, ncomp=10, data=MiceTrain) R2(mice1) ##this provides the correct R2 for the Train data for 10 components ## Now my next objective is to calculate my model's R2 for each component on the
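R2() in the pls package takes a newdata argument, so the per-component test-set R2 is direct; MiceTest below is a placeholder name for the held-out data:

  library(pls)
  R2(mice1)                        # training R2 for each of the 10 components
  R2(mice1, newdata = MiceTest)    # test-set R2, also per component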
2005 Jul 05
1
PLS: problem transforming scores to variable space
Dear List! I am trying to calculate the distance between original data points and their position in the PLS model. In order to do this, I tried to predict the scores using the predict.mvr function and calculate the corresponding positions in variable space. The prediction of scores works perfectly: ------ data(trees) # build model t<-plsr(Volume~.,data=trees) # predict scores for training
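Back-projecting the scores into (centered) variable space is then scores %*% t(loadings) plus the column means used for centering; a sketch continuing the trees example:

  library(pls)
  data(trees)
  fit <- plsr(Volume ~ ., ncomp = 2, data = trees)
  S <- scores(fit)                           # training-set scores
  Xhat <- S %*% t(loadings(fit))             # back into centered X space
  Xhat <- sweep(Xhat, 2, fit$Xmeans, "+")    # undo the centering
  ## distance of each observation from its reconstruction
  d <- sqrt(rowSums((as.matrix(trees[, c("Girth", "Height")]) - Xhat)^2))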
2011 Apr 18
2
Predicting with a principal component regression model: "non-conformable arguments" error
Hello all, I have generated a principal components regression model using the pcr() function from the PLS package (R version 2.12.0). I am getting a "non-conformable arguments" error when I try to use the predict() function on new data, but only when I try to read in the new data from a separate file. More specifically, when my data looks like this #########training data
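A common cause (an assumption here, since the post is cut off) is that the model was fit with a matrix predictor, e.g. pcr(y ~ X, data = train) with X a matrix column, while read.csv() returns a plain data frame; newdata then has to be rebuilt with the same structure. A sketch with hypothetical names:

  library(pls)
  ## fit <- pcr(y ~ X, ncomp = 4, data = train)     # X is a matrix column
  new_raw <- read.csv("newdata.csv")                # hypothetical file of new samples
  newdata <- data.frame(X = I(as.matrix(new_raw)))  # same matrix-column structure
  pred <- predict(fit, newdata = newdata, ncomp = 4)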
2005 Oct 11
0
pls version 1.1-0
Version 1.1-0 of the pls package is now available on CRAN. The pls package implements partial least squares regression (PLSR) and principal component regression (PCR). Features of the package include - Several plsr algorithms: orthogonal scores, kernel pls and simpls - Flexible cross-validation - A formula interface, with traditional methods like predict, coef, plot and summary - Functions
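A minimal illustration of the formula interface and the traditional methods mentioned above, using the gasoline data included with recent versions of the package (not part of the announcement itself):

  library(pls)
  data(gasoline)
  fit <- plsr(octane ~ NIR, ncomp = 5, data = gasoline, validation = "CV")
  summary(fit)           # cross-validated RMSEP per component
  coef(fit, ncomp = 3)   # coefficients of a 3-component model
  plot(fit, ncomp = 3)   # predicted vs. measured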
2009 Oct 01
0
Confidence intervals PLS prediction
I have switched from The Unscrambler to R for pls regression analysis and have been able to calculate scores, coefficients, and RMSEP from a large number of PLS1 and PLS2 models. The ultimate goal is to use these models for predicting unknown samples, which again is straightforward with the built-in predict() function. However, I'm struggling with prediction uncertainty (i.e. confidence intervals) on
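The pls package has no built-in prediction intervals, so one common workaround (a sketch only, not an Unscrambler equivalent) is to bootstrap the calibration set; train, newX, and the formula are placeholders:

  library(pls)
  boot_pred <- replicate(500, {
    idx <- sample(nrow(train), replace = TRUE)
    fit <- plsr(y ~ X, ncomp = 5, data = train[idx, ])
    drop(predict(fit, newdata = newX, ncomp = 5))
  })
  ## percentile 95% interval for each unknown sample
  apply(boot_pred, 1, quantile, probs = c(0.025, 0.975))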
2005 Sep 04
2
Help: PLSR
Hello, I have a data set with 15 variables (the first one is the response) and 1200 observations. Now I use the pls package to do the plsr as below. trainSet = as.data.frame(scale(trainSet, center = T, scale = T)) trainSet.plsr = mvr(formula, ncomp = 14, data = trainSet, method = "kernelpls", model = TRUE, x = TRUE, y = TRUE) from the model, I wish to know the
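The question is cut off above; whatever the specific quantity, the usual accessors on a fitted mvr object are (a general sketch using the poster's trainSet.plsr):

  library(pls)
  explvar(trainSet.plsr)            # % of X variance explained per component
  coef(trainSet.plsr, ncomp = 14)   # regression coefficients
  scores(trainSet.plsr)             # X scores
  loadings(trainSet.plsr)           # X loadings
  fitted(trainSet.plsr)             # fitted values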
2007 Oct 26
0
pls version 2.1-0
Version 2.1-0 of the pls package is now available on CRAN. The pls package implements partial least squares regression (PLSR) and principal component regression (PCR). Features of the package include - Several plsr algorithms: orthogonal scores, kernel pls, wide kernel pls, and simpls - Flexible cross-validation - A formula interface, with traditional methods like predict, coef, plot and
2007 Jan 02
0
pls version 2.0-0
Version 2.0-0 of the pls package is now available on CRAN. The pls package implements partial least squares regression (PLSR) and principal component regression (PCR). Features of the package include - Several plsr algorithms: orthogonal scores, kernel pls and simpls - Flexible cross-validation - A formula interface, with traditional methods like predict, coef, plot and summary - Functions