Displaying 20 results from an estimated 1000 matches similar to: "pls regression - optimal number of LVs"
2003 Jun 18
2
install pls.pcr package
How do you install a package from CRAN? I want to install pls.pcr, so I have downloaded pls.pcr_0.1.1.tar.gz, but when I try to install it using the "install package(s) from local zip file(s)" option it says:
> install.packages(choose.files('',filters=Filters[c('zip','All'),]), .libPaths()[1], CRAN = NULL)
Error in file(file, "r") : unable to open connection
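A .tar.gz is a source package, so the "local zip file(s)" menu entry will not read it. A minimal sketch of the two usual routes in current versions of R (assuming the package is still available on the mirror; the local path below is only an example):

install.packages("pls.pcr")   # fetch and install directly from a CRAN mirror
# or install the downloaded source tarball without contacting CRAN:
install.packages("C:/downloads/pls.pcr_0.1.1.tar.gz", repos = NULL, type = "source")
library(pls.pcr)              # then load it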
2003 Jul 23
0
pls.pcr compared to Unscrambler
Dear R-helpers,
Has anybody ever tried to compare PLS regression outputs from the pls.pcr R package developed by Wehrens with outputs from the Unscrambler software developed by the CAMO company?
I get very different outputs and wonder whether this comes from differences between the methods/algorithms SIMPLS (pls.pcr) and PLS1 (Unscrambler).
Arnaud
*************************
Arnaud DOWKIW
Department of
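A sketch of how to isolate the algorithm question with the current pls package (the successor of pls.pcr), which switches algorithms via its method argument; yarn is a data set shipped with that package:

library(pls)
data(yarn)
# Same single-response (PLS1-type) model fitted with two algorithms;
# for one response variable the fitted values should agree to numerical precision
fit.simpls  <- plsr(density ~ NIR, ncomp = 6, data = yarn, method = "simpls")
fit.oscores <- plsr(density ~ NIR, ncomp = 6, data = yarn, method = "oscorespls")
max(abs(fitted(fit.simpls) - fitted(fit.oscores)))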
2003 Jun 26
2
Change default parameters of panel.smooth
Hello,
can anyone tell me how to access the full source of the panel.smooth function so that I can change the thickness of the smoothing line or its colour?
All I could access is:
> panel.smooth
function (x, y, col = par("col"), bg = NA, pch = par("pch"),
cex = 1, col.smooth = "red", span = 2/3, iter = 3, ...)
{
points(x, y, pch = pch, col = col, bg =
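The arguments shown in that signature can simply be overridden where panel.smooth is called, for example as the panel function of pairs(); anything extra in ... is passed on to lines(), so the line width can be changed too. A minimal sketch on the built-in iris data:

# Draw the smoother blue and thicker; lwd travels through ... to lines()
pairs(iris[1:4],
      panel = function(x, y, ...) panel.smooth(x, y, col.smooth = "blue", lwd = 2, ...))
# The full source is what is printed by typing the name without parentheses,
# as above; getAnywhere("panel.smooth") works as well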
2003 Jun 25
2
Pairs with different colours
Does anybody know how to make pairs() graphics with dots of different colours depending on the value of a categorical variable?
Thanks,
Arnaud
*************************
Arnaud DOWKIW
Department of Primary Industries
J. Bjelke-Petersen Research Station
KINGAROY, QLD 4610
Australia
T : + 61 7 41 600 700
T : + 61 7 41 600 728 (direct)
F : + 61 7 41 600 760
**************************
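A minimal sketch using the built-in iris data, indexing a colour vector by the categorical variable:

# One colour per level of Species, recycled point-wise in every panel
cols <- c("red", "green3", "blue")[as.integer(iris$Species)]
pairs(iris[1:4], col = cols, pch = 19)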
2003 May 06
1
S's plclust and R's hclust
Hello everyone,
Does anyone know how to implement the argument "unit" in R's plclust
function? I used to use S-PLUS, where this argument exists, but it has not
been implemented in R's plclust. The reason why I switched from S-PLUS to
R is that Ward's method is not implemented for S's hclust, whereas it is
implemented for R's hclust. What I would need is S's plclust
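R's plot method for hclust objects has no "unit" argument, but hang = -1 gives a similar picture, with every leaf drawn down to the same baseline; a sketch (note that the name of the Ward method has varied across R versions, "ward" in older releases and "ward.D"/"ward.D2" in current ones):

hc <- hclust(dist(USArrests), method = "ward.D2")  # use method = "ward" in older R
plot(hc, hang = -1)                                # leaves all hang to the same level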
2003 May 09
1
Principal coordinates analysis
Dear all,
Does anyone know how to run Principal Coordinates Analysis (PCoA) from a
squared Euclidean dissimilarity matrix in R?
Thanks,
*************************
Arnaud DOWKIW
Department of Primary Industries
J. Bjelke-Petersen Research Station
KINGAROY, QLD 4610
Australia
T : + 61 7 41 600 700
T : + 61 7 41 600 728 (direct)
F : + 61 7 41 600 760
**************************
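Principal coordinates analysis is classical (metric) multidimensional scaling, which base R provides as cmdscale(); a minimal sketch, with d standing in for the dissimilarity matrix:

d <- dist(USArrests)                   # example dissimilarities (a dist object or square matrix)
pco <- cmdscale(d, k = 2, eig = TRUE)  # first two principal coordinates, keep eigenvalues
plot(pco$points, xlab = "PCo 1", ylab = "PCo 2")
pco$eig[1:5]                           # eigenvalues, for percent variance per axis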
2006 Jul 06
1
PLS method
Dear all,
I am a newcomer to R and statistics. I am a little confused about the
package pls.
I have to use 5 components to form a model. There are strong relationships
between some of the components, which lead to changes in the sign of
the coefficients; of course this is unwanted when using ordinary
regression. So I chose PLS, which is good at solving this kind
of
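A minimal sketch of such a fit with the pls package; the data frame and variable names below (mydata, y, x1..x5) are placeholders for the poster's own data:

library(pls)
fit <- plsr(y ~ x1 + x2 + x3 + x4 + x5, data = mydata,
            ncomp = 5, validation = "CV", scale = TRUE)
summary(fit)          # explained variance and cross-validated error per component
coef(fit, ncomp = 2)  # coefficients when only 2 latent variables are used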
2005 Jun 06
1
Help package pls.pcr
Hello!
I need help using the package pls.pcr in R.
I installed R on IRIX 6.5, using R version 0.64.1 from
sgifreeware (I didn't manage to install the newest version using make). I
need to use the package pls.pcr, and when I give the command:
# R
R : Copyright 1999, The R Development Core Team
Version 0.64.1 (May 8, 1999)
R is free software and comes with ABSOLUTELY NO
2007 Oct 23
1
Compute R2 and Q2 in PLS with pls.pcr package
Dear list
I am using the mvr function of the package pls.pcr to compute a PLS
regression using an X matrix of gene expression variables and a Y matrix
of medical variables.
I would like to obtain the R2 (sum of squares captured by the model) and
Q2 (proportion of total sum of squares captured in leave-one-out cross
validation) of the model.
I am not sure if there are specific slots in the
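For what it is worth, the current pls package (the successor of pls.pcr) exposes both quantities directly: R2() with estimate = "train" is the fitted R2, and estimate = "CV" on a model fitted with validation = "LOO" corresponds to Q2. A sketch, with X and Y standing in for the poster's matrices:

library(pls)
fit <- plsr(Y ~ X, ncomp = 10, validation = "LOO")
R2(fit, estimate = "train")  # R2: sum of squares captured by the fitted model
R2(fit, estimate = "CV")     # Q2: leave-one-out cross-validated R2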
2005 Mar 05
1
partial r2 using PLS
I'm trying to get the coefficient of partial determination for each of
three independent variables. I've tried mvr in package pls.pcr. I'm a
little confused by the output. I'm curious how I can order the LVs
according to their names rather than their relative contribution to
the regression.
For instance, using the crabs data from MASS I made a regression of FL~RW+noise
2011 Oct 18
1
problem in executing PLS
Hi
I'm performing a PLS regression.
This is the data in my file:
Year Y X2 X3 X4 X5 X6
1960 27.8 397.5 42.2 50.7 78.3 65.8
1960 29.9 413.3 38.1 52 79.2 66.9
1961 29.8 439.2 40.3 54 79.2 67.8
1961 30.8 459.7 39.5 55.3 79.2 69.6
1962 31.2 492.9 37.3 54.7 77.4 68.7
My R code:
Data <- read.csv("C:/TestData.csv")
variable=names(Data)[4:8]
dataset=NULL
dataset$X=NULL
len=length(variable)
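The usual way to set this up for the pls package is to keep the predictors together as one matrix column of a data frame, wrapped in I(); a sketch using the column names shown above:

library(pls)
Data <- read.csv("C:/TestData.csv")
# I() keeps the predictor matrix as a single column of the data frame
dataset <- data.frame(Y = Data$Y,
                      X = I(as.matrix(Data[, c("X2", "X3", "X4", "X5", "X6")])))
fit <- plsr(Y ~ X, ncomp = 3, data = dataset, validation = "LOO")
summary(fit)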
2005 May 12
1
pls -- crossval vs plsr(..., CV=TRUE)
Hi,
Newbie question about the pls package.
Setup:
Mac OS 10.3.9
R: Aqua GUI 1.01, v 2.0.1
I want to get R^2 and Q^2 (LOO and Leave-10-Out) values for each
component for my model.
I was running into a few problems so I played with the example a little
and the results do not match up with the comments
in the help pages.
$ library(pls)
$ data(NIR)
$ testing.plsNOCV <- plsr(y ~ X, 6, data =
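In current versions of the pls package the two routes are plsr(..., validation = ...) at fit time or crossval() applied afterwards; a sketch of LOO and leave-10-out on the gasoline data shipped with the package:

library(pls)
data(gasoline)
fit <- plsr(octane ~ NIR, ncomp = 6, data = gasoline)       # fitted without CV
fit.loo  <- crossval(fit, segments = nrow(gasoline))        # leave-one-out
fit.l10o <- crossval(fit, segments = nrow(gasoline) / 10)   # segments of roughly 10 samples each
R2(fit.loo, estimate = "CV")   # Q2 per number of components
R2(fit.l10o, estimate = "CV")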
2011 Oct 21
1
use of segments in PLS
How do I use segments in PLS?
fit1 <- mvr(formula=Y~X1+X2+X3+X4+x5+....+x27, data=Dataset, comp=5, segment=7)
Here, when I use segments, the error is:
Error in mvrCv(X, Y, ncomp, method = method, scale = sdscale, ...) :
argument 7 matches multiple formal arguments
Please help
--
View this message in context:
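The error is a partial-matching clash: "segment" is a prefix of both segments and segment.type in mvrCv(), so R cannot tell which argument is meant. Writing the names out in full (and requesting cross-validation, which is what segments controls) avoids it; a sketch with the formula shortened:

library(pls)
fit1 <- mvr(Y ~ X1 + X2 + X3 + X4, data = Dataset,
            ncomp = 5, validation = "CV", segments = 7)  # 'segments' spelt out in full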
2007 Nov 26
1
mvr error in PLS package
All,
I have been using a data set to build PLS models for three different soil properties. Two of the three models run fine; however, I receive the following error for the final model.
> libs.IC.cal <- mvr(libs.IC.fmla, data = libsdata.cond.cal, ncomp=20,validation = "LOO", method = "oscorespls")
Error in colMeans(x, n, prod(dn), na.rm) :
'x' must
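If the truncated message is the usual "'x' must be numeric", the model frame for that third property most likely contains a non-numeric column (a factor or character column, say) or nothing but missing values; a small diagnostic sketch, with libsdata.cond.cal taken from the call above:

sapply(libsdata.cond.cal, is.numeric)  # which columns are not numeric?
colSums(is.na(libsdata.cond.cal))      # how many NAs in each column?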
2013 Jul 13
1
Alternative to eval(cl, parent.frame()) ?
Dear developeRs,
I maintain a package 'pls', which has a main fit function mvr(), and
functions plsr() and pcr() which are meant to take the same arguments as
mvr() and do exactly the same, but have different default values for the
'method' argument. The three functions are all exported from the name
space.
In the 'pre namespace' era, I took inspiration from lm() and
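For readers landing on this thread, the lm()-style pattern being discussed looks roughly like the following sketch (not the package's actual code; the hard-coded default method is just for illustration):

plsr <- function(..., method = "kernelpls") {
    cl <- match.call()               # the call exactly as the user wrote it
    cl$method <- method              # fill in the wrapper's default
    cl[[1L]] <- quote(mvr)           # redirect the call to the workhorse function
    res <- eval(cl, parent.frame())  # evaluate where the user's arguments live
    res$call[[1L]] <- quote(plsr)    # make the stored call print as plsr(...)
    res
}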
2011 Oct 18
1
getting p-value and standard error in PLS
Hi
How do I get p-values and standard errors in PLS?
I have used the following call to fit the PLS model:
fit1 <- mvr(formula=Y~X1+X2+X3+X4, data=Dataset, comp=4)
Please help me
--
View this message in context: http://r.789695.n4.nabble.com/getting-p-value-and-standard-error-in-PLS-tp3914760p3914760.html
Sent from the R help mailing list archive at Nabble.com.
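PLS regression does not produce p-values or standard errors by itself; in the pls package they are usually obtained by jackknifing during cross-validation and then calling jack.test(). A sketch (the names Y, X1..X4 and Dataset are taken from the call above):

library(pls)
fit1 <- plsr(Y ~ X1 + X2 + X3 + X4, data = Dataset,
             ncomp = 4, validation = "LOO", jackknife = TRUE)
jack.test(fit1, ncomp = 4)  # coefficient estimates, standard errors, t- and p-values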
2004 Feb 01
5
Stepwise regression and PLS
Dear all,
I am a newcomer to R. I intend to use R to do stepwise regression and
PLS with a data set (a 55x20 matrix, with one dependent and 19
independent variables). Based on the same data set, I have done the same
work using SPSS and SAS. However, there are large differences between the
results obtained with R and with SPSS or SAS.
In the case of stepwise regression, SPSS gave a model with 4 independent
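Part of the difference is the selection criterion: R's step() chooses by AIC, while the stepwise procedures in SPSS and SAS are typically driven by F-to-enter/F-to-remove p-values, so the retained variables need not agree. A minimal sketch of both analyses, with dat and y standing in for the 55 x 20 data set and its response:

full <- lm(y ~ ., data = dat)         # all 19 predictors
sw <- step(full, direction = "both")  # stepwise search by AIC
summary(sw)
library(pls)
fitp <- plsr(y ~ ., data = dat, ncomp = 10, validation = "LOO")
summary(fitp)  # choose the number of components from the CV error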
2011 Oct 24
2
How to get the intercept standard error in PLS
Hi
How do we get the intercept's standard error? I'm using the package pls.
I got the coefficient but am not able to get the standard error.
--
View this message in context: http://r.789695.n4.nabble.com/How-to-get-intecerpt-standard-error-in-PLS-tp3932104p3932104.html
Sent from the R help mailing list archive at Nabble.com.
2005 Oct 11
0
pls version 1.1-0
Version 1.1-0 of the pls package is now available on CRAN.
The pls package implements partial least squares regression (PLSR) and
principal component regression (PCR). Features of the package include
- Several plsr algorithms: orthogonal scores, kernel pls and simpls
- Flexible cross-validation
- A formula interface, with traditional methods like predict, coef,
plot and summary
- Functions