similar to: prcomp - principal components in R

Displaying 20 results from an estimated 2000 matches similar to: "prcomp - principal components in R"

2004 Nov 03
2
Princomp(), prcomp() and loadings()
In comparing the results of princomp and prcomp I find: 1. The reported standard deviations are similar but differ by about 1%, which seems well above round-off error. 2. princomp returns what I understand are variances and cumulative variances accounted for by each principal component, which are all equal; "SS loadings" is always 1. 3. Same happens
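The ~1% gap is what the different divisors would produce: princomp() divides by n, prcomp() by n - 1, so the standard deviations differ by a factor of sqrt((n - 1)/n). A minimal check, using USArrests only as a convenient built-in example:

## princomp() uses divisor n, prcomp() uses n - 1
n <- nrow(USArrests)
sd_princomp <- princomp(USArrests)$sdev
sd_prcomp   <- prcomp(USArrests)$sdev
all.equal(unname(sd_princomp), sd_prcomp * sqrt((n - 1) / n))   # TRUE

For USArrests (n = 50) the factor sqrt(49/50) is about 0.99, i.e. roughly the 1% difference reported.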
2005 Mar 26
5
PCA - princomp can only be used with more units than variables
Hi all: I am trying to do PCA on the following matrix.

        N1  N2  A1  A2  B1  B2
gene_a  90 110 190 210 290 310
gene_b 190 210 390 410 590 610
gene_c  90 110 110  90 120  80
gene_d 200 100 400  90 600 200

> dataf <- read.table("matrix")
>
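princomp() refuses matrices with more variables than units, while prcomp() works through the SVD and handles them. A small sketch with made-up data (the sample/gene names are placeholders, not the poster's values):

## princomp() needs more rows than columns; prcomp() does not
m <- matrix(rnorm(4 * 6), nrow = 4,
            dimnames = list(paste0("sample", 1:4), paste0("gene", 1:6)))
## princomp(m)   # error: can only be used with more units than variables
prcomp(m)        # works; at most min(n - 1, p) = 3 non-trivial components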
2004 Feb 17
1
Comparison of % variance explained by each PC before AND after rotation
Hello again - Thanks to Prof. Ripley for responding to my previous question. I would like to clarify my question using sample code taken from ?prcomp. Again, I would like to compare the % variance explained by each PC before and after rotation. < code follows >

data(USArrests)
pca = prcomp(USArrests, scale = TRUE)
# proportion variance explained by each
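One common way to make that comparison, sketched here with varimax() from the stats package (an illustration, not necessarily what the original poster intended):

data(USArrests)
pca <- prcomp(USArrests, scale = TRUE)
## proportion of variance explained by each unrotated PC
pca$sdev^2 / sum(pca$sdev^2)
## rotate the loadings of the first two components (eigenvectors scaled by
## their sdevs), then recompute proportions from the rotated sums of squares
L     <- pca$rotation[, 1:2] %*% diag(pca$sdev[1:2])
L_rot <- varimax(L)$loadings
colSums(L_rot^2) / sum(pca$sdev^2)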
2009 Mar 31
3
Factor Analysis Output from R and SAS
Dear Users, I ran factor analysis using R and SAS, but I got different outputs from the two. Why do they provide different outputs? In particular, the factor loadings are different. I also ran a real dataset (n = 264) and got extremely different results from R and SAS. Why did this happen? Which software is correct? Thanks in advance, - TY #R code with example data # A little
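Differing extraction and rotation defaults are a frequent source of such discrepancies. A small sketch using the ability.cov example from ?factanal (R's factanal() is maximum likelihood with varimax rotation by default):

fa_unrot <- factanal(factors = 2, covmat = ability.cov, rotation = "none")
fa_rot   <- factanal(factors = 2, covmat = ability.cov)   # varimax default
## the unrotated and rotated loadings differ, which alone can explain large
## differences between two packages' printed loadings
fa_unrot$loadings
fa_rot$loadings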
2000 Apr 26
1
Factor Rotation
How does one rotate the loadings from a principal component analysis? Help for function prcomp() from package mva mentions rotation: Arguments: retx, a logical value indicating whether the rotated variables should be returned. Values: rotation, the matrix of variable loadings (i.e., a matrix whose columns contain the eigenvectors). The function princomp returns this in the element
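With current R, the rotation element of a prcomp fit can be passed straight to stats::varimax(); a brief sketch:

pca <- prcomp(USArrests, scale = TRUE)
vm  <- varimax(pca$rotation[, 1:2])   # rotate the first two components
vm$loadings   # rotated loadings
vm$rotmat     # the orthogonal rotation matrix that was applied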
2012 Oct 19
1
factor score from PCA
Hi everyone, I am trying to get the factor score for each individual case from a principal component analysis. As I understand it, neither princomp() nor prcomp() can produce this factor score; principal() in the psych package has the option scores=T, but after running the code I could not figure out how to display the factor score results. Here is my code; could anyone give me some advice please?
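For what it is worth, prcomp() does return scores, in its x component (and predict() on the fitted object gives the same values for the training data). A sketch; the psych lines are commented out and assume that package is installed:

pca <- prcomp(USArrests, scale = TRUE)
head(pca$x)                  # component scores for each case
# library(psych)
# pc <- principal(USArrests, nfactors = 2, scores = TRUE)
# head(pc$scores)            # factor/component scores from psych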
2010 Apr 02
2
Biplot for PCA using labdsv package
Hi everyone, I am doing PCA with the labdsv package. I was trying to create a biplot graph in order to see the arrows for my variables. However, when I run the script for this graph, the console just keeps saying: Error in nrow(y) : element 1 is empty; the part of the args list of 'dim' being evaluated was: (x). Could someone please tell me what this means? What am I doing
2006 Dec 05
1
problem with lists...
Hi guys, I am new to R, so sorry if my problem seems trivial. Sometimes I encounter lists whose components I cannot index with [ ]. For instance, the prcomp() function returns a 'prcomp' object whose components are some 'lists'. The second component is a list that contains the following:

> mylist <- churn[2]
> class(mylist)
[1] "list"
>
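The usual explanation: single brackets on a list return a sub-list, while double brackets (or $) return the element itself. A quick illustration with a prcomp object:

pca <- prcomp(USArrests)
class(pca["rotation"])     # "list" -- still wrapped in a list of length one
class(pca[["rotation"]])   # a matrix -- the element itself
class(pca$rotation)        # same as [[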
2011 Jan 26
1
Factor rotation (e.g., oblimin, varimax) and PCA
A bit of a newbie to R and factor rotation. I am trying to understand factor rotations and their implementation in R, particularly the GPArotation library. I have tried to reproduce some of the examples that I have found, e.g., I have taken the values from Jackson's example in "Oblimin Rotation", Encyclopedia of Biostatistics
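A sketch of the basic mechanics, assuming the GPArotation package is installed (which is why those calls are commented out):

pca <- prcomp(USArrests, scale = TRUE)
L   <- pca$rotation[, 1:2] %*% diag(pca$sdev[1:2])   # scaled loadings
# library(GPArotation)
# oblimin(L)$loadings    # oblique rotation
# Varimax(L)$loadings    # orthogonal rotation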
2008 Jan 18
2
plotting other axes for PCA
Hi R-community, I am doing a PCA and I need plots for different combinations of axes (e.g., PC1 vs PC3, and PC2 vs PC3) with arrows indicating the loadings of each variable. What I need is exactly what I get using biplot(pca.object), but for other axes. I have plotted PC2 and PC3 using the scores of the cases, but I don't get the arrows proportional to the loadings of each variable on
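biplot() for prcomp and princomp objects takes a choices argument selecting which components to plot; a minimal sketch:

pca <- prcomp(USArrests, scale = TRUE)
biplot(pca, choices = c(2, 3))   # PC2 vs PC3, with loading arrows
biplot(pca, choices = c(1, 3))   # PC1 vs PC3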
2010 Jun 16
2
Accessing the elements of summary(prcomp(USArrests))
Hello again, I was hoping one of you could help me with this problem. Consider the sample data from R:

> summary(prcomp(USArrests))
Importance of components:
                          PC1     PC2    PC3     PC4
Standard deviation     83.732 14.2124 6.4894 2.48279
Proportion of Variance  0.966  0.0278 0.0058 0.00085
Cumulative Proportion   0.966  0.9933 0.9991 1.00000

How do I access the
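The values live in the importance matrix of the summary object:

s <- summary(prcomp(USArrests))
s$importance                                   # the full table as a matrix
s$importance["Proportion of Variance", ]       # one row as a named vector
s$importance["Cumulative Proportion", "PC2"]   # a single entry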
2000 Oct 03
3
prcomp compared to SPAD
Hi! I've used the example given in the documentation for the prcomp function both in R and SPAD to compare the results obtained. Surprisingly, I do not obtain the same results for the coordinates of the principal components with these two programs. Using the USArrests data I obtain with R:

> summary(prcomp(USArrests))
Importance of components:
                          PC1     PC2
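One thing worth checking in such comparisons (a guess, not a diagnosis of the SPAD output) is the centring/scaling default, since prcomp() centres but does not scale unless asked:

summary(prcomp(USArrests))                 # covariance-based PCA (default)
summary(prcomp(USArrests, scale = TRUE))   # correlation-based PCA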
2008 Jun 11
3
Finding Coordinate of Max/Min Value in a Data Frame
Hi, Suppose I have the following data frame.

__BEGIN__
> library(MASS)
> data(crabs)
> crab.pca <- prcomp(crabs[,4:8], retx=TRUE)
> crab.pca$rotation
          PC1        PC2        PC3        PC4        PC5
FL  0.2889810  0.3232500 -0.5071698  0.7342907  0.1248816
RW  0.1972824  0.8647159  0.4141356 -0.1483092 -0.1408623
CL  0.5993986 -0.1982263 -0.1753299 -0.1435941 -0.7416656
CW
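which() with arr.ind = TRUE answers this directly; a self-contained sketch using the same crabs example:

library(MASS)
crab.pca <- prcomp(crabs[, 4:8], retx = TRUE)
r <- crab.pca$rotation
which(r == max(r), arr.ind = TRUE)             # row/column of the largest loading
which(abs(r) == max(abs(r)), arr.ind = TRUE)   # largest in absolute value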
2016 Mar 24
3
summary( prcomp(*, tol = .) ) -- and 'rank.'
Following on from the R-help thread of March 22 on "Memory usage in prcomp", I've started looking into adding an optional 'rank.' argument to prcomp, allowing one to get only a few PCs more efficiently instead of the full p PCs, say when p = 1000 and you know you only want 5 PCs. (https://stat.ethz.ch/pipermail/r-help/2016-March/437228.html) As was mentioned, we already
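The argument discussed in this thread did make it into R (3.3.0 and later); a sketch with a made-up wide matrix:

x <- matrix(rnorm(100 * 1000), nrow = 100)   # 100 observations, 1000 variables
pca5 <- prcomp(x, rank. = 5)                 # keep only the first 5 PCs
dim(pca5$rotation)                           # 1000 x 5 rather than 1000 x 1000
summary(pca5)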
2006 May 25
1
PC rotation question
On p. 48 of the "Statistics Complements" to the 3rd MASS edition, http://www.stats.ox.ac.uk/pub/MASS3/VR3stat.pdf, I read that the orthogonal rotations of Z Lambda^-1 remain uncorrelated, where Z is the matrix of PC scores and Lambda is the diagonal matrix of singular values. However, the example below that text is

> A <- loadings(ir.pca) %*% diag(ir.pca$sdev)

If ir.pca$sdev are the singular values,
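A quick numeric check (a sketch on plain iris data, not the exact MASS example): princomp()'s sdev are the singular values of the centred data matrix divided by sqrt(n), i.e. standard deviations computed with divisor n, not the singular values themselves:

ir     <- as.matrix(iris[, 1:4])
ir.pca <- princomp(ir)
d      <- svd(scale(ir, center = TRUE, scale = FALSE))$d   # singular values
all.equal(unname(ir.pca$sdev), d / sqrt(nrow(ir)))          # TRUE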
2009 Jan 13
1
PCA loadings differ vastly!
Hi, I have two questions. First (SPSS vs. R): I just compared the output of different PCA routines in R (pca, prcomp, princomp) with results from SPSS. The loadings of the variables differ vastly! In SPSS the variables load consistently higher than in R. I made sure that both programs use the correlation matrix as the basis. I found the same problem with rotated values (varimax rotation and retx=T
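A common explanation (stated here as an assumption about the SPSS output, not a certainty): SPSS reports component loadings scaled by the component standard deviations, i.e. variable-component correlations for a correlation-matrix PCA, whereas prcomp()'s rotation holds unit-length eigenvectors:

pca <- prcomp(USArrests, scale = TRUE)
pca$rotation                        # unit-length eigenvectors (R's "loadings")
pca$rotation %*% diag(pca$sdev)     # scaled loadings, closer to SPSS output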
2016 Mar 24
3
summary( prcomp(*, tol = .) ) -- and 'rank.'
I agree with Kasper, this is a 'big' issue. Does your method of taking only n PCs reduce the load on memory? The new addition to the summary looks like a good idea, but Proportion of Variance as you describe it may be confusing to new users. Am I correct in saying that Proportion of Variance describes the amount of variance relative to the number of components the user chooses to show? So
2005 Nov 18
1
pr[in]comp: predict single observation when data has colnames (PR#8324)
To my knowledge, this has not been reported previously, and it doesn't seem to have been changed in R-devel or R-patched. If M is a matrix with column names, and mod <- prcomp(M) # or princomp, then predicting a single observation (row) with predict() gives the error: Error in scale.default(newdata, object$center, object$scale) : length of 'center' must equal the number of
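A workaround sketch (M below is just a made-up matrix with column names): keep the single observation as a one-row matrix so the column names survive:

M   <- matrix(rnorm(50), 10, 5, dimnames = list(NULL, paste0("v", 1:5)))
mod <- prcomp(M)
predict(mod, newdata = M[1, , drop = FALSE])   # works: still a 1 x 5 matrix
# predict(mod, newdata = M[1, ])               # dropping to a vector triggers the error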
2009 Mar 10
1
Using napredict in prcomp
Hello all, I wish to compute site scores using PCA (prcomp) on a matrix with missing values, for example:

  Drain Slope OrgL
a     4     1   NA
b   2.5    39    6
c     6     8   45
d     3     9   12
e     3    16    4
...

where a, b, ... are sites. The command

> pca <- prcomp(~ Drain + Slope + OrgL, data = t, center = TRUE, scale = TRUE, na.action = na.exclude)

works great, and from
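A sketch with made-up numbers matching the excerpt above (and assuming a reasonably recent R): the formula interface stores the na.action and pads the returned scores via napredict(), so excluded sites come back as NA rows:

dat <- data.frame(Drain = c(4, 2.5, 6, 3, 3),
                  Slope = c(1, 39, 8, 9, 16),
                  OrgL  = c(NA, 6, 45, 12, 4),
                  row.names = letters[1:5])
pca <- prcomp(~ Drain + Slope + OrgL, data = dat,
              center = TRUE, scale = TRUE, na.action = na.exclude)
pca$x   # site scores, with an NA row for site 'a'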
2007 Jun 27
1
Condensed PCA Results
Hello all, I'm currently using R for PCA, and I was wondering if anyone knew the specific R code that could limit the output of the PCA so that you only get the principal component features as your output, and none of the extraneous words or numbers that you don't want. If that was unclear, let me use linear regression as an example: "lm(y~x)" is the normal
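Since the prcomp result is a plain list, you can print just the pieces you want, much as you would pull coefficients out of an lm fit; a brief sketch:

pca <- prcomp(USArrests, scale = TRUE)
pca$rotation                              # loadings only
pca$x                                     # scores only
summary(pca)$importance                   # variance table only
round(summary(pca)$importance[2, ], 3)    # just the proportions of variance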