Displaying 20 results from an estimated 35 matches for "communality".
2003 Jan 03
4
factor analysis (pca): how to get the 'communalities'?
...is going on behind the scenes?
- Or what am I doing wrong in my use of R?
- If I am doing the PCA correctly, can I treat the R results as equally acceptable
without further discussion?
Maybe a different 'hidden' algorithm is the reason for different results?
Q2. How to get the so-called 'Communality Estimates' with R?
Here are the values reported by SPSS for the above test data.frame m:
Communality Estimates as percentages:
1 88,619
2 76,855
3 89,167
4 85,324
5 76,043
6 84,012
7 80,223
8 92,668
9 63,297
10 88,786
Any help, suggestions or hints are very welcome.
Best regard...
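For Q2, a minimal sketch of computing communality estimates via PCA in base R, assuming k components are retained; the simulated data frame m and the choice k <- 3 below are illustrative stand-ins, not the poster's actual data:

```r
## Sketch: communality = sum of squared component loadings per variable.
## The data and k are assumptions for illustration only.
set.seed(1)
m <- as.data.frame(matrix(rnorm(200), nrow = 20))    # stand-in for the test data.frame m
pca <- prcomp(m, scale. = TRUE)
k <- 3                                               # assumed number of retained components
load <- pca$rotation[, 1:k] %*% diag(pca$sdev[1:k])  # loadings = eigenvectors * sqrt(eigenvalues)
communality <- rowSums(load^2)                       # sum of squared loadings per variable
round(100 * communality, 3)                          # as percentages, like the SPSS table
```

With all components retained, every communality equals 1; the SPSS-style figures arise from keeping only the first k components.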
2003 Jan 04
0
factor analysis (pca): how to get the 'communalities'?
On 4 Jan 2003 at 12:51, Wolfgang Lindner wrote:
> Please excuse me if the following questions are *too* off-topic, but I found it
> interesting. In inspecting your code I came across an R feature I could not
> find in the online manuals:
>
> Q1. Looking at the left-handside in your function def:
>
> "cov.cor" <- function ( covmat ) {
>
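A quick sketch of the feature Q1 asks about: in R, the name on the left-hand side of `<-` may be given as a quoted string, so the two definitions below are equivalent (the cov2cor() body here is just an illustrative guess at what cov.cor might do):

```r
## In R, assignment to a quoted name defines an ordinary binding;
## the dot in cov.cor is simply part of the name, not an operator.
"cov.cor" <- function(covmat) cov2cor(covmat)
cov.cor   <- function(covmat) cov2cor(covmat)   # same function, unquoted name
cov.cor(diag(2))                                # callable either way
```

Quoting is only required for names that are not syntactically valid (e.g. containing spaces), but it is always allowed.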
2009 Mar 31
3
Factor Analysis Output from R and SAS
...ta=fact rotate=varimax method=p nfactors=3;
var v1-v6;
run;
/* Output from SAS*/
The FACTOR Procedure
Initial Factor Method: Principal Components
Prior Communality Estimates: ONE
Eigenvalues of the Correlation Matrix: Total = 6  Average = 1
     Eigenvalue    Difference    Proportion    Cumulative
  1  3.69603077    2.62291629    0.6160        0.6160...
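The SAS eigenvalue table can be reproduced in R from the correlation matrix; a sketch, where the simulated v1-v6 data below stands in for the poster's 'fact' data set:

```r
## Sketch: eigenvalues of the correlation matrix laid out like PROC FACTOR's
## Eigenvalue / Difference / Proportion / Cumulative columns. Data is simulated.
set.seed(5)
x <- matrix(rnorm(600), nrow = 100)
colnames(x) <- paste0("v", 1:6)
ev <- eigen(cor(x))$values
data.frame(Eigenvalue = ev,
           Difference = c(-diff(ev), NA),     # gap to the next eigenvalue
           Proportion = ev / sum(ev),         # Total = 6, Average = 1, as in SAS
           Cumulative = cumsum(ev) / sum(ev))
```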
2009 Oct 07
0
how to extract the second table from the factanal function's loadings output?
Hi All,
Can someone help me? The way to do this may be very easy, but I do not know it.
Question 1:
factanal() produces its results in this way:
> fact1 <- factanal(data_withNA, factors = 1, rotation = "none")
> fact1$loadings
Loadings:
Factor1
i1 0.784
i2 0.874
i3 0.786
i4 0.839
i5 0.778
i6 0.859
i7 0.850
i8 0.763
i9 0.810
i10 0.575
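One way to get at the "second table" (SS loadings, Proportion Var) behind the printed output is to work with the loadings matrix directly; a sketch, assuming a fitted object like fact1 (the simulated i1-i10 data below is illustrative):

```r
## Sketch: the print method's summary rows recomputed from the loadings matrix.
set.seed(4)
x <- matrix(rnorm(1000), nrow = 100)
colnames(x) <- paste0("i", 1:10)
fact1 <- factanal(x, factors = 1, rotation = "none")
L <- unclass(fact1$loadings)        # plain matrix: the first table
ssl <- colSums(L^2)                 # SS loadings, per factor
rbind("SS loadings"    = ssl,       # the "second table" shown by print()
      "Proportion Var" = ssl / nrow(L))
```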
2014 Jun 13
2
[LLVMdev] Looking for a fix to memory leak in DWARF support
David, (and everyone else)
I am forced to do some maintenance work on a fairly old LLVM branch
(likely based on release 3.1) that, among other issues, has a major
memory leak somewhere around the DWARF debug support.
In fact, the customer is unable to build with -g at all; they simply run out of
memory on their project...
I seem to remember that there has been a major fix related to it,
2014 Jun 13
4
[LLVMdev] Looking for a fix to memory leak in DWARF support
David,
Thanks for the quick response...
No, at this point I am just getting into the issue... I assume it is a leak, but have no clear proof yet. I was hoping it was an obvious thing, since I recall a discussion about it a while ago... but maybe I am just confused.
Was your work for compressing DWARF data motivated by a certain inefficiency in debug info representation? Did it result in
2007 Nov 21
3
connection to IPC$ denied due to security descriptor
Hey all,
I have a fileserver running Debian Etch and Samba 3.0.24 that I use to serve
media and private home directories. I have a couple of roommates, and
therefore have a couple of accounts on the box for those users. I had
everything working perfectly until last week, when my system drive died.
I've reinstalled everything exactly the same (I think?), but now I am
having problems with
2011 Jan 26
1
Factor rotation (e.g., oblimin, varimax) and PCA
A bit of a newbie to R and factor rotation, I am trying to understand
factor rotations and their implementation in R, particularly the
GPArotation package.
I have tried to reproduce some of the examples that I have found; e.g., I
have taken the values from Jackson's example in "Oblimin Rotation",
Encyclopedia of Biostatistics
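A small sketch of rotating a loadings matrix: base R provides varimax() for the orthogonal case, and the GPArotation package provides oblimin() for the oblique case. The loadings below are made up for illustration, not Jackson's published values:

```r
## Sketch: varimax rotation of an illustrative 5 x 2 loadings matrix.
L <- matrix(c(0.8, 0.7, 0.6, 0.1, 0.2,
              0.1, 0.2, 0.3, 0.8, 0.7), ncol = 2)
r <- varimax(L)                      # rotated loadings and rotation matrix
r
## library(GPArotation); oblimin(L)  # oblique analogue (needs GPArotation)
```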
2016 Apr 18
2
Google Chrome and CentOS 6?
-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA1
On 18/04/16 13:34, Phelps, Matthew wrote:
> Unfortunately, I have neither the time, nor expertise to partake in
> such a project.
is there someone else on here who can help? I think if we can
demonstrate some traction, it would go a long way in both the upstream
engagement and the conversation with Red Hat - since we can then
demonstrate a
2002 Jun 05
5
Monte Carlo
A doctor in veterinary medicine, I have a degree in veterinary epidemiology.
I would like to ask whether someone can perform Monte Carlo simulation with R.
Thanks in advance.
Dr Kane Ismaila
i_kane at hotmail.com
-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-
r-help mailing list -- Read http://www.ci.tuwien.ac.at/~hornik/R/R-FAQ.html
Send "info",
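Since the question is open-ended, a minimal Monte Carlo sketch in R: estimating a normal tail probability by simulation and checking it against the exact value from pnorm(). The target quantity here is chosen purely for illustration:

```r
## Sketch: Monte Carlo estimate of P(X > 1.5) for X ~ N(0, 1).
set.seed(6)
n  <- 1e5
x  <- rnorm(n)
mc <- mean(x > 1.5)      # Monte Carlo estimate
mc
1 - pnorm(1.5)           # exact value for comparison
```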
2014 Jun 13
2
[LLVMdev] Looking for a fix to memory leak in DWARF support
Thanks Eric,
They are doing an LTO build, but with some custom modifications (think a library at a time, as opposed to a whole program). I must admit it is a rather large application as well, so, as expected, any inefficiencies are multiplied greatly.
From the little that I have seen so far, it looks like the debug metadata for an IR object lingers behind once the object itself is eliminated (optimized away).
2014 Jun 13
2
[LLVMdev] Looking for a fix to memory leak in DWARF support
Eric,
Let me clarify a bit... without type uniquing for LTO + debug, will I have a highly inefficient IR representation, or incorrect debug info? If debug info for LTO is known to be non-useful, ambiguous, or flat-out wrong, there is no point in fixing its emission... or will it still be practical, and if I manage to improve it somewhat, will the customer still get some value from using it?
2016 Apr 18
0
Google Chrome and CentOS 6?
On Mon, Apr 18, 2016 at 8:53 AM, Karanbir Singh <kbsingh at centos.org> wrote:
> -----BEGIN PGP SIGNED MESSAGE-----
> Hash: SHA1
>
> On 18/04/16 13:34, Phelps, Matthew wrote:
> > Unfortunately, I have neither the time, nor expertise to partake in
> > such a project.
>
> is there someone else on here who can help ? I think if we can
> demonstrate some
2003 Jun 16
0
new package: eha
A few days ago I uploaded to CRAN a new package called 'eha', which
stands for 'Event History Analysis'. Its main focus is on proportional
hazards modeling in survival analysis, and in that respect eha can
be regarded as a complement and an extension to the 'survival'
package. In fact eha requires survival. Eha contains three functions
for proportional hazards
2004 Sep 10
1
R conversion
I am a newcomer to R trying to convert a SAS program to R.
Does anyone know if there is a functional equivalent of the SAS
'Factor' procedure?
For example in SAS:
proc factor DATA=cor method=principal rotate=varimax proportion=0.9 scree
where 'cor' is a correlation matrix (as in the R 'cor' function)
This should get you a list of eigenvalues as well as a factor
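One possible R counterpart to the SAS call can be sketched with factanal(); note that factanal() fits by maximum likelihood rather than method=principal, and the correlation matrix below is simulated, not the poster's 'cor':

```r
## Sketch: factanal() on a correlation matrix, with varimax rotation,
## plus the eigenvalues PROC FACTOR reports first. Data is illustrative.
set.seed(3)
x <- matrix(rnorm(800), nrow = 100)
colnames(x) <- paste0("v", 1:8)
cor_mat <- cor(x)
fa <- factanal(covmat = cor_mat, factors = 2, rotation = "varimax", n.obs = 100)
fa$loadings                               # rotated factor pattern
eigen(cor_mat)$values                     # eigenvalues of the correlation matrix
## screeplot(princomp(covmat = cor_mat))  # analogue of the scree option
```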
2008 Dec 01
1
factanal question
Dear R users:
I'm wondering if it's possible to get the residual correlation matrix when using factanal.
Since factanal assumes that the errors are normally distributed and independent (provided the factor model fits the data), this would be useful. Of course, you would need to submit the data to the function to get the residuals (not just their correlation matrix), but it should be possible
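A sketch of computing the residual correlation matrix after factanal(), assuming access to the raw data; the model-implied correlation matrix is LL' + Psi, and the simulated x below is illustrative:

```r
## Sketch: residual correlations = observed correlations minus the
## factor-model fit (loadings crossproduct plus uniquenesses).
set.seed(2)
x <- matrix(rnorm(600), nrow = 100)
colnames(x) <- paste0("v", 1:6)
fa        <- factanal(x, factors = 1)
L         <- unclass(fa$loadings)
fit_cor   <- L %*% t(L) + diag(fa$uniquenesses)  # model-implied correlation matrix
resid_cor <- cor(x) - fit_cor                    # residual correlations
round(resid_cor, 3)
```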
2005 Jun 20
1
Factanal loadings as large as 1.2 with promax -- how unusual?
I am performing a large (105-variable) factor analysis with factanal,
specifying promax rotation. I know that some loadings over 1.0 are not
unusual with that rotation, but I have some as large as 1.2, which seems
extreme. I am skirting the assumptions of the model by using responses
on a 7-point rating scale as data; I may have to go back and compute
polychoric correlations instead of product