I am doing a cluster analysis of 8768 respondents on 5 lifestyle variables and am having difficulty constructing the dissimilarity matrix I will use for PAM. I always get the error “cannot allocate vector of size 293.3 Mb”, even though I have already increased my memory limit to its maximum of 4000 Mb. I am on a 32-bit OS with 2 GB of RAM. I tried ff and filehash and I still get the same error. Can you please help me?
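The 293.3 Mb in the error message is, to within rounding, exactly the size of the lower triangle of an 8768 x 8768 dissimilarity matrix stored as doubles, so the allocation that fails is the 'dist' object itself. A quick sketch of the arithmetic (nothing assumed here beyond n = 8768):

    n <- 8768
    ## a 'dist' object stores one double per pair of observations
    n.pairs <- n * (n - 1) / 2   # 38,434,528 pairwise dissimilarities
    n.pairs * 8 / 2^20           # ~293.2 Mb at 8 bytes per double

Because daisy()/dist() and then pam() each need working copies on top of that single vector, a 32-bit session with roughly 2 Gb of addressable memory can easily run out, which is likely why ff and filehash did not help: pam() operates on an in-memory 'dist' object or data matrix.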
Hi Penny,

Could you provide the code you are using? (Also, a subject line on the e-mail would have been nice. :) )

Tal

----------------Contact Details:-------------------------------------------------------
Contact me: Tal.Galili@gmail.com | 972-52-7275845
Read me: www.talgalili.com (Hebrew) | www.biostatistics.co.il (Hebrew) | www.r-statistics.com (English)
----------------------------------------------------------------------------------------------
On Fri, 2010-10-22 at 05:54 -0700, Penny Adversario wrote:
> I always get an error: “cannot allocate vector of size 293.3 Mb” [...]

You probably don't have quite enough RAM on your system. I can manage this on a similar-sized data set to yours, using dummy data (rnorm(8768*5)), with 3.7 Gb of RAM available on Linux. Did you leave all the other arguments unchanged?

    ## dummy data: 8768 rows, 5 columns
    df <- data.frame(matrix(rnorm(8768 * 5), ncol = 5))
    require(cluster)
    foo <- pam(df, k = 4)   # PAM with 4 medoids

works OK for me.

HTH

G
--
%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%
Dr. Gavin Simpson             [t] +44 (0)20 7679 0522
ECRC, UCL Geography,          [f] +44 (0)20 7679 0565
Pearson Building,             [e] gavin.simpsonATNOSPAMucl.ac.uk
Gower Street, London          [w] http://www.ucl.ac.uk/~ucfagls/
UK. WC1E 6BT.                 [w] http://www.freshwaters.org.uk
%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%
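If the full dissimilarity object genuinely will not fit in memory, the usual fallback within the same cluster package is clara(), which runs pam() on random subsets of the data and never materialises all choose(n, 2) dissimilarities. A minimal sketch on dummy data of the same size; the samples/sampsize values below are illustrative, not tuned recommendations:

    require(cluster)
    df <- data.frame(matrix(rnorm(8768 * 5), ncol = 5))
    ## clara() clusters by applying pam() to 'samples' random subsets of
    ## 'sampsize' rows each, so the quadratic memory cost scales with
    ## sampsize rather than with the full n = 8768
    cl <- clara(df, k = 4, samples = 50, sampsize = 200)
    table(cl$clustering)   # cluster sizes over the full data set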