Hello,

I have trouble with my cluster analysis using the package "cluster". "diana" and "agnes" both seem to allocate memory directly, so I cannot use the virtual memory of my Windows 2000 operating system. I have 320 MB of memory, but they ask for about 600 MB. Is there any chance of doing the analysis with my amount of memory?

Thanks for all comments; I have not found a way yet.

Regards
Georg
Dear Georg,

You may prefer to try clara(), which uses less memory than the other cluster routines (and stands for "clustering large applications"). The documentation for clara() says that all variables must be numeric, which could be a problem if you have nominal or ordinal variables.

Regards,

Andrew C. Ward
CAPE Centre
Department of Chemical Engineering
The University of Queensland
Brisbane Qld 4072 Australia
andreww at cheque.uq.edu.au

Quoting gowuban <gowuban at web.de>:

> Hello,
>
> I have trouble with my cluster analysis using the package "cluster".
> "diana" and "agnes" both seem to allocate memory directly, so I cannot
> use the virtual memory of my Windows 2000 operating system.
> I have 320 MB of memory, but they ask for about 600 MB.
> Is there any chance of doing the analysis with my amount of memory?
> Thanks for all comments; I have not found a way yet.
>
> Regards
> Georg
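A minimal sketch of the clara() route, assuming the data are already held in an all-numeric data frame. The object name mydata, the column selection, and k = 4 are placeholders, not details from the thread:

    library(cluster)

    ## 'mydata' and the column names are hypothetical; keep numeric columns only.
    x <- na.omit(mydata[, c("var1", "var2", "var3")])
    x <- scale(x)  # standardise so no single variable dominates the dissimilarities

    ## clara() clusters repeated sub-samples rather than the full n x n
    ## dissimilarity matrix, so memory use stays modest even for large n;
    ## 'samples' and 'sampsize' control how many sub-samples and how big.
    cl <- clara(x, k = 4, samples = 50, sampsize = 200)

    table(cl$clustering)  # cluster sizes
    plot(cl)              # clusplot and silhouette plot

The default samples = 5 is kept small for historical reasons; larger values, as above, generally give more stable results at a modest cost in time.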
Have you read the rw-FAQ and set the amount of memory available to the R process via --max-mem-size? [I doubt it! Please do read the FAQs.]

On Sun, 10 Aug 2003, gowuban wrote:

> I have trouble with my cluster analysis using the package "cluster".
> "diana" and "agnes" both seem to allocate memory directly, so I cannot
> use the virtual memory of my Windows 2000 operating system. I have
> 320 MB of memory, but they ask for about 600 MB. Is there any chance
> of doing the analysis with my amount of memory? Thanks for all
> comments; I have not found a way yet.

--
Brian D. Ripley,                  ripley at stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel:  +44 1865 272861 (self)
1 South Parks Road,                     +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595
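For concreteness, the Windows-specific mechanism looks roughly like this; the 1000 MB figures below are purely illustrative:

    ## Start R for Windows with a larger memory ceiling, e.g.
    ##   Rgui.exe --max-mem-size=1000M
    ## The ceiling may exceed physical RAM; Windows then backs it with the
    ## paging file, at some cost in speed.

    ## Inside a running Windows session the limit can also be inspected and raised:
    memory.limit()             # current limit, in MB
    memory.limit(size = 1000)  # raise to about 1 GB (illustrative value)
    memory.size()              # MB currently in use by R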
On Sun, 10 Aug 2003, Andrew C. Ward wrote:

> You may prefer to try clara() which uses less memory than
> the other cluster routines (and stands for "clustering
> large applications"). The documentation for clara() says
> that all variables must be numeric, which could be a
> problem if you have nominal or ordinal variables.

Clara is a partitioning method, not a hierarchical method of clustering like agnes and diana. To use clara you have to know how many clusters you want (and you also have to want the sort of clusters it generates: spherical ones, like kmeans).

> Quoting gowuban <gowuban at web.de>:
>
> > I have trouble with my cluster analysis using the package "cluster".
> > "diana" and "agnes" both seem to allocate memory directly, so I cannot
> > use the virtual memory of my Windows 2000 operating system.
> > I have 320 MB of memory, but they ask for about 600 MB.
> > Is there any chance of doing the analysis with my amount of memory?

--
Brian D. Ripley,                  ripley at stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel:  +44 1865 272861 (self)
1 South Parks Road,                     +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595
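Two hedged follow-ups on that point. First, since clara() needs k up front, one common heuristic is to compare average silhouette widths across candidate values of k. Second, if a hierarchy (as agnes or diana would give) is genuinely required, clustering a random sub-sample is one way to stay within 320 MB. The object names and sizes below are illustrative, not from the thread:

    library(cluster)

    ## 'x' is assumed to be the scaled, all-numeric matrix from the earlier sketch.

    ## Pick k by comparing average silhouette widths over a range of candidates.
    ks  <- 2:8
    asw <- sapply(ks, function(k) clara(x, k)$silinfo$avg.width)
    best_k <- ks[which.max(asw)]
    best_k

    ## If hierarchical output is really needed, agnes() on a random sub-sample
    ## keeps the dissimilarity matrix small enough to fit in memory.
    idx <- sample(nrow(x), 1000)   # sub-sample size is illustrative
    ag  <- agnes(x[idx, ], method = "ward")
    plot(ag)                       # banner and dendrogram

The memory problem itself comes from the n x n dissimilarity matrix that agnes and diana build, which is why sub-sampling (or clara's internal sub-sampling) helps and a larger swap file only partly does.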