Here is my system memory:

ronggui at 0[ronggui]$ free
             total       used       free     shared    buffers     cached
Mem:        256728      79440     177288          0       2296      36136
-/+ buffers/cache:      41008     215720
Swap:       481908      60524     421384

I want to cluster my data using hclust. My data has 3 variables and 10,000 cases, but the call fails, saying there is not enough memory for the vector size. I read the help documentation and started R 2.1.0beta under Debian Linux with

$ R --max-vsize=800M

but it still cannot produce a result. So is my PC's memory simply not enough for this analysis, or have I made a mistake in setting the memory limits?

Thank you.
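For concreteness, the failing analysis amounts to something like the following sketch (simulated rnorm() data stand in for the real 10,000 x 3 data set, which is not included here):

x   <- matrix(rnorm(10000 * 3), ncol = 3)  # 10,000 cases, 3 variables
d   <- dist(x)                             # the large allocation happens here
fit <- hclust(d)                           # agglomerative clustering on the distances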
How much memory is free when R fails (e.g., what does "top" show while trying to run your clustering)? If there's still a sizeable amount of free memory, you may have to look into the system limits, the maximum data segment size in particular. Many Linux distros have it set to "unlimited", but default Debian may not. If this turns out to be the problem, please do not, _do not_ raise it to "unlimited," but only to enough for R to work.

hth,
jon b

On Wed, 30 Mar 2005 18:36:37 +0800 ronggui <0034058 at fudan.edu.cn> wrote:
> [...]
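A rough sketch of how one might check these figures from within a running R session (the specific commands are illustrative, assuming a Unix-alike shell on the PATH; they are not from the thread):

system("free")       # system-wide memory, same output as at the shell
system("ulimit -d")  # maximum data segment size, as inherited from the R process (kB or "unlimited")
gc()                 # how much memory R itself has allocated so far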
hclust creates a distance matrix. In your case it is 10,000 x 10,000. For various reasons several copies are created, so you probably need at least 100M entries x 8 bytes per entry x 3 copies = 2.4 GB just for the distance matrix. If you don't have that much RAM, the computation will probably take longer than you're willing to wait.

Reid Huntsinger

-----Original Message-----
From: r-help-bounces at stat.math.ethz.ch [mailto:r-help-bounces at stat.math.ethz.ch] On Behalf Of ronggui
Sent: Wednesday, March 30, 2005 5:37 AM
To: r-help at stat.math.ethz.ch
Subject: [R] about memory

[...]
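Spelling that estimate out as a back-of-the-envelope calculation in R (an illustration of the arithmetic above, not a measurement of actual usage):

n       <- 10000
entries <- n * n         # 1e8 entries in a full 10,000 x 10,000 matrix of doubles
bytes   <- entries * 8   # 8 bytes per double -> 8e8 bytes, roughly 800 MB
bytes * 3 / 1e9          # three working copies -> about 2.4 GB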