cdhershberger@dow.com
2001-Apr-17 03:34 UTC
[Rd] cannot allocate vector of size 71773 Kb (PR#915)
Full_Name: Doug Hershberger
Version: 1.2.2
OS: Red Hat Linux 7.0
Submission from: (NULL) (216.99.65.36)

In the R FAQ I find the following entry explaining that R no longer has
problems with memory:

http://cran.r-project.org/doc/FAQ/R-FAQ.html#Why%20does%20R%20run%20out%20of%20of%20memory%3f

However, in my installation (R Version 1.2.2 (2001-02-26), installed from
the Red Hat RPM on your site on a Red Hat 7.0 i686) I get the following
error when working with large data sets:

> source("/usr/local/genex/rcluster/lib/rcluster/r/hcluster.r");
> breadth.program("uploaded_data.txt", "average", 10)
Read 2 items
Read 8574 items
Error: cannot allocate vector of size 71773 Kb
Execution halted

Is there a way to fix this? This is running as part of the GeneX
installation, so I would have to dig through it to figure out how to give
R more memory. Besides which, I don't think that will work.

Thanks for any insight that you can provide.
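As a quick back-of-the-envelope check (not part of the original report),
the reported size translates into an element count as follows, assuming
the vector holds double-precision numbers, as R numeric vectors do:

  ## R reports the failed request in Kb; each double takes 8 bytes.
  kb    <- 71773
  bytes <- kb * 1024        # 73,495,552 bytes, roughly 70 Mb
  elems <- bytes / 8        # about 9.2 million doubles in a single vector
  c(bytes = bytes, elements = elems)

So the failure is a single allocation of roughly 70 Mb, on top of whatever
the clustering code has already allocated.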
cdhershberger@dow.com wrote:

> Full_Name: Doug Hershberger
> Version: 1.2.2
> OS: Red Hat Linux 7.0
> Submission from: (NULL) (216.99.65.36)
>
> In the R FAQ I find the following entry explaining that R no longer has
> problems with memory.
>
> http://cran.r-project.org/doc/FAQ/R-FAQ.html#Why%20does%20R%20run%20out%20of%20of%20memory%3f
>
> However in my installation: R Version 1.2.2 (2001-02-26)
>
> Installed from the Red Hat RPM on your site on a Red Hat 7.0 i686
>
> I get the following error when working with large data sets:
>
> > source("/usr/local/genex/rcluster/lib/rcluster/r/hcluster.r");
> > breadth.program("uploaded_data.txt", "average", 10)
> Read 2 items
> Read 8574 items
> Error: cannot allocate vector of size 71773 Kb
> Execution halted
>
> Is there a way to fix this? This is running as part of the GeneX
> installation so I would have to dig through to figure out how to give R
> more memory.
>
> Besides which I don't think that will work.
>
> Thanks for any insight that you can provide.

That is NOT A BUG!

Have a look at ?Memory

Start R with --max-mem-size=xxxM

Uwe Ligges
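For readers hitting the same wall, a minimal sketch of acting on this
advice. The flag shown is the one suggested above (it is the Windows-style
option; the options actually available depend on your platform and R
version, so check ?Memory on your own installation), and gc() is simply
used to see what R has allocated so far:

  ## From the shell, start R with a larger memory ceiling, for example
  ##   R --max-mem-size=256M   (flag as suggested above; see ?Memory for
  ##                            the exact options on your platform/version)
  ## Then, inside R, check how much memory is actually in use:
  gc()   # reports cons cells and vector heap currently used, and the triggers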
ripley@stats.ox.ac.uk
2001-Apr-17 06:17 UTC
[Rd] cannot allocate vector of size 71773 Kb (PR#915)
On Tue, 17 Apr 2001 cdhershberger@dow.com wrote:

> Full_Name: Doug Hershberger
> Version: 1.2.2
> OS: Red Hat Linux 7.0
> Submission from: (NULL) (216.99.65.36)
>
> In the R FAQ I find the following entry explaining that R no longer has
> problems with memory.

No longer has the same problems....  There is also a section on `What is
a bug?' that you should read again.

> http://cran.r-project.org/doc/FAQ/R-FAQ.html#Why%20does%20R%20run%20out%20of%20of%20memory%3f
>
> However in my installation: R Version 1.2.2 (2001-02-26)
>
> Installed from the Red Hat RPM on your site on a Red Hat 7.0 i686
>
> I get the following error when working with large data sets:
>
> > source("/usr/local/genex/rcluster/lib/rcluster/r/hcluster.r");
> > breadth.program("uploaded_data.txt", "average", 10)
> Read 2 items
> Read 8574 items
> Error: cannot allocate vector of size 71773 Kb
> Execution halted
>
> Is there a way to fix this? This is running as part of the GeneX

Install more memory, or write better R code, or use better statistical
methods.

> installation so I would have to dig through to figure out how to give R
> more memory.
>
> Besides which I don't think that will work.

We would need to know a lot more about the code to see the problem, but
as a quick guess, clustering 8574 items with the standard algorithms
takes a very large amount of memory (on any system), so it would be the
third alternative.

--
Brian D. Ripley,                  ripley@stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel:  +44 1865 272861 (self)
1 South Parks Road,                     +44 1865 272860 (secr)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595
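To attach a rough number to "a very large amount of memory": the sketch
below is an illustration, not taken from the thread, and it assumes the
GeneX code uses the standard agglomerative approach that starts from a
full dissimilarity object (the actual hcluster.r code is not shown here).

  ## Standard hierarchical clustering works from all pairwise dissimilarities:
  ## a dist object stores n*(n-1)/2 doubles (8 bytes each), and forming a
  ## full n x n matrix costs roughly twice that again.
  n     <- 8574
  pairs <- n * (n - 1) / 2               # about 36.8 million dissimilarities
  c(dist_Mb        = pairs * 8 / 2^20,   # ~280 Mb for the dist object alone
    full_matrix_Mb = n^2   * 8 / 2^20)   # ~560 Mb if an n x n matrix is built

Either figure dwarfs the single 70 Mb allocation that finally failed,
which supports the point that the method, not R, is the bottleneck.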
Erich Neuwirth
2001-Apr-17 10:24 UTC
[Rd] converting body of a function to a character vector
Have a look at the following session:

> fff <- function(x)
+ {
+   x*x
+ }
> fff
function(x)
{
  x*x
}
>
> body(fff)
{
  x * x
}
> as.character(body(fff))
[1] "{"     "x * x"

Somehow, the closing brace gets lost in the conversion.

I need this because I want to get the code of a function as a vector of
strings, so I can get it into Excel with my interface package.

--
Erich Neuwirth, Computer Supported Didactics Working Group
Visit our SunSITE at http://sunsite.univie.ac.at
Phone: +43-1-4277-38624   Fax: +43-1-4277-9386
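One workaround for the stated goal (getting a function's code as a vector
of strings) that is not mentioned in the post itself: deparse() returns
the source of an object as a character vector, one string per line, with
both braces intact.

  fff <- function(x) {
    x * x
  }

  deparse(fff)         # whole definition, one string per line, e.g.
                       #   "function (x) " "{" "    x * x" "}"
  deparse(body(fff))   # just the body: "{" "    x * x" "}"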
> I get the following error when working with large data sets:
>
> > source("/usr/local/genex/rcluster/lib/rcluster/r/hcluster.r");
> > breadth.program("uploaded_data.txt", "average", 10)
> Read 2 items
> Read 8574 items
> Error: cannot allocate vector of size 71773 Kb
> Execution halted
>
> Is there a way to fix this? This is running as part of the GeneX
> installation so I would have to dig through to figure out how to give R
> more memory.

In R 1.2.2 this message means the operating system is not letting R have
the memory. You do not need parameters on the R command line (and it seems
better not to have them), but you do need to do some things in the
operating system before you start R.

On Unix/Linux you first need to check that your datasize and stacksize
limits are not set, as they usually are. Use limit or unlimit, depending
on your shell or OS.

You then need to have adequate swap space. Physical memory will make
things faster, but is not necessary. You may need a very large swap space
if you are going to do much with an 80M vector, though perhaps it gets
broken into smaller pieces once you get it loaded. Given current memory
prices you should probably consider more physical memory, but I expect you
will need several gigs of virtual memory to do much work with an 80M
vector.

Also beware that, as I recall, mkswap in Linux defaults to an older swap
format, and the newer format is faster.

Paul Gibert
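As an illustration of these checks (my own sketch, not part of the message
above; it assumes a Linux box whose /bin/sh is bash, so the ulimit builtin
is available, and that the usual free utility is installed), they can be
run from inside R:

  system("ulimit -a")   # limits inherited by the R process (data, stack,
                        # memory); csh users would run `limit` in their shell
  system("free")        # physical memory and swap currently available (Linux)
  gc()                  # how much R itself has allocated so far

If the data or stack limits are small, raise or remove them in the shell
that launches R before worrying about swap space.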