During an R session (version 1.4.1 under Windows) I ended up getting the
following error message:

  Error: vector memory exhausted (limit reached?)

regardless of which command I tried: rm(something), q(), ls(), gc(),
memory.size(), memory.limit(300000000), ...

What could I do now to save my data? How could I prevent this situation?

Thanks for any hint,
Nikolaus Hansen
Nikolaus Hansen wrote:
> During an R session (version 1.4.1 under Windows) I ended up getting
> the following error message:

R-1.5.1 is the current release, by the way.

> Error: vector memory exhausted (limit reached?)
>
> regardless of which command I tried: rm(something), q(), ls(), gc(),
> memory.size(), memory.limit(300000000), ...

?memory.limit says:

  Usage:     memory.limit(size = NA)
  Arguments: size [...] request a new limit, in Mb.
                                                 ^^
So something like memory.limit(500) is more reasonable, depending on the
amount of RAM in your machine.
It is much more convenient to set --max-mem-size=500M on the command
line; how this works is described in the R for Windows FAQ.

> What could I do now to save my data? How could I prevent this situation?

a) Make sure a reasonable amount of memory was really allocated.
b) Rewrite your R code to consume less memory, and think about what
   really needs to be in memory at one time. Can you divide the
   computations into pieces that are "smaller" in terms of memory usage?
   (See the sketch below.)
c) Buy some more RAM.
d) Wait ten years for a much bigger computer. ;-)

Uwe Ligges
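A minimal sketch of points a) and b), assuming a Windows build of R
(memory.size() and memory.limit() are Windows-only) and a hypothetical
file bigdata.txt; all sizes here are examples only:

    ## a) Check what R is actually using and allowed to use (in Mb):
    memory.size()            # Mb currently in use
    memory.size(max = TRUE)  # Mb obtained from Windows so far
    memory.limit()           # current limit in Mb

    ## The unit is Mb, so memory.limit(300000000) asked for roughly
    ## 300 terabytes; request something your machine can provide:
    memory.limit(size = 500)

    ## b) Divide the work into memory-friendly pieces: read a large
    ## file 10000 rows at a time and keep only summaries of each chunk.
    con <- file("bigdata.txt", open = "r")
    repeat {
        chunk <- read.table(con, header = FALSE, nrows = 10000)
        ## ... compute and store only the summaries you need ...
        last <- nrow(chunk) < 10000   # a short read means end of file
        rm(chunk)
        gc()                          # hand the chunk's memory back
        if (last) break
    }
    close(con)

(If the row count happens to be an exact multiple of the chunk size, the
final read.table() call will fail on the exhausted connection; wrap the
read in try() to be safe.)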
I have run into similar problems occasionally on Windoze NT. For
whatever reason, R refuses to do anything, including q("no"). The only
way out was to kill it via the Task Manager. I am not sure how to make
this reproducible. Unfortunately Uwe's comments, though mostly true, are
not exactly helpful in that situation.

Cheers,
Andy

> -----Original Message-----
> From: Uwe Ligges [mailto:ligges at statistik.uni-dortmund.de]
> Sent: Saturday, August 17, 2002 12:27 PM
> To: Nikolaus Hansen
> Cc: r-help at stat.math.ethz.ch
> Subject: Re: [R] Out of memory
>
> [Uwe's message quoted in full; snipped -- see above]
> > But you did not comment on all the reasonable commands I typed in
> > (Andy added another one) without getting a reasonable response from
> > R. Why not? Thus it seemed to me that you would expect and accept
> > that kind of behaviour from R: e.g. loading some data that exceed
> > the current memory limit will result in R refusing any further
> > command (more precisely: getting a prompt and an invariable response
> > to any typed command). This is not exactly helpful...
>
> Would you prefer the process to be terminated?

No. Even though, in effect, there would be no difference.

> You used up all the resources allocated to R. None could be freed by
> garbage collection. What was R supposed to do? There was no memory
> left to do anything else in.

It is easy to say what I would expect from the user's point of view: an
error message if I try to use too much memory, and afterwards the
possibility to free memory, e.g. by rm(some.of.my.data) and/or a garbage
collection. I have no idea whether this would be difficult to guarantee
under all circumstances.

> I think this circumstance is very rare (normally you get enough memory
> back to do something). I would prefer it to allowing R to take over
> all the VM on a computer, when you probably need to reboot. Those who
> have not seen this happen on a departmental server (say) should
> consider themselves fortunate.

Yes, I do.

Thanks for the reply,
Nikolaus Hansen
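In the common case described above, where the failed allocation raises
an error but the session stays responsive, the rm()/gc() recovery asked
for here does work. A small illustration, with made-up object names and
sizes:

    ## Suppose 'big' already consumes much of the allowed memory:
    big <- matrix(rnorm(5e6), ncol = 100)   # about 40 Mb of doubles

    ## An allocation beyond the limit fails with an error message,
    ## but normally leaves the session usable:
    huge <- try(numeric(5e8))               # ~4 Gb: "cannot allocate ..."

    ## Recover: drop what you can, collect the garbage, save the rest.
    rm(big)
    gc()                                    # reports and frees memory
    save.image(file = "rescue.RData")       # hypothetical file name
    q("no")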