Hello,

I am using R 1.2.2 on Windows 98 with 64 MB of RAM. When I try to read a
large CSV file I get this error:

    a <- read.csv("c:/all2.csv", header = TRUE)
    Error: vector memory exhausted (limit reached?)
    In addition: Warning message:
    Reached total allocation of 16Mb: see help(memory.size)
    Lost warning messages

From then on, nothing works. File-Exit, Ctrl-F4, q(), the cross in the top
right corner: the same message keeps appearing, and the only solution is to
shut down the computer. Is this a bug?

Thanks for any help.

R. Heberto Ghezzo, Ph.D.
Meakins-Christie Labs
McGill University
Montreal - Canada
heberto at meakins.lan.mcgill.ca

-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-
r-help mailing list -- Read http://www.ci.tuwien.ac.at/~hornik/R/R-FAQ.html
Send "info", "help", or "[un]subscribe" (in the "body", not the subject!)
To: r-help-request at stat.math.ethz.ch
_._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._
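[A common workaround for this situation, not from the original thread, is to
read the file in pieces and keep only what is needed. The sketch below uses
read.csv()'s `nrows` and `colClasses` arguments and reading from an open
connection; the chunk size and processing step are hypothetical placeholders,
and `colClasses` requires a version of read.table that supports it.]

```r
## Read a large CSV in chunks to stay within a small memory footprint.
## First, read a few rows to learn the column types, so R need not guess
## them (and over-allocate) on every chunk.
first   <- read.csv("c:/all2.csv", header = TRUE, nrows = 5)
classes <- sapply(first, class)

chunk.size <- 10000
con <- file("c:/all2.csv", open = "r")
invisible(readLines(con, n = 1))   # consume the header line once
repeat {
  chunk <- read.csv(con, header = FALSE, nrows = chunk.size,
                    colClasses = classes)
  ## ... process or summarise 'chunk' here, discarding raw rows ...
  if (nrow(chunk) < chunk.size) break   # last (short) chunk read
}
close(con)
```

Note that if the row count is an exact multiple of the chunk size, the final
read on the exhausted connection signals an error, so a tryCatch() around the
read may be wanted in practice.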
On Thu, 17 May 2001, Heberto Ghezzo wrote:

> hello,
> I am using R 1.2.2 on win98 with 64 meg of RAM.
> I try to read a big csv file and get the error:
>
> a <- read.csv("c:/all2.csv", header=T)
> Error: vector memory exhausted (limit reached?)
> In addition: Warning message:
> Reached total allocation of 16Mb: see help(memory.size)
> Lost warning messages
>
> from there on, nothing works
> File-Exit, ^F4, q(), top right cross, the same message keeps on
> appearing and the only solution is to close down the computer.
> Is this a bug?

Well, there's clearly a bug involved. You could make a good case for the
bug being in the operating system, though -- you should certainly be able
to kill the R process without restarting the computer.

I think that Windows is prepared to give away more memory than is good for
it, leaving the operating system with no room to work. The --max-mem-size
option is designed to stop this: it specifies the maximum amount of memory
R can ask for.

It sounds like your file is too big for read.csv in the memory you have.
I believe R 1.3.0 has an improved read.table, which might help.

	-thomas
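[A brief illustration of the --max-mem-size option mentioned above, for the
Windows build of R of that era; the 32M figure is an arbitrary example value,
not a recommendation from the thread.]

```r
## From a Windows command prompt or shortcut target: start R with a hard
## cap, so a runaway allocation fails inside R instead of starving the OS.
##   Rgui.exe --max-mem-size=32M

## Inside R, the Windows-only helpers report the cap and current usage:
memory.limit()   # the current allocation limit, in Mb
memory.size()    # memory currently in use by R
```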
I am running R-1.2.3 on Unix:

    > uname -a
    SunOS fluke 5.7 Generic_106541-12 sun4d sparc SUNW,SPARCserver-1000

At startup under XEmacs I am stumped by:

    Error: an xdr real data read error occured
    Fatal error: unable to restore saved data (remove .RData or increase memory)
    Process R exited abnormally with code 2 at Mon Jun 11 07:53:36 2001

My .RData is 9'683'000 bytes long. According to the R FAQ, memory management
should extend the space for R automatically:

    R version 1.2.0 introduces a new "generational" garbage collector,
    which will increase the memory available to R as needed. Hence, user
    intervention is no longer necessary for ensuring that enough memory
    is available.

Since my project is still growing, I will need more than 10 MB in the future.
What can I do apart from removing .RData? How do I increase memory? The FAQ
seems silent on this.

Thanks for help.

-christian

Dr.sc.math. Christian W. Hoffmann
Mathematics and Statistical Computing
Landscape Modeling and Web Applications
Swiss Federal Research Institute WSL
Zuercherstrasse 111
CH-8903 Birmensdorf, Switzerland
phone: ++41-1-739 22 77   fax: ++41-1-739 22 15
e-mail: christian.hoffmann_at_wsl.ch__prevent_spamming
www: http://www.wsl.ch/staff/christian.hoffmann/
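[One way to get a working session back without deleting the workspace, not
from the original message: skip the automatic restore at startup, then try to
load the file by hand. The file name RData.save is a hypothetical choice.]

```r
## From the shell: start R without restoring the workspace, so the session
## comes up even when .RData cannot be read:
##   R --no-restore
##
## Or move the workspace aside first:
##   mv .RData RData.save

## Then, inside the fresh R session, attempt a manual restore:
load("RData.save")   # may fail again if the file is truncated or corrupt
```

If the manual load() fails with the same xdr error, the saved image itself is
likely damaged (e.g. truncated by a full disk), and no memory setting will
recover it. Heap limits of that era could also be raised at startup (see
?Memory for the --min-vsize/--max-vsize family of options), but the
generational collector normally grows the heap on its own, as the FAQ passage
quoted above says.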