Hi,

I've got trouble using R. When I want to load a file that contains 93
thousand rows and 22 columns of data (essentially floats), R shows me
this error message:

"heap size trouble"

Could anyone tell me what parameter I should specify before launching R
in order to load my big file?

Thanks a lot
Karamian,

have a look at "help(memory)" ...

You can enlarge the memory reserved for R by specifying the command line
arguments "--vsize" and "--nsize". When using ESS, type "C-u M-x R" and
ESS will prompt for arguments to pass to R.

Uli Flenker
Institute of Biochemistry
German Sports University Cologne
Carl-Diem-Weg 6
50933 Cologne / Germany
Phone 0049/0221/4982-493

On Tue, 30 May 2000, karamian wrote:
> When I want to load a file that contains 93 thousand rows and 22
> columns of data (essentially floats), R shows me the error message
> "heap size trouble". Could anyone tell me what parameter I should
> specify before launching R in order to load my big file?

-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-
r-help mailing list -- Read http://www.ci.tuwien.ac.at/~hornik/R/R-FAQ.html
Send "info", "help", or "[un]subscribe"
(in the "body", not the subject !)  To: r-help-request at stat.math.ethz.ch
_._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._
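[Editor's note: a quick sketch of the command-line suggestion above; the
flag values here are illustrative, not recommendations, and should be
chosen to suit your data size and available RAM.]

```
# Start R with an enlarged vector heap (--vsize) and more
# cons cells (--nsize); 40M / 1000000 are illustrative values.
R --vsize 40M --nsize 1000000
```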
You need to change R_VSIZE in .Renviron, I believe, at least if you are
using Unix. I.e.

    R_VSIZE=<n>M

where <n> is a number larger than what you are currently using, and M
stands for megabytes, e.g.

    R_VSIZE=35M

You can use gc() to see your current memory situation. E.g. mine gives:

> gc()
          free   total (Mb)   <- currently available
Ncells  839332 1024000 19.6
Vcells 3258006 4587520 35.0

This is all for Unix; it might differ on other platforms. In any case,
look at the section in the FAQ about memory, or do

> help(Memory)

I've found memory problems with large data sets in R a big pain.
Sometimes I just run out of memory and have to give up. I wish there
were a magical way around this, but there doesn't seem to be.

Yours,
Faheem.

On Tue, 30 May 2000, karamian wrote:
> When I want to load a file that contains 93 thousand rows and 22
> columns of data (essentially floats), R shows me the error message
> "heap size trouble".
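[Editor's note: a sketch of the .Renviron approach described above.
Values are illustrative; R_NSIZE is the analogous setting for the
number of cons cells. Check help(Memory) on your version of R for the
exact names honoured.]

```
# ~/.Renviron -- illustrative values only
R_VSIZE=40M
R_NSIZE=1000000
```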
Oooops, it's "help(Memory)", not "help(memory)"!

Sorry for this ...

Uli Flenker
Institute of Biochemistry
German Sports University Cologne
Carl-Diem-Weg 6
50933 Cologne / Germany
Phone 0049/0221/4982-493

On Tue, 30 May 2000, Uli Flenker, Raum 704 wrote:
> Karamian,
> have a look at "help(memory)" ...
>
> You can enlarge the memory reserved for R by specifying the command
> line arguments "--vsize" and "--nsize".
karamian wrote:
> ...I want to load a file that contains 93 thousand rows and 22 columns
> of data (essentially float)...

I just had to process over 199000 records with four numeric values. If
I remember correctly, I used:

    --vsize 30M --nsize 500000

which pretty much ate all the RAM (64M) I had. Don't forget to rm() big
data sets before you exit, or R will bomb when you next try to load the
saved workspace without the increased memory. Just reread from the data
file when you need them again (and it helps to exit other apps before
starting R to avoid disk thrashing).

Jim
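[Editor's note: the cleanup Jim describes might look like this in a
session; the object and file names are hypothetical.]

```r
## Hypothetical session sketch: "big" and "bigfile.dat" are made up.
big <- read.table("bigfile.dat")  # load the large data set
## ... analysis ...
rm(big)   # drop large objects before quitting,
gc()      # so the saved workspace stays small enough to reload
q("yes")  # save the (now small) workspace and exit
```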
Jim Lemon <bitwrit at ozemail.com.au> writes:
> I just had to process over 199000 records with four numeric values.
> If I remember correctly, I used:
>
>     --vsize 30M --nsize 500000
>
> which pretty much ate all the RAM (64M) I had.

Another approach is to store such a large table in a relational
database and load it into R from the database. There are several
interfaces between R and relational databases.
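[Editor's note: one way such an interface can look, sketched here with
the RODBC package; this package is assumed to be installed, the DSN and
table name are made up, and other database interfaces differ in
detail.]

```r
## Hypothetical sketch; "mydsn" and "bigtable" are placeholders.
library(RODBC)
channel <- odbcConnect("mydsn")                    # open a connection
dat <- sqlQuery(channel, "SELECT * FROM bigtable") # fetch into a data frame
odbcClose(channel)
```

A side benefit is that the database can do the subsetting, so a query
like "SELECT col1, col2 FROM bigtable WHERE ..." brings only the rows
and columns you actually need into R.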
Doesn't using an external DB just delay the virtual memory issue?
Eventually, when the data is manipulated in R, the memory requirement
is still the same - or am I missing something?

John Strumila

-----Original Message-----
From: Douglas Bates [mailto:bates at stat.wisc.edu]
Sent: Wednesday, 31 May 2000 23:36
To: Jim Lemon
Cc: r-help at stat.math.ethz.ch
Subject: Re: [R] heap size trouble

> Another approach is to store such a large table in a relational
> database and load it into R from the database. There are several
> interfaces between R and relational databases.