ZABALZA-MEZGHANI Isabelle
2003-Sep-24 15:27 UTC
[R] Problem with memory for large datasets
Hello,

I would like to know whether there is a way to "clean" the R memory during an R session. I perform many operations on large objects (matrices of 500 x 5000), and I cannot reach the end of my script because I run out of memory. Of course, I have tried to remove all temporary objects during the script execution and to run the garbage collector, but it seems to have no effect.

Any idea how to solve this problem without exiting R?

Regards,
Isabelle

Isabelle Zabalza-Mezghani
IFP - Reservoir Engineering Department
Rueil Malmaison - France
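A minimal sketch of the clean-up pattern described above, using a hypothetical temporary matrix named tmp: drop the reference with rm() and then call gc() so R can reclaim the memory.

    # hypothetical large intermediate object: a 500 x 5000 matrix of doubles (~20 MB)
    tmp <- matrix(rnorm(500 * 5000), nrow = 500)
    result <- colSums(tmp)   # keep only the summary that is actually needed

    rm(tmp)   # remove the reference to the temporary object
    gc()      # run the garbage collector; it also prints current memory usage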
ZABALZA-MEZGHANI Isabelle wrote:
> Any idea how to solve this problem without exiting R?

After you have removed the unnecessary objects, the only thing left is to increase the memory limit R uses (given that you are on Windows); see ?memory.limit for details. Be careful: raising the limit beyond the physical RAM will make your system swap heavily. The best solution is to buy more memory and/or to optimize your code (if that is possible).

Uwe Ligges
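On Windows builds of R, the limit Uwe refers to can be inspected and raised roughly as follows (a sketch only; the 1024 MB figure is an arbitrary example, and both functions are Windows-specific):

    memory.limit()             # report the current memory limit in MB (Windows only)
    memory.size(max = TRUE)    # maximum amount of memory obtained from the OS so far (MB)
    memory.limit(size = 1024)  # raise the limit to 1024 MB; exceeding physical RAM means heavy swapping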
What have you done to remove the "temporary objects"? Does this include rm(list = ls()) or remove(list = objects())?

Hope this helps.

Spencer Graves

ZABALZA-MEZGHANI Isabelle wrote:
> Of course, I've tried to remove all temporary objects during the script
> execution and to run the garbage collector, but it seems to have no effect.
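For completeness, a sketch of the full workspace clean-up Spencer mentions; ls() and objects() are synonyms that list the objects in the global environment, and rm() and remove() are likewise equivalent:

    rm(list = ls())            # remove every object in the current workspace
    # remove(list = objects()) # identical effect, alternative spelling
    gc()                       # then let the garbage collector reclaim the freed memory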
Hi,

Running gc() helped me a lot.

Regards,
Vladimir

> -----Original Message-----
> From: ZABALZA-MEZGHANI Isabelle
> Sent: Wednesday, September 24, 2003 7:27 PM
> To: help R (E-mail)
> Subject: [R] Problem with memory for large datasets
>
> Any idea how to solve this problem without exiting R?