Dear all,

I observed that the R process grows over time (I was working in the same
session for two days, and its size reached 300 Mb). Explicit calls to
'gc()' showed a trigger around 90 Mb but did not change the RSS (whereas
I remember that my Linux used to behave nicely before, with the process
size shrinking when R freed memory...). I ended up saving the session,
quitting, and starting again (after which the process size was around
90 Mb).

I do not know whether this was caused by the recent changes in the
garbage collection, or by possible leaks in the libraries/functions I
have been using (mainly 'hclust' and 'cmdscale').

Has anybody experienced something similar?

L.
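A minimal sketch of the comparison involved, assuming a Linux system;
the clustering workload below is only an illustration, not the actual
session:

    ## Compare R's own accounting with the operating system's view of
    ## the process; 'hclust' on random data stands in for the real work.
    x <- matrix(rnorm(1000 * 100), nrow = 1000)
    h <- hclust(dist(x))        # creates large intermediate objects
    rm(x, h)
    gc()                        # R-level usage after a collection
    ## The OS-level resident set size (RSS) on Linux, for comparison:
    system(paste("ps -o rss= -p", Sys.getpid()))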
Laurent Gautier <laurent at cbs.dtu.dk> writes:

> I observed that the R process grows over time (I was working in the
> same session for two days, and its size reached 300 Mb). Explicit
> calls to 'gc()' showed a trigger around 90 Mb but did not change the
> RSS [...]
>
> I do not know whether this was caused by the recent changes in the
> garbage collection, or by possible leaks in the libraries/functions
> I have been using (mainly 'hclust' and 'cmdscale').
>
> Has anybody experienced something similar?

Achim found a rather nasty memory leak, traced to the deparsing code.
You might want to try the current r-patched snapshot and see whether
your problem goes away.

-- 
   O__  ---- Peter Dalgaard             Blegdamsvej 3
  c/ /'_ --- Dept. of Biostatistics     2200 Cph. N
 (*) \(*) -- University of Copenhagen   Denmark      Ph:  (+45) 35327918
~~~~~~~~~~ - (p.dalgaard at biostat.ku.dk)             FAX: (+45) 35327907
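One hedged way to check whether a given build still leaks in this area:
repeat a deparse-heavy operation and watch whether the totals reported
by gc() keep climbing instead of levelling off. The expression being
deparsed below is arbitrary, not the actual offending code path:

    ## Run batches of deparse() calls; in a leak-free build the "used"
    ## cell counts should stabilise rather than grow batch after batch.
    for (batch in 1:5) {
        for (i in 1:10000) tmp <- deparse(quote(f(x, y + z)))
        print(gc()[, "used"])   # Ncells/Vcells in use after this batch
    }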
Hello Laurent,

Sunday, October 13, 2002, 7:33:47 PM, you wrote:

LG> I observed that the R process grows over time (I was working in the
LG> same session for two days, and its size reached 300 Mb). Explicit
LG> calls to 'gc()' showed a trigger around 90 Mb but did not change
LG> the RSS [...]

LG> Has anybody experienced something similar?

Hello --- I think I did notice something similar while working with
R 1.6.0 on Windows 98. I remember that I was working on some graphs. I
didn't perform any `heavy' calculations and used no large datasets,
though I did produce the graphs many times while working on details. I
didn't run any applications other than R and a text editor.

After an hour or so of work I wanted to go to the Desktop by pressing
the "show desktop" button next to the "Start" button, and received
something like "There isn't enough memory to perform this operation".
Everything was OK when I restarted R.

I didn't know about the gc() function, so I don't have any `numbers'. I
thought it occurred because of the `instability' of Windows, but now
that it happens on other systems as well, Windows might not be the only
problem.

mb.

~,~`~,~`~,~`~,~`~,~`~,~`~,~`~,~
Michal Bojanowski
Institute for Social Studies
University of Warsaw
http://www.iss.uw.edu.pl
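If it recurs, those `numbers' can be collected without calling gc() by
hand: gcinfo() makes R print its memory totals at every collection. A
small sketch:

    ## Have R report memory totals automatically at each garbage
    ## collection, leaving a trail of numbers to include in a report.
    gcinfo(TRUE)    # enable per-collection reporting
    ## ... redraw the graphs / repeat the workload here ...
    gcinfo(FALSE)   # turn the reporting off again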
> Has anybody experienced something similar?

I have experienced similar problems with R 1.6.0. I have simulation
code (using nothing but functions in base and in nlme) that had run
without problems throughout the 1.5.x series; as soon as I upgraded,
the code kept consuming physical and virtual memory during processing
until it brought the system to a near deadlock.

J.R. Lockwood
412-683-2300 x4941
lockwood at rand.org
http://www.rand.org/methodology/stat/members/lockwood/
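A hedged sketch of how such a regression can be narrowed down: log R's
cell usage across simulation iterations and plot it. The lm() fit below
is only a placeholder for the actual nlme-based simulation step:

    ## Record memory in use after each iteration; steady growth despite
    ## rm() and gc() points at a leak rather than normal allocation.
    n.iter <- 100
    cells.used <- numeric(n.iter)
    for (i in 1:n.iter) {
        d <- data.frame(x = rnorm(50), y = rnorm(50))
        fit <- lm(y ~ x, data = d)   # stand-in for the real workload
        rm(d, fit)
        cells.used[i] <- sum(gc()[, "used"])
    }
    plot(cells.used, type = "l",
         xlab = "iteration", ylab = "Ncells + Vcells in use")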