I am a lurker on the mailing list and a quiet but active user of R. I recently needed to do some exploration and modelling on a large set of data. In the past I would have been sceptical about how well R's memory management and garbage collection would perform; I fully expected R or my operating system to grind to a crashing halt. However, I was delighted with the memory performance. Garbage *was* collected. The memory footprint of the process did not explode. I got answers (and even the answers I wanted) in a remarkably short amount of time. I was able to do my work on a modestly configured machine (Linux, though that is less important) with 256Mb of memory. I know that in the past these same types of analyses would have played havoc on a machine with 2Gb of memory.

So thank you to the people who have finally tamed the memory monster (you know who you are, and I think I do too).

Best Regards,
--Mike

Mike Meyer, Salter Point Associates, Seattle WA