Martin M has suggested I widen this discussion to R-devel, and

> I agree that we should increase them, but I'm not sure at all about
> the amount.
>
> The default could even depend on the architecture (via "./configure").

Views, please.

------------- Begin Forwarded Message -------------

Is it not time we increased the defaults a bit?  As the base gets bigger
I hit 200k cons cells rather frequently, and 2Mb of heap seems low
compared with 3Mb of cons cells and 1.8Mb for the R binary.

How little memory do people have these days?  Except possibly on Windows
and in old teaching labs, I would have thought a 15Mb default for R was
very reasonable, which is about

    --vsize 6Mb --nsize 300k

On Solaris that gives:

      PID USERNAME THR PRI NICE  SIZE   RES STATE   TIME    CPU COMMAND
     9308 ripley     1 -25    0   15M 8360K sleep   0:02  4.35% R.binary

as against the default:

      PID USERNAME THR PRI NICE  SIZE   RES STATE   TIME    CPU COMMAND
     9309 ripley     1 -25    0 9664K 6416K sleep   0:02  4.66% R.binary

and that extra user memory makes a lot of difference.

------------- End Forwarded Message -------------

--
Brian D. Ripley,                  ripley@stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel: +44 1865 272861 (self)
1 South Parks Road,                    +44 1865 272860 (secr)
Oxford OX1 3TG, UK                Fax: +44 1865 272595
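A quick way to gauge how close a session comes to these limits is gc(), which
reports cons-cell (Ncells) and vector-heap (Vcells) usage; the column names
below are those printed by recent versions of R, so treat this as an
illustrative check rather than part of the message above.

    ## Inspect current cons-cell and vector-heap usage (illustrative).
    usage <- gc()
    usage["Ncells", "used"]    # cons cells currently in use
    usage["Vcells", "used"]    # vector-heap cells in use (8 bytes each)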
From: Douglas Bates <bates@stat.wisc.edu>
Date: 12 Apr 1999 09:17:33 -0500

I would agree that:

 - it is reasonable to increase the default vsize and nsize
 - the default sizes should probably be architecture dependent
 - if possible, the default sizes should also be picked up from
   ${HOME}/.Renviron.  For example, look for R_VSIZE and R_NSIZE
   environment variables and use them.

I think that 16M for vsize and 800k for nsize could be good global
defaults on many architectures.
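A minimal sketch of how such environment-variable defaults could be picked up,
assuming ${HOME}/.Renviron has already been read into the process environment;
the parse_size helper and the fallback values are illustrative, not existing R
startup code.

    ## Illustrative only: translate "16M" / "800k" style settings into numbers.
    parse_size <- function(x, default) {
        if (x == "") return(default)
        mult   <- c(k = 1e3, K = 1e3, M = 1e6)
        suffix <- substring(x, nchar(x))
        if (suffix %in% names(mult))
            as.numeric(substring(x, 1, nchar(x) - 1)) * mult[[suffix]]
        else
            as.numeric(x)
    }
    vsize <- parse_size(Sys.getenv("R_VSIZE"), default = 16e6)   # bytes of heap
    nsize <- parse_size(Sys.getenv("R_NSIZE"), default = 800e3)  # cons cells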
> To: Prof Brian Ripley <ripley@stats.ox.ac.uk>
> Cc: R-devel@stat.math.ethz.ch
> Subject: Re: --nsize and --vsize
> From: Douglas Bates <bates@stat.wisc.edu>
> Date: 12 Apr 1999 09:17:33 -0500
>
> I would agree that:
> - it is reasonable to increase the default vsize and nsize
> - the default sizes should probably be architecture dependent
> - If possible, the default sizes should also be picked up from
>   ${HOME}/.Renviron.  For example, look for R_VSIZE and R_NSIZE
>   environment variables and use them
>
> I think that 16M for vsize and 800k for nsize could be good global
> defaults on many architectures.

But not, I think, for multi-user machines.  That is 35Mb allocated (on
Solaris), and I would not want our students taking 35Mb for each R job,
as the current memory management really needs most of it resident.
About half that would be tolerable, hence my original figures.  We would
like them to use perhaps 50Mb in total.

Also, that is too much for 64Mb Windows NT machines, and that is
high-end for teaching labs.

--
Brian D. Ripley,                  ripley@stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel: +44 1865 272861 (self)
1 South Parks Road,                    +44 1865 272860 (secr)
Oxford OX1 3TG, UK                Fax: +44 1865 272595
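The 35Mb figure is consistent with a back-of-envelope budget built from numbers
earlier in the thread; the 16-bytes-per-cons-cell figure below is inferred from
"200k cons cells" being about 3Mb in the first message, not a documented
constant.

    ## Rough budget for the proposed 16M vsize / 800k nsize defaults.
    cell_bytes <- 16                    # inferred from "200k cons cells" ~ 3Mb above
    nsize      <- 800e3                 # proposed number of cons cells
    vsize      <- 16 * 2^20             # proposed vector heap, in bytes
    binary     <- 1.8 * 2^20            # R binary size, from the first message
    (nsize * cell_bytes + vsize + binary) / 2^20   # ~30 Mb before allocator overhead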
> Date: Tue, 13 Apr 1999 11:39:00 +0100 (BST)
> From: Nicholas Lee <N.J.Lee@statslab.cam.ac.uk>
>
> On Tue, 13 Apr 1999, Prof Brian D Ripley wrote:
>
> > Well, the flags exist now.  The `potentially wasteful' part is running out
>
> The flags exist, but in the adage of the lazy programmer, "fewer
> keystrokes means less RSI/fewer bugs."  One nice option would be for R
> to probe the size of the .RData file and, rather than dying on you if
> it's too large, give you the option of loading it with XX extra memory.

Fine.  Please write the code to do this and submit it to us.  (It would
be good to see your name in ?contributors.)

For the rest (why 15M? what is the best policy?), please catch up with
your reading of R-devel.

--
Brian D. Ripley,                  ripley@stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel: +44 1865 272861 (self)
1 South Parks Road,                    +44 1865 272860 (secr)
Oxford OX1 3TG, UK                Fax: +44 1865 272595
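A hedged sketch of the kind of check being proposed, not the patch Ripley asks
for: compare the saved workspace's size on disk with the configured vector heap
before loading it; the 6Mb threshold simply echoes the --vsize figure from the
first message.

    ## Illustrative pre-flight check before loading a saved workspace.
    rdata <- ".RData"
    if (file.exists(rdata)) {
        sz    <- file.info(rdata)$size              # size on disk, in bytes
        vsize <- 6 * 2^20                           # assumed --vsize setting
        if (sz > vsize)
            warning("workspace is ", round(sz / 2^20, 1),
                    " Mb; consider restarting R with a larger --vsize")
    }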