Dear R-users,

I am currently facing what appears to be a strange thing (at least to my humble understanding).

If I understood correctly, starting with version 1.2.3, R memory allocation can be done dynamically, and there is no need to fiddle with the --nsize and --vsize parameters any longer...

So far everything seemed to go that way (I saw the size of my processes grow when I was using big objects, and so on). However, I recently ran into trouble with memory: there seems to be a limit of about 1.2 GB, beyond which R starts to emit memory-allocation errors that are not consistent with the memory still available (e.g. 'Error: cannot allocate vector of size 125382 Kb', while there are still about 17 GB free).

I thought default limits were set, but that does not seem to be the case:

> mem.limits()
nsize vsize
   NA    NA

Any idea? Where am I wrong?

Laurent

PS: I am currently using R-1.3.0-patched, compiled on SGI IRIX 6.5 (I was using 1.2.3 and had the same kind of problems, which is why I upgraded).

--
Laurent Gautier          CBS, Building 208, DTU
PhD. Student             DK-2800 Lyngby, Denmark
tel: +45 45 25 24 85     http://www.cbs.dtu.dk/laurent
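[Editor's note: a minimal base-R sketch of the kind of monitoring Laurent describes, for readers who want to watch workspace growth themselves. `mem.limits()` existed in the R of 2001 but has since been removed from the language, hence the `exists()` guard.]

```r
## Watch memory use from within R while building a large object.
x <- numeric(1e6)              # about 8 MB of doubles
print(object.size(x))          # bytes used by this one object
invisible(gc(verbose = TRUE))  # totals for the whole workspace

## In R 1.2.x-1.3.x, mem.limits() reported the nsize/vsize caps;
## NA/NA meant "no soft limit set".  The function was later removed,
## so guard the call to keep this runnable on current R:
if (exists("mem.limits")) print(mem.limits())
```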
On Mon, 16 Jul 2001, Laurent Gautier wrote:

> If I understood correctly, starting with version 1.2.3, R memory
> allocation can be done dynamically, and there is no need to fiddle
> with the --nsize and --vsize parameters any longer...

Starting with 1.2.0, yes.

> However, I recently ran into trouble with memory: there seems to be a
> limit of about 1.2 GB, beyond which R starts to emit memory-allocation
> errors that are not consistent with the memory still available
> (e.g. 'Error: cannot allocate vector of size 125382 Kb', while there
> are still about 17 GB free).

Is this compiled as a 32-bit or a 64-bit process? And are there any per-process limits set? A 32-bit process will (by definition) have a 4 GB limit, and may well have 2 GB or less, depending on how the malloc is organised.

> I thought default limits were set, but that does not seem to be the
> case:
>
> > mem.limits()
> nsize vsize
>    NA    NA

That's correct. See ?mem.limits.

> PS: I am currently using R-1.3.0-patched, compiled on SGI IRIX 6.5 (I
> was using 1.2.3 and had the same kind of problems, which is why I
> upgraded).

No change in that area since 1.2.3.

--
Brian D. Ripley,                  ripley at stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel: +44 1865 272861 (self)
1 South Parks Road,                    +44 1865 272860 (secr)
Oxford OX1 3TG, UK                Fax: +44 1865 272595

-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-
r-help mailing list -- Read http://www.ci.tuwien.ac.at/~hornik/R/R-FAQ.html
Send "info", "help", or "[un]subscribe" (in the "body", not the subject!)
To: r-help-request at stat.math.ethz.ch
_._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._
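[Editor's note: Ripley's first question (32- vs 64-bit build) can be answered from within R itself. A sketch, assuming a version of R recent enough to have `.Machine$sizeof.pointer` (it may not be present in 1.3.0); per-process limits, his second question, are checked from the shell with `ulimit -a`.]

```r
## Pointer size in bytes: 4 on a 32-bit build, 8 on a 64-bit one.
bits <- 8 * .Machine$sizeof.pointer
cat("R is running as a", bits, "bit process\n")

## A 32-bit process can address at most 2^32 bytes = 4 GB, and in
## practice often gets 2 GB or less, depending on how malloc lays
## out the address space.
cat("theoretical address-space ceiling:", 2^bits / 2^30, "GB\n")
```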
On Mon, 16 Jul 2001, Laurent Gautier wrote:

> However, I recently ran into trouble with memory: there seems to be a
> limit of about 1.2 GB, beyond which R starts to emit memory-allocation
> errors that are not consistent with the memory still available
> (e.g. 'Error: cannot allocate vector of size 125382 Kb', while there
> are still about 17 GB free).

There is an upper limit on the memory size because some internal objects in the memory manager are stored as ints (even on a 64-bit system this limits you, I think, to 4 GB, but it may be 2 GB). It shouldn't be too hard to expand these limits for 64-bit systems.

There is a much firmer limit in the design of R to no more than 2^31 objects, and to objects of maximum length 2^31, but this would still allow multi-gigabyte workspaces.

-thomas
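[Editor's note: a bit of arithmetic makes the 2^31 figures above concrete; the byte counts assume 8-byte doubles.]

```r
## Vector lengths fit in a signed 32-bit int, so one vector can hold
## at most 2^31 - 1 elements.
max_len <- 2^31 - 1

## A double (numeric) vector of that length would occupy about 16 GB:
cat(round(max_len * 8 / 2^30, 1), "GB\n")          # 16 GB

## The failed request in the original post was tiny by comparison:
cat(round(125382 * 1024 / 2^30, 2), "GB\n")        # 0.12 GB
## It failed not because the machine lacked RAM, but because the
## process was already near a ~1.2 GB per-process ceiling.
```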