On Thu, 15 Nov 2001, Geert Aarts wrote:
> I'm trying to make an estimation of sea-depth for 2000 animal locations
> using 2 million depth soundings. The result was:
> Cannot allocate vector of size 20351 Kb
>
> I've tried to overcome this problem by making the following adaptations:
> -Increased the max-mem-size to 1.5Gb (Thanks to Thomas Lumley for his help)
> -Use a max.dist (10km) for the function Krige
> -Run the Krige computation for each separate animal location while using a
> repeat loop
> -Select only those sample locations and data (depth soundings) within a
> 20km by 20km box around the animal location (so the Krige function does
> not need to calculate the distance between the animal location and all
> 2 million sample locations)
>
> However the error: cannot allocate vector of 24432 Kb still remains. For
> that particular computation only 1500 sample locations and data were used.
> After that computation my memory.size (TRUE) is 1,654,243,233 and
> memory.size() is 654,345,998.
Well, you must be using Windows then, but please tell us.
> Does this mean that this (small) computation requires 1 Gb?
No, it means what you did in that session did.
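For the record, the bounding-box subsetting described above can be sketched as follows. This is only an illustration: the object names (`soundings` with columns `x`, `y`, `depth`, and the animal location `ax`, `ay`) are hypothetical, not from the original post, and coordinates are assumed to be in metres.

```r
## Hypothetical sketch: keep only soundings inside a 20 km by 20 km box
## centred on one animal location, before handing them to the kriging call.
half <- 10000   # half-width of the box: 10 km in metres (assumption)
near <- soundings[abs(soundings$x - ax) <= half &
                  abs(soundings$y - ay) <= half, ]
## 'near' is then the (much smaller) data set passed to the kriging function
```

A subset like this keeps the distance matrix small, but as noted below it does not by itself bound the session's peak memory use.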
> Below I outline how I heard, understood and think the memory problem is
> explained
> During one computation several vectors can be created (fragmentation).
> During this computation some vectors might be removed. These objects and
> their associated memory are put in the garbage collection. Although the
> object is removed, the memory.size() is unchanged. After the computation or
> during following computation the memory in the garbage collection is
> deallocated.
> Is this correct?
memory.size() is about the memory allocator, not R's memory usage: use
gc() for the latter. On Windows (ONLY) when you set --max-mem-size above
256Mb there is a danger of fragmentation in the memory allocator (not in
R's usage) and the memory allocated may never be returned to the OS.
It is still available for use, though.
That Windows-specific snag is removed in the R-devel code base.
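The distinction between the allocator's view and R's own usage can be seen directly in a Windows session; a minimal sketch (output will of course vary by session):

```r
## memory.size() reports the Windows memory allocator, not R's usage:
memory.size()       # Mb currently committed by the allocator (Windows only)
memory.size(TRUE)   # Mb obtained from the OS so far (Windows only)

## gc() both triggers a collection and reports R's actual usage,
## in Ncells (cons cells) and Vcells (vector heap):
gc()
```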
> And if so, does this mean that the memory is piled up during a computation
> and also during a repeat loop?
It doesn't mean that. Garbage collection occurs whenever no free workspace
can be found, potentially at any R operation that allocates memory.
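That said, inside a long loop it can still help to drop large intermediates explicitly so each collection has something to reclaim. A hedged sketch, with placeholder names (`locations`, `some_krige_call`, `result`) standing in for whatever the real loop uses:

```r
## Illustrative loop over animal locations; the kriging call and result
## extraction are placeholders, not the poster's actual code.
result <- numeric(length(locations$x))
for (i in seq(along = locations$x)) {
    fit <- some_krige_call(i)   # large per-location fit object (hypothetical)
    result[i] <- fit$estimate   # keep only the small piece that is needed
    rm(fit)                     # drop the large intermediate explicitly
    gc()                        # optional: collect now rather than at the
                                # next allocation, keeping peak usage lower
}
```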
> I hope somebody can help me with this.
I suspect that whatever you are doing actually needs large amounts of
memory. Windows is not good at that. Also, none of the developers use R
under Windows for large tasks, so R under Windows may not be as good as it
could be at this. Windows users need to remember that R is a volunteer
project, which has received very little contribution of Windows expertise.
--
Brian D. Ripley, ripley at stats.ox.ac.uk
Professor of Applied Statistics, http://www.stats.ox.ac.uk/~ripley/
University of Oxford, Tel: +44 1865 272861 (self)
1 South Parks Road, +44 1865 272860 (secr)
Oxford OX1 3TG, UK Fax: +44 1865 272595
-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-
r-help mailing list -- Read http://www.ci.tuwien.ac.at/~hornik/R/R-FAQ.html
Send "info", "help", or "[un]subscribe"
(in the "body", not the subject!) To: r-help-request at stat.math.ethz.ch
_._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._