On Tue, 8 May 2007, Victor Gravenholt wrote:
> As part of a simulation, I need to sample from a large vector repeatedly.
> For some reason sample() builds up memory usage (> 500 MB for this
> example) when used inside a for loop, as illustrated here:
>
> X <- 1:100000
> P <- runif(100000)
> for(i in 1:500) Xsamp <- sample(X,30000,replace=TRUE,prob=P)
>
> Even worse, I am not able to free up the memory without quitting R.
> I quickly run out of memory when trying to perform the simulation. Is
> there any way to prevent this from happening?
>
> The problem seems to appear only when specifying both replace=TRUE and
> probability weights for the vector being sampled, and it happens both
> on Windows XP and Linux (Ubuntu).
And only for 10000 < size <= 100000: there was a typo causing memory not
to be freed in that range. It is now fixed in R 2.5.0 patched.
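
For anyone who cannot move to 2.5.0 patched right away, one possible
workaround is to do the weighted sampling by hand via inverse-CDF lookup
with cumsum() and findInterval(), which stays out of sample()'s weighted
replace=TRUE branch entirely. This is only a sketch against the example
above (it assumes the same X and P), not a drop-in replacement for every
use of sample():

## Manual weighted sampling with replacement via inverse-CDF lookup.
X  <- 1:100000
P  <- runif(100000)
cp <- cumsum(P) / sum(P)            # normalised cumulative weights
for (i in 1:500) {
    u     <- runif(30000)           # one uniform draw per sample
    ## index i is drawn with probability P[i]/sum(P)
    Xsamp <- X[findInterval(u, cp) + 1]
}

findInterval() does an efficient search over the sorted cumulative
weights, so this should scale reasonably for vectors of this size.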
--
Brian D. Ripley,                  ripley at stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel:  +44 1865 272861 (self)
1 South Parks Road,                     +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595