Did you read the *rest* of what the rw-FAQ says?
Be aware though that Windows has (in most versions) a maximum amount of
user virtual memory of 2Gb, and parts of this can be reserved by
processes but not used. The version of the memory manager used from R
1.9.0 allocates large objects in their own memory areas and so is better
able to make use of fragmented virtual memory than that used previously.
R can be compiled to use a different memory manager which might be
better at using large amounts of memory, but is substantially slower
(making R several times slower on some tasks).
So, it tells you about memory fragmentation, about making R aware of
large-memory versions of Windows, and about the alternative memory
manager that can be used. If you had actually tried those, the posting
guide asks you to say so, so I presume you did not.
Also, take seriously the idea of using a more capable operating system
that is better able to manage 2Gb of RAM.
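As a back-of-the-envelope check (my sketch, not part of the original exchange), the dimensions quoted below imply a single contiguous block of roughly 850 Mb of doubles, which is exactly the kind of request that a fragmented 2Gb user address space may be unable to satisfy even when memory.limit() reports 2Gb:

```r
## Size of the array requested below: one contiguous block of doubles
x.dim   <- 46
y.dim   <- 58
slices  <- 40
volumes <- 1040

n     <- x.dim * y.dim * slices * volumes  # number of double elements
bytes <- n * 8                             # 8 bytes per double
bytes / 2^20                               # ~ 847 Mb in a single allocation
```

So the failure is not about the total memory limit but about finding one unbroken ~850 Mb region inside the process's virtual address space.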
On Tue, 14 Sep 2004, Christoph Lehmann wrote:
> I have (still) some memory problems, when trying to allocate a huge array:
>
> WinXP pro, with 2G RAM
>
> I start R by calling:
>
> Rgui.exe --max-mem-size=2Gb (as pointed out in R for windows FAQ)
>
> R.Version(): i386-pc-mingw32, 9.1, 21.6.2004
>
> ## and here the problem
> x.dim <- 46
> y.dim <- 58
> slices <- 40
> volumes <- 1040
> a <- rep(0, x.dim * y.dim * slices * volumes)
> dim(a) <- c(x.dim, y.dim, slices, volumes)
>
> gives me: "Error: cannot allocate vector of size 850425 Kb"
>
> even though
>
> memory.limit(size = NA)
> yields 2147483648
>
> and
>
> memory.size()
> gives 905838768
>
> so why is that and what can I do against it?
>
> Many thanks for your kind help
>
> Cheers
>
> Christoph
>
>
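One workaround sketch (my suggestion, not something discussed above): if a single 4-D array cannot be placed contiguously, holding one 3-D array per volume in a list asks the memory manager for many ~850 Kb blocks instead of one ~850 Mb block, which fragmented address space can usually accommodate:

```r
## Workaround sketch: a list of per-volume 3-D arrays instead of one 4-D array
x.dim   <- 46
y.dim   <- 58
slices  <- 40
volumes <- 1040

a <- vector("list", volumes)
for (v in seq_len(volumes))
  a[[v]] <- array(0, dim = c(x.dim, y.dim, slices))

## indexing changes from a[i, j, k, v] to a[[v]][i, j, k]
```

The cost is the changed indexing and some per-element overhead when operating across volumes, but no single allocation exceeds about a megabyte.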
--
Brian D. Ripley,                  ripley at stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel:  +44 1865 272861 (self)
1 South Parks Road,                     +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595