On Fri, 31 Aug 2001, Michael Roberts wrote:
> Hello R users,
>
> I am doing some work with a large data set but am running out
> of memory. I understand that R allocates memory dynamically as
> it needs it, but for some reason it cannot seem to find my virtual
> memory. Its limit is about 256M (my physical RAM) regardless
> of the amount of virtual memory. I'm running R 1.3.0 on Windows
> NT.
>
> Am I missing an easy fix here?
See the rw-FAQ, question 2.4, "There seems to be a limit on the memory
it uses!"

Also, you will get a message when R runs out, saying

    Reached total allocation of 255Mb: see help(memory.size)

and as a last resort you could always do as it says ....
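For concreteness, a minimal sketch of that route, assuming a Windows
build of R of roughly this vintage (check help(memory.size) in your
version; the 512M below is just an illustrative value):

    ## Report memory use (Windows builds of R only).
    memory.size()            # Mb currently allocated to R
    memory.size(max = TRUE)  # maximum Mb obtained from Windows so far

    ## To raise the ceiling, start R with a larger limit, e.g. from the
    ## command line or a shortcut's Target field:
    ##   Rgui.exe --max-mem-size=512M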
I should say that on Windows 9x systems you really don't want R using
virtual memory at all, and even on NT/2000 you will see a large
performance hit once it starts paging.
--
Brian D. Ripley,                  ripley at stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel:  +44 1865 272861 (self)
1 South Parks Road,                     +44 1865 272860 (secr)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595