Adam Wilson wrote:
> Greetings all,
>
> First of all, thanks to all of you for creating such a useful, powerful
> program.
>
> I regularly work with very large datasets (several GB) in R on 64-bit
> Fedora 8 (details below). I'm lucky to have 16GB RAM available. However,
> if I am not careful and load too much into R's memory, I can crash the whole
> system. There does not seem to be a check in place that will stop R from
> trying to allocate all available memory (including swap space). I have
> system status plots in my task bar, which I can watch to see when all the
> RAM is taken and R then reserves all the swap space. If I don't kill the R
> process before the swap hits 100%, it will freeze the machine. I don't know
> if this is an R problem or a Fedora problem (I suppose the kernel should be
> killing R before it crashes, but shouldn't R stop before it takes all the
> memory?).
>
> To replicate this behavior, I can crash the system by allocating more and
> more memory in R:
> # each matrix is 1e9 logical NAs, i.e. roughly 4 GB at 4 bytes each
> v1 = matrix(nrow = 1e5, ncol = 1e4)
> v2 = matrix(nrow = 1e5, ncol = 1e4)
> v3 = matrix(nrow = 1e5, ncol = 1e4)
> v4 = matrix(nrow = 1e5, ncol = 1e4)
>
> etc. until R claims all RAM and swap space, and crashes the machine. If I
> try this on a Windows machine, eventually the allocation fails with an
> error in R: "Error: cannot allocate vector of size XX MB". This is much
> preferable to crashing the whole system. Why doesn't this happen on
> Linux?
>
> Is there some setting that will prevent this? I've looked through the
> archives and not found a similar problem.
>
> Thanks for any help.
>
> Adam

R won't know that it is running out of memory until the system refuses
to allocate any more. However, this is not as clear-cut on Unix/Linux as
on other OSes. Because some programs allocate a lot of memory without
actually using it, the kernel may allow more memory to be allocated than
is actually in the system ("overcommit"), and then kill off processes
via its out-of-memory (OOM) killer if it finds itself tied in a knot.
(Google "linux oom" for more details.)
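
If you would rather get failed allocations than a visit from the OOM
killer, the kernel's overcommit policy itself can be tightened. A minimal
sketch, assuming root access and a 2.6-series kernel (the ratio value is
just an example):

  # mode 2 = strict accounting: allocations beyond
  # swap + overcommit_ratio% of RAM fail immediately
  echo 2 > /proc/sys/vm/overcommit_memory
  echo 80 > /proc/sys/vm/overcommit_ratio

With strict accounting, an over-large allocation fails inside R with the
same "cannot allocate vector" error you see on Windows, rather than
freezing the machine. This affects every process on the system, though,
so the per-process limit below is usually the gentler option.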
You can, however, put a hard limit on R's process size with the ulimit
shell command before starting R (say, "ulimit -v 16000000" for a limit
of 16G of virtual memory). Note that if you play with this, you can only
lower the limits, not raise them.
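
A minimal sketch of how that might look in practice; the subshell keeps
the limit from affecting the rest of your login session:

  # ulimit -v takes kilobytes; 16000000 KB is roughly 16G
  (ulimit -v 16000000; R --vanilla)

Once R hits the limit, allocations should fail with "Error: cannot
allocate vector of size XX MB", much as you describe on Windows, instead
of dragging the whole system into swap.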
--
O__ ---- Peter Dalgaard Øster Farimagsgade 5, Entr.B
c/ /'_ --- Dept. of Biostatistics PO Box 2099, 1014 Cph. K
(*) \(*) -- University of Copenhagen Denmark Ph: (+45) 35327918
~~~~~~~~~~ - (p.dalgaard at biostat.ku.dk) FAX: (+45) 35327907