Uwe Ligges
2016-May-05 09:11 UTC
[Rd] R process killed when allocating too large matrix (Mac OS X)
On 05.05.2016 04:25, Marius Hofert wrote:
> Hi Simon,
>
> ... all interesting (but quite a bit above my head). I only read
> 'Linux' and want to throw in that this problem does not appear on
> Linux (it seems). I talked about this with Martin Maechler and he
> reported that the same example (on one of his machines; with NA_real_
> instead of '0's in the matrix) gave:
>
> Error: cannot allocate vector of size 70.8 Gb
> Timing stopped at: 144.79 41.619 202.019
>
> ... but no killer around...

Well, with n=1. ;-)

Actually this also happens under Linux, and I have had my R processes
killed more than once (and, much worse, other processes as well, so that
we essentially had to reboot a server). That's why we use job scheduling
on servers for R nowadays ...

Best,
Uwe
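A concrete illustration of that kind of scheduling (a hypothetical sketch,
assuming a SLURM-managed server; the memory cap, time limit and script name
are made up and not from this thread):

  # submit an R job with a hard memory cap; if the job exceeds 8 GB the
  # scheduler kills only this job, rather than taking the whole server down
  $ sbatch --mem=8G --time=01:00:00 --wrap="Rscript simulation.R"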
Prof Brian Ripley
2016-May-05 09:39 UTC
[Rd] R process killed when allocating too large matrix (Mac OS X)
On 05/05/2016 10:11, Uwe Ligges wrote:
> Actually this also happens under Linux, and I have had my R processes
> killed more than once (and, much worse, other processes as well, so that
> we essentially had to reboot a server). That's why we use job scheduling
> on servers for R nowadays ...

Yes, Linux does not deal safely with running out of memory, although it
is better than it was. In my experience, only commercial Unices do that
gracefully.

Have you tried setting a (virtual) memory limit on the process using the
shell it is launched from? I have found that to be effective on most
OSes, at least in protecting other processes from being killed. However,
some things do reserve excessive amounts of VM that they do not use and
so cannot be run under a sensible limit.

--
Brian D. Ripley, ripley at stats.ox.ac.uk
Emeritus Professor of Applied Statistics, University of Oxford
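A minimal sketch of the shell-level limit described above, assuming a
bash-like shell (the 8 GB figure is arbitrary; note that 'ulimit -v' takes
its value in kB and applies to the shell and everything launched from it):

  $ ulimit -v 8388608   # cap virtual memory at ~8 GB
  $ R --vanilla         # allocations beyond the cap should now fail with a
                        # normal "cannot allocate vector" error instead of
                        # the process being killed by the OS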
Jeroen Ooms
2016-May-09 23:08 UTC
[Rd] R process killed when allocating too large matrix (Mac OS X)
On 05/05/2016 10:11, Uwe Ligges wrote:
> Actually this also happens under Linux, and I have had my R processes
> killed more than once (and, much worse, other processes as well, so that
> we essentially had to reboot a server).

I found that setting RLIMIT_AS [1] works very well on Linux, but this
requires that you cap memory at some fixed value:

> library(RAppArmor)
> rlimit_as(1e9)
> rnorm(1e9)
Error: cannot allocate vector of size 7.5 Gb

The RAppArmor package has many other utilities to protect your server
from a misbehaving process, such as limiting CPU time (RLIMIT_CPU),
fork bombs (RLIMIT_NPROC) and file sizes (RLIMIT_FSIZE).

[1] http://linux.die.net/man/2/getrlimit
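The same pattern extends to the other limits mentioned above (a sketch
assuming the rlimit_* helpers as named in the RAppArmor documentation;
the values are arbitrary):

  library(RAppArmor)
  rlimit_as(1e9)      # RLIMIT_AS: cap the address space at ~1 GB
  rlimit_cpu(60)      # RLIMIT_CPU: kernel stops the process after ~60 s of CPU
  rlimit_nproc(100)   # RLIMIT_NPROC: cap the number of processes (fork bombs)
  rlimit_fsize(1e8)   # RLIMIT_FSIZE: cap created files at ~100 MB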