Xebar Saram
2014-Jan-02 07:07 UTC
[R] R crashes with memory errors on a 256GB machine (and system shows only 60GB usage)
Hi All,

I have a terrible issue I can't seem to debug, and it is halting my work completely. I have R 3.0.2 installed on a Linux machine (Arch Linux, latest), which I built specifically for running high-memory models. The system is a 16-core, 256 GB RAM machine. It worked well at the start, but in recent days I keep getting errors and crashes regarding memory use, such as "cannot allocate vector of size XXX", etc.

When looking at top (the Linux system monitor), I see I barely scrape 60 GB of RAM (out of 256 GB).

I really don't know how to debug this, and my whole work is halted because of it, so any help would be greatly appreciated.

Best wishes

Z
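For reference (this is not from the thread), the size R reports in a "cannot allocate vector of size ..." error is determined by the length and type of the single vector being requested, not by how much RAM top shows as free: a double vector of length n needs roughly 8 * n bytes. A minimal sketch, with n chosen purely for illustration:

    n <- 2e9                                                              # two billion doubles (illustrative)
    cat(sprintf("numeric(%g) needs about %.1f GB\n", n, 8 * n / 1024^3))  # ~14.9 GB
    gc()                                                                  # memory currently used by R objects, in MB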
Ben Bolker
2014-Jan-02 20:35 UTC
[R] R crashes with memory errors on a 256GB machine (and system shows only 60GB usage)
Xebar Saram <zeltakc <at> gmail.com> writes:

> Hi All,
>
> I have a terrible issue I can't seem to debug, and it is halting my work
> completely. I have R 3.0.2 installed on a Linux machine (Arch Linux, latest),
> which I built specifically for running high-memory models. The system
> is a 16-core, 256 GB RAM machine. It worked well at the start, but in recent
> days I keep getting errors and crashes regarding memory use, such as
> "cannot allocate vector of size XXX", etc.
>
> When looking at top (the Linux system monitor), I see I barely scrape 60 GB
> of RAM (out of 256 GB).
>
> I really don't know how to debug this, and my whole work is halted because of
> it, so any help would be greatly appreciated.

I'm very sympathetic, but it will be almost impossible to debug this sort of problem remotely without a reproducible example. The only guess I can make, if you *really* are running *exactly* the same code as you previously ran successfully, is that you might have some very large objects hidden away in a saved workspace in a .RData file that's being loaded automatically ... I would check whether gc(), memory.profile(), etc. give sensible results in a clean R session (R --vanilla).

Ben Bolker
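A rough sketch of the checks Ben suggests (the idea of ranking workspace objects by size is an addition here, not his exact code): list the largest objects in the global environment to see whether a saved workspace has pulled something big in, then look at the basic memory diagnostics, and repeat the same calls in a clean session started with R --vanilla.

    ## largest objects currently in the global environment, by size in bytes
    obj_sizes <- sapply(ls(envir = globalenv()),
                        function(x) object.size(get(x, envir = globalenv())))
    head(sort(obj_sizes, decreasing = TRUE), 10)

    gc()                # current memory use and the garbage-collection trigger
    memory.profile()    # counts of allocated R objects by type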
Milan Bouchet-Valat
2014-Jan-02 22:16 UTC
[R] R crashes with memory errors on a 256GB machine (and system shows only 60GB usage)
On Thursday, 2 January 2014 at 09:07 +0200, Xebar Saram wrote:

> Hi All,
>
> I have a terrible issue I can't seem to debug, and it is halting my work
> completely. I have R 3.0.2 installed on a Linux machine (Arch Linux, latest),
> which I built specifically for running high-memory models. The system
> is a 16-core, 256 GB RAM machine. It worked well at the start, but in recent
> days I keep getting errors and crashes regarding memory use, such as
> "cannot allocate vector of size XXX", etc.
>
> When looking at top (the Linux system monitor), I see I barely scrape 60 GB
> of RAM (out of 256 GB).
>
> I really don't know how to debug this, and my whole work is halted because of
> it, so any help would be greatly appreciated.

One important thing to note is that while memory use may appear to be low, if the memory is fragmented, R may not be able to allocate a *contiguous* memory area for a big vector (you didn't tell us how big it was). In that case, AFAIK the only solution is to restart R (saving the session or the objects you want to keep).

Regards
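A minimal sketch of the save-restart-reload workaround Milan describes; the object and file names here are illustrative only:

    saveRDS(big_fit, file = "big_fit.rds")   # save the objects you need, one per file
    # or: save.image("session.RData")        # save the entire workspace

    ## then quit and restart R (ideally with R --vanilla) to get a fresh,
    ## unfragmented address space, and reload what you saved:
    big_fit <- readRDS("big_fit.rds")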