James Muller
2005-Jan-28 06:12 UTC
[R] Error: cannot allocate vector of size... but with a twist
Hi,

I have a memory problem, one which I've seen pop up in the list a few times, but which seems to be a little different. It is the "Error: cannot allocate vector of size x" problem. I'm running R 2.0 on RH9.

My R program is joining big datasets together, so there are lots of duplicate cases of data in memory. This (and other tasks) prompted me to... expand... my swap partition to 16Gb. I have 0.5Gb of regular, fast DDR. The OS seems to be fine accepting the large amount of memory, and I'm not restricting memory use or vector size in any way.

R chews up memory up until the 3.5Gb area, then halts. Here's the last bit of output:

  > # join the data together
  > cdata01.data <- cbind(c.1,c.2,c.3,c.4,c.5,c.6,c.7,c.8,c.9,c.10,c.11,c.12,c.13,c.14,c.15,c.16,c.17,c.18,c.19,c.20,c.21,c.22,c.23,c.24,c.25,c.26,c.27,c.28,c.29,c.30,c.31,c.32,c.33)
  Error: cannot allocate vector of size 145 Kb
  Execution halted

145 Kb...?? This has me rather lost. Maybe an overflow of some sort? Maybe an OS problem of some sort? I'm scratching here.

Before you question it, there is a legitimate reason for sticking all these components in the one data.frame.

One of the problems here is that tinkering is not really feasible: this cbind took 1.5 hrs to finally halt.

Any help greatly appreciated,

James
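A rough pre-flight check can show whether a bind like this can possibly fit in memory. This is only a sketch; c.1, c.2, ... stand in for the poster's 33 column objects and are not defined here:

  ## Sum the sizes of the pieces to be bound and compare against
  ## what gc() reports as current memory use.
  pieces <- list(c.1, c.2, c.3)     # ...extend through c.33
  total  <- sum(sapply(pieces, object.size))
  cat("combined size of inputs:", round(total / 2^20, 1), "Mb\n")
  gc()                              # current allocation, in Mb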
Prof Brian Ripley
2005-Jan-28 06:59 UTC
[R] Error: cannot allocate vector of size... but with a twist
On Fri, 28 Jan 2005, James Muller wrote:

> Hi,
>
> I have a memory problem, one which I've seen pop up in the list a few
> times, but which seems to be a little different. It is the "Error: cannot
> allocate vector of size x" problem. I'm running R 2.0 on RH9.
>
> My R program is joining big datasets together, so there are lots of
> duplicate cases of data in memory. This (and other tasks) prompted me
> to... expand... my swap partition to 16Gb. I have 0.5Gb of regular, fast
> DDR. The OS seems to be fine accepting the large amount of memory, and
> I'm not restricting memory use or vector size in any way.
>
> R chews up memory up until the 3.5Gb area, then halts. Here's the last
> bit of output:

You have, presumably, a 32-bit computer with a 4GB-per-process memory limit. You have hit it (you get less than 4Gb as the OS services need some and there is some fragmentation). The last failed allocation may be small, as you see, if you are allocating lots of smallish pieces.

The only way to overcome that is to use a 64-bit OS and version of R.

What was the `twist' mentioned in the title? You will find a similar overall limit mentioned about weekly on this list if you look in the archives.

> > # join the data together
> > cdata01.data <- cbind(c.1,c.2,c.3,c.4,c.5,c.6,c.7,c.8,c.9,c.10,c.11,c.12,c.13,c.14,c.15,c.16,c.17,c.18,c.19,c.20,c.21,c.22,c.23,c.24,c.25,c.26,c.27,c.28,c.29,c.30,c.31,c.32,c.33)
> Error: cannot allocate vector of size 145 Kb
> Execution halted
>
> 145 Kb...?? This has me rather lost. Maybe an overflow of some sort?
> Maybe an OS problem of some sort? I'm scratching here.
>
> Before you question it, there is a legitimate reason for sticking all
> these components in the one data.frame.
>
> One of the problems here is that tinkering is not really feasible: this
> cbind took 1.5 hrs to finally halt.
>
> Any help greatly appreciated,
>
> James
>
> ______________________________________________
> R-help at stat.math.ethz.ch mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide!
> http://www.R-project.org/posting-guide.html

-- 
Brian D. Ripley,                  ripley at stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel:  +44 1865 272861 (self)
1 South Parks Road,                     +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595
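The arithmetic behind that limit is worth spelling out (a sketch, not part of the original reply): a 32-bit process can address at most 2^32 bytes no matter how much swap is configured, and cbind() must hold its inputs and its result in memory at the same time, so the failing request can be tiny even though the process has run out of address space:

  2^32 / 2^30                # = 4: the per-process ceiling, in GiB,
                             # regardless of swap size
  .Machine$sizeof.pointer    # 4 on a 32-bit build of R, 8 on 64-bit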
Paul Roebuck
2005-Jan-28 07:05 UTC
[R] Error: cannot allocate vector of size... but with a twist
On Fri, 28 Jan 2005, James Muller wrote:

> I have a memory problem, one which I've seen pop up in the list a few
> times, but which seems to be a little different. It is the "Error: cannot
> allocate vector of size x" problem. I'm running R 2.0 on RH9.
>
> [SNIP]
>
> R chews up memory up until the 3.5Gb area, then halts. Here's the last
> bit of output:

32-bit addressing goes to ~4Gb.

----------------------------------------------------------
SIGSIG -- signature too long (core dumped)
Liaw, Andy
2005-Jan-28 13:44 UTC
[R] Error: cannot allocate vector of size... but with a twist
Just a couple of remarks below...

> From: James Muller
>
> Hi,
>
> I have a memory problem, one which I've seen pop up in the list a few
> times, but which seems to be a little different. It is the "Error: cannot
> allocate vector of size x" problem. I'm running R 2.0 on RH9.
>
> My R program is joining big datasets together, so there are lots of
> duplicate cases of data in memory. This (and other tasks) prompted me
> to... expand... my swap partition to 16Gb. I have 0.5Gb of regular, fast
> DDR. The OS seems to be fine accepting the large amount of memory, and
> I'm not restricting memory use or vector size in any way.
>
> R chews up memory up until the 3.5Gb area, then halts. Here's the last
> bit of output:
>
> > # join the data together
> > cdata01.data <- cbind(c.1,c.2,c.3,c.4,c.5,c.6,c.7,c.8,c.9,c.10,c.11,c.12,c.13,c.14,c.15,c.16,c.17,c.18,c.19,c.20,c.21,c.22,c.23,c.24,c.25,c.26,c.27,c.28,c.29,c.30,c.31,c.32,c.33)
> Error: cannot allocate vector of size 145 Kb
> Execution halted
>
> 145 Kb...?? This has me rather lost. Maybe an overflow of some sort?
> Maybe an OS problem of some sort? I'm scratching here.
>
> Before you question it, there is a legitimate reason for sticking all
> these components in the one data.frame.

One possible way to get around this, if you really have no alternatives, is to write the individual columns (I assume that's what those things you're cbind()ing are) to files, use `paste' to paste them into one file, and read that into a fresh R session.

> One of the problems here is that tinkering is not really feasible: this
> cbind took 1.5 hrs to finally halt.

That's the price you pay for using your HDD as memory!

Andy

> Any help greatly appreciated,
>
> James
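A minimal sketch of that workaround, assuming each c.N object is a single column and the Unix `paste' utility is available (object and file names here are illustrative, not from the thread):

  ## Write each column to its own file, so only one copy of the data
  ## lives inside R at a time.
  cols <- list(c.1 = c.1, c.2 = c.2, c.3 = c.3)   # ...through c.33
  for (nm in names(cols))
      write.table(cols[[nm]], file = paste(nm, "txt", sep = "."),
                  row.names = FALSE, col.names = FALSE)

  ## Join the per-column files side by side outside R, then read the
  ## combined table back in a fresh session.
  system("paste -d' ' c.1.txt c.2.txt c.3.txt > cdata01.txt")
  cdata01.data <- read.table("cdata01.txt", header = FALSE)

This trades cbind()'s in-memory duplication for disk I/O: slower, but it stays within the per-process address-space limit.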