Hi,
On Wed, Mar 17, 2010 at 8:14 AM, Shyamasree Saha [shs] <shs at aber.ac.uk> wrote:
> Dear List,
>
> I am trying to read some files using read.csv; the total size of those files
> is 3.99 GB. I am using a MacBook Pro with 4 GB RAM (Snow Leopard). I also
> tried to read a chunk from those files, 1.33 GB altogether, but every time I
> got the following error:
>
> R(1200) malloc: *** mmap(size=16777216) failed (error code=12)
> *** error: can't allocate region
> *** set a breakpoint in malloc_error_break to debug
>
> I managed to read smaller chunks (400 MB) and I also saved the objects. But
> the problem is that when I try to load some of those objects (not more than
> 1.5 GB altogether), I get the same error again. Can someone please help? Why
> am I getting this error? Does R need more space than the actual file size? I
> may buy a new machine if it is something related to RAM size, but if it is an
> R problem then buying a new machine would be useless. So, I really need to
> know why this is happening.
Most likely it's a "RAM problem," as you say.
I believe read.csv has better memory characteristics if you set its
"colClasses" argument, so if you know that some columns in your data
are numeric rather than strings, tell it so.
If I'm not mistaken, read.csv will otherwise assume everything is a
string first and store it that way, which uses more memory than the
data ultimately needs.
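For instance, something along these lines might help (just a sketch;
"mydata.csv" is a placeholder for your own file):

    ## Peek at a handful of rows to work out each column's type ...
    peek <- read.csv("mydata.csv", nrows = 5)
    classes <- sapply(peek, class)

    ## ... then read the full file with those types fixed up front, so
    ## read.csv doesn't have to guess (and temporarily store) every
    ## column as character.
    dat <- read.csv("mydata.csv", colClasses = classes)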
-steve
--
Steve Lianoglou
Graduate Student: Computational Systems Biology
| Memorial Sloan-Kettering Cancer Center
| Weill Medical College of Cornell University
Contact Info: http://cbio.mskcc.org/~lianos/contact