Ken Termiso
2005-Dec-14 20:47 UTC
[R] memory tops out at 1.84GB on OS X 10.4 machine w/ 5GB RAM
Hi all,

Sorry if this is a dumb question, but I am on 10.4 with R 2.2, and when loading a big text file (~500MB) with scan(file, what = character()) I get malloc errors saying I am out of memory. I have 5GB on this machine, and Activity Monitor tells me R is only up to ~1.84GB both times this has happened (running from the terminal).

I am wondering why this is happening when I still have >2GB of free memory waiting to be used.

Any advice would be much obliged,
Ken
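For reference, a quick way to see how much memory the loaded vector actually takes; a minimal sketch assuming a hypothetical file big.txt:

    # what = character() (a zero-length character vector), not the bare
    # function name, is the valid form for scan()
    x <- scan("big.txt", what = character())
    # in-memory size in MB; a large character vector usually needs
    # noticeably more RAM than the file occupies on disk, because each
    # element carries per-string overhead
    print(object.size(x) / 2^20)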
David Ruau
2005-Dec-15 09:40 UTC
[R] memory tops out at 1.84GB on OS X 10.4 machine w/ 5GB RAM
Hi,

I don't know why this happens, but maybe I have a workaround: load the file sequentially. Split the text file into two or three pieces and recombine the vectors/lists in R afterwards. I once used a similar technique to write a huge matrix to a txt file.

David
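A minimal sketch of that sequential approach, assuming a hypothetical file big.txt and a chunk size of 100,000 lines; repeated scan() calls on an open connection continue where the previous call stopped, so the file never has to be split on disk:

    con <- file("big.txt", open = "r")
    chunks <- list()
    i <- 1
    repeat {
        # read the next block of lines as a character vector
        x <- scan(con, what = character(), nlines = 100000, quiet = TRUE)
        if (length(x) == 0) break   # end of file reached
        chunks[[i]] <- x
        i <- i + 1
    }
    close(con)
    # recombine the pieces into one vector
    all_data <- unlist(chunks)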
Roger D. Peng
2005-Dec-15 13:17 UTC
[R] memory tops out at 1.84GB on OS X 10.4 machine w/ 5GB RAM
I'm not completely sure, but I don't think OS X is at the point yet where it can give a single process access to more than 2GB of memory (the way, for example, Linux on Opteron can). I'd welcome any corrections to that statement. To be sure, this is not an issue with R itself: R has regularly been reported to use more than 4GB of memory when the OS allows it.

-roger

--
Roger D. Peng | http://www.biostat.jhsph.edu/~rpeng/
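A quick way to check whether a given R build is 32-bit or 64-bit, and hence roughly how much address space a single R process can use, with base R only:

    # 4 bytes per pointer = 32-bit build (at most ~2-4GB of address
    # space per process); 8 bytes = 64-bit build
    .Machine$sizeof.pointer
    # gc() reports how much memory R's heaps currently hold
    gc()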