Hi all,

Two computers:

One is my desktop PC: Windows 2000, R 1.9.1, physical RAM 256MB, swap (virtual memory) 384MB. When I allocate a large matrix, it first uses up RAM, then uses swap space. In Windows' Task Manager, the memory usage can exceed my physical RAM's size.

The other machine is a remote server: Windows XP, R 1.9.1, physical RAM 2GB, swap space 4GB. I use "R --max-mem-size=4000M" to start R. However, when I allocate a large matrix or data frame, it uses up all the RAM and then exits with the error message "cannot allocate vector of size 7812 Kb". The swap space is not used at all!

What's more, I found that the read.table() function really wastes memory:

> ft <- read.table("filepath")
> object.size(ft)
[1] 192000692

Only 192MB; however, the Windows Task Manager shows that this process takes nearly 800MB of memory. I used gc() to collect garbage, but it doesn't help. Does anyone have a method to release the wasted memory?

Thank you all.
Regards
> From: Hu Chen
>
> [...] I use "R --max-mem-size=4000M" to start R. However, when I
> allocate a large matrix or data frame, it uses up all the RAM and
> then exits with the error message "cannot allocate vector of size
> 7812 Kb". The swap space is not used at all!

Please do read
http://cran.r-project.org/bin/windows/base/rw-FAQ.html#There-seems-to-be-a-limit-on-the-memory-it-uses_0021

> What's more, I found that the read.table() function really wastes
> memory. [...] only 192Mb. However, in the Windows Task Manager it
> shows that this process takes nearly 800Mb of memory. I used gc()
> to collect garbage, but it doesn't help. Does anyone have a method
> to release the wasted memory?

This has been asked and answered on R-help many times (a good candidate to add to the FAQ?). As an example, see:

http://tolstoy.newcastle.edu.au/R/help/04/06/1662.html

Andy
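The remedy described in threads like the one Andy links is to give read.table() enough information up front (column types, an upper bound on the row count) that it can allocate the result once instead of growing and copying intermediate buffers. A minimal, self-contained sketch; the file, column layout, and sizes here are invented for illustration:

```r
## Write a small example file so the sketch is runnable on its own.
tf <- tempfile()
write.table(data.frame(x = rnorm(100), y = rnorm(100),
                       id = sample(letters, 100, replace = TRUE)),
            tf, row.names = FALSE)

## Supplying colClasses and nrows lets read.table allocate the result
## once; comment.char = "" skips comment scanning on every line.
ft <- read.table(tf, header = TRUE,
                 colClasses = c("numeric", "numeric", "character"),
                 nrows = 100,          # a slight overestimate is fine
                 comment.char = "")

## Dropping the object and calling gc() returns the pages to R's heap;
## note the Task Manager may still report a high working set until the
## OS reclaims the memory from the process.
rm(ft)
gc()
```

The gap between object.size() and the Task Manager figure is largely these intermediate copies plus memory R has freed internally but not yet returned to Windows.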
Marcus Davy
2004-Dec-09 21:00 UTC
[R] a question about swap space, memory and read.table()
On a 32-bit Windows machine (standard install of R) you cannot allocate more than 2GB of memory: "R --max-mem-size=2G" or "R --max-mem-size=2000M". If you try to allocate more than that (e.g. 4000M), I suspect it then defaults back to 1GB. Check your memory limit with memory.limit(). There is lots of information about configuration in the Windows FAQ:

http://cran.r-project.org/bin/windows/base/rw-FAQ.html#There-seems-to-be-a-limit-on-the-memory-it-uses_0021

If you want 3GB per process you are going to have to modify the R executable to make it /LARGEADDRESSAWARE.

marcus

>>> Hu Chen <chencheva at gmail.com> 10/12/2004 3:39:18 AM >>>
Hi all
Two computers: [...]

______________________________________________
R-help at stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide!
http://www.R-project.org/posting-guide.html
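Marcus's suggestion to verify the limit can be sketched as below. memory.limit() and memory.size() are Windows-only diagnostics in this era of R, so the calls are guarded; the printed values are in megabytes:

```r
## Verify that --max-mem-size actually took effect: a request above
## what 32-bit Windows can address is silently capped, so the number
## memory.limit() reports may be lower than what was asked for.
if (.Platform$OS.type == "windows") {
  print(memory.limit())           # ceiling R was started with, in MB
  print(memory.size())            # memory currently used by this process
  print(memory.size(max = TRUE))  # peak memory obtained from the OS
}
```

For example, after starting with "R --max-mem-size=2000M", memory.limit() should report 2000; if it reports 1024 instead, the requested limit was rejected.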