Hello Everyone,

We have recently purchased a server with 64 GB of memory running a 64-bit OS, and I have compiled R from source with the following config:

./configure --prefix=/usr/local/R-2.9.1 --enable-R-shlib --enable-BLAS-shlib --enable-shared --with-readline --with-iconv --with-x --with-tcltk --with-aqua --with-libpng --with-jpeglib

I would like to verify that I can use 55-60 GB of the 64 GB of memory within R. Does anyone know how to test this? Will R be able to access that amount of memory from a single process? I am not an R user myself, but I wanted to check this before I turn the server over to the researchers.

Thanks!
-scz
On 7/6/2009 3:52 PM, Scott Zentz wrote:
> We have recently purchased a server which has 64GB of memory running
> a 64bit OS [...] and I would like to verify that I can use 55GB-60GB
> of the 64GB of memory within R.

Individual vectors are limited to 2^31 - 1 elements, and the elements are 8 bytes each in a double-precision vector. So executing

a <- numeric(2^30)

will use up 8 GB of memory. You can try this with other variable names and see how often it succeeds:

b <- numeric(2^30)  # total now 16 GB
c <- numeric(2^30)  # total now 24 GB, etc.
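The same test can be scripted; a minimal sketch (each successful step holds another 8 GB, assuming double vectors of 2^30 elements):

blocks <- list()
for (i in 1:7) {
  v <- tryCatch(numeric(2^30), error = function(e) NULL)
  if (is.null(v)) {
    message("allocation failed after ", (i - 1) * 8, " GB")
    break
  }
  blocks[[i]] <- v
  message("holding ", i * 8, " GB")
}

Duncan Murdoch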
Scott Zentz wrote:
> We have recently purchased a server which has 64GB of memory running
> a 64bit OS [...] Will R be able to access that amount of memory from
> a single process?

Hmm, it's slightly tricky, because R often duplicates objects, so you may hit the limit only transiently. Also, R limits any single vector to 2^31 - 1 elements, so one object won't get you there. But something like

Y <- replicate(30, rnorm(2^28 - 1), simplify = FALSE)

(simplify = FALSE keeps the result as a list; letting replicate() simplify to a matrix would exceed the per-vector element limit) should create an object of about 30 * 2 GB. Then

lapply(Y, mean)

should generate 30 very good and very expensive approximations to 0. (For obvious reasons, I haven't tested this on a 1 GB ThinkPad X40....)
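A quick way to confirm the whole thing actually materialized (untested, for the same obvious reasons):

as.numeric(object.size(Y)) / 2^30  # total size in GB; should be close to 60
gc()                               # the Vcells "used" column should agree

-- 
Peter Dalgaard
Dept. of Biostatistics, University of Copenhagen
Øster Farimagsgade 5, Entr. B, PO Box 2099, 1014 Cph. K, Denmark
Ph: (+45) 35327918, FAX: (+45) 35327907, p.dalgaard at biostat.ku.dk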
Scott Zentz wrote:
> We have recently purchased a server which has 64GB of memory running
> a 64bit OS [...] and I would like to verify that I can use 55GB-60GB
> of the 64GB of memory within R.

You could probably just make a big vector and watch the process in "top" -- a 5 GB vector would do the trick; if you can break 4 GB you are golden. Start with

big_vector <- 1:1000000

and keep adding zeroes...
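A stepwise version of that (a sketch; each power of ten is a pause to read top):

for (p in 6:9) {
  big_vector <- numeric(10^p)   # 10^9 doubles is about 8 GB
  cat("length", length(big_vector), "->", round(8 * 10^p / 2^30, 2), "GB\n")
  Sys.sleep(10)                 # time to check top
}

--j

Jonathan A. Greenberg, PhD
Postdoctoral Scholar
Center for Spatial Technologies and Remote Sensing (CSTARS)
University of California, Davis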
On Jul 6, 2009, at 4:42 PM, Jonathan Greenberg wrote:
> You could probably just make a big vector and watch the process in
> "top" -- a 5 GB vector would do the trick; if you can break 4 GB you
> are golden. big_vector <- 1:1000000 and keep adding zeroes...

Except that the maximum length of a vector (and of an array, which is just a vector with a dim attribute) is 2^31 - 1 elements.

?"Memory-limits"

On a 10 GB equipped machine (Mac OS X with the 64-bit R 2.9.1) I get this:

> big_vector = c(1:2500000000)
Error in 1:2.5e+09 : result would be too long a vector

And that is not because of a lack of machine resources. You will need to create a sizeable number of large "big_vectors" to carry out this suggestion.
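The limit is easy to see directly (a sketch; the second line needs about 8 GB as an integer vector):

.Machine$integer.max             # 2147483647, i.e. 2^31 - 1
big_vector <- 1:2147483647       # just at the limit
try(big_vector <- 1:2147483648)  # one more: "result would be too long a vector"

-- 
David Winsemius, MD
Heritage Laboratories
West Hartford, CT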
On Jul 6, 2009, at 5:01 PM, David Winsemius wrote:
> Except that the maximum length of a vector is 2^31 - 1 elements. [...]
> You will need to create a sizeable number of large "big_vectors" to
> carry out this suggestion.

Maybe fewer than I suggested. I just remembered that each element of a double vector takes 8 bytes, so a 2^30-element vector uses 8 GB of RAM, and about 7 of them might do it.

> ccc <- character(1000000)
> object.size(ccc)
8000088 bytes
> ccc <- character(4000000)
> object.size(ccc)
32000088 bytes

(So character vectors also cost 8 bytes per element here.) It takes several minutes just to create one vector of 1.25 billion elements on my machine:

> M <- matrix(1:(5000000000/4), ncol = 1)

I am not sure that the results of gc() done immediately after that assignment can be taken at face value. The "max used" column in particular appears to bear little relation to my local reality.

> gc()
            used   (Mb) gc trigger    (Mb)   max used    (Mb)
Ncells   1164830   62.3    1835812    98.1    1835812    98.1
Vcells 625862120 4775.0 1969900386 15029.2 1875862134 14311.7
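If the "max used" numbers are suspect, they can be reset just before the test; a sketch (gc()'s reset argument should be available in this version, though I haven't verified it on this build):

gc(reset = TRUE)                        # zero the "max used" bookkeeping
M <- matrix(1:(5000000000/4), ncol = 1) # ~1.25e9 integers, about 5 GB
gc()                                    # "max used" now reflects only this test

-- 
David Winsemius, MD
Heritage Laboratories
West Hartford, CT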
--- On Mon, 7/6/09, Scott Zentz <zentz@email.unc.edu> wrote:
> Subject: [R] Testing memory limits in R??
> We have recently purchased a server which has 64GB of memory running
> a 64bit OS [...] Will R be able to access that amount of memory from
> a single process?

Check the memory documentation in R:

?Memory
?"Memory-limits"
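That page also documents the startup limits (--min-vsize, --max-vsize, and friends). From a running session, a quick look (a sketch; mem.limits() reports NA where no limit was set):

mem.limits()   # current nsize/vsize ceilings, if any
gc()           # current usage and the gc trigger points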
Hello Everyone!

Thanks for all your replies! This was very helpful! In my test, R topped out at about 32 GB of memory, which I think will be fine. I was able to consume the 32 GB with the following. Start R with:

R --max-vsize=55000M

then within R run

x <- rep(0.1, 2.141e9)

and watch the process with top; R will consume about 32 GB of memory. (A single double vector maxes out at 2^31 - 1 elements, roughly 17 GB, so the 32 GB presumably includes a transient copy; holding several such vectors at once should reach higher, as suggested earlier in the thread.) Hopefully this will be enough for the researchers ;)

Thanks!
-scz
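P.S. A fuller test along the lines suggested above (a sketch; seven 8 GB vectors total 56 GB, so raise --max-vsize to 60000M or so first):

xs <- lapply(1:7, function(i) numeric(2^30))
sum(sapply(xs, object.size)) / 2^30   # total held, in GB -- about 56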