Alan.X.Simpson at nab.com.au
2012-Aug-05 22:52 UTC
[R] Memory limit for Windows 64bit build of R
Dear all

I have a Windows Server 2008 R2 Enterprise machine with 64-bit R installed, running on 2 x quad-core Intel Xeon 5500 processors with 24GB of DDR3 1066 MHz RAM. I am seeking to analyse very large data sets (perhaps as much as 10GB) without the additional coding overhead of a package such as bigmemory.

My question is this - if we were to increase the RAM on the machine to (say) 128GB, would this become a possibility? I have read the documentation on memory limits and it seems so, but I would like some additional confirmation before investing in any extra RAM.

Kind regards

Alan

Alan Simpson
Technical Lead, Retail Model Development
Retail Models Project
National Australia Bank

Level 15, 500 Bourke St, Melbourne VIC
Tel: +61 (0) 3 8697 7135 | Mob: +61 (0) 412 975 955
Email: Alan.X.Simpson@nab.com.au
Hi,

Before someone gives professional advice, you could run an experiment: set the Windows virtual memory to be as large as ~128GB (make sure the hard drive has enough space; a restart might be required), increase the memory limit in R, then load a big dataset (or assign it to an object iteratively) and do some calculation. It will certainly be very slow. I am not sure it will work; just trying to help. A rough sketch of the R side follows below the quote.

Best wishes,
Jie

On Sun, Aug 5, 2012 at 6:52 PM, <Alan.X.Simpson@nab.com.au> wrote:
> My question is this - if we were to increase the RAM on the machine to
> (say) 128GB, would this become a possibility? I have read the
> documentation on memory limits and it seems so, but I would like some
> additional confirmation before investing in any extra RAM.
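A minimal sketch of that experiment, assuming a Windows build of R where memory.limit() and memory.size() are available (values are in MB; the 131072 figure and the test object are illustrative only):

    ## Windows-only: inspect and raise R's memory ceiling (values in MB).
    memory.limit()                # current limit
    memory.size(max = TRUE)       # most memory obtained from the OS so far
    memory.limit(size = 131072)   # ask for ~128GB (needs RAM plus page file)

    ## Build a large test object and check its footprint:
    x <- matrix(rnorm(1e8), ncol = 100)   # 1e8 doubles, roughly 800MB
    print(object.size(x), units = "auto")
    mean(x)                               # a calculation over the whole object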
On Aug 5, 2012, at 3:52 PM, Alan.X.Simpson at nab.com.au wrote:

> I am seeking to analyse very large data sets (perhaps as much as
> 10GB), without the additional coding overhead of a package such as
> bigmemory.

It may depend in part on how that number is arrived at, and on what you plan on doing with it. (Don't consider creating a dist object.)

> My question is this - if we were to increase the RAM on the machine to
> (say) 128GB, would this become a possibility? I have read the
> documentation on memory limits and it seems so, but would like some
> additional confirmation before investing in any extra RAM.

The typical advice is that you will need memory three times as large as your largest dataset, and I find that even more headroom is needed. I have 32GB, my larger datasets occupy 5-6GB, and I generally have few problems; I had quite a few problems with 18GB, so I think the ratio should be 4-5x your 10GB object. I predict you could get by with 64GB. (Please send a check for half the difference in cost between 64GB and 128GB.)

-- 
David Winsemius, MD
Alameda, CA, USA
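To put that rule of thumb in numbers, a back-of-the-envelope sketch (the dimensions are hypothetical, chosen to approximate a 10GB double-precision data set):

    ## Rough sizing for a numeric (double) data set: 8 bytes per element.
    n_rows <- 125e6                      # hypothetical row count
    n_cols <- 10                         # hypothetical column count
    gb <- n_rows * n_cols * 8 / 1024^3   # in-memory size in GB
    gb                                   # roughly 9.3 GB
    c(low = 3, high = 5) * gb            # 3-5x headroom: roughly 28-47 GB

On that arithmetic, 64GB leaves comfortable headroom for a 10GB object, while the current 24GB does not.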
Alan,

More RAM will definitely help. But if you have an object needing more than 2^31 - 1 (about 2 billion) elements, you'll hit a wall regardless. This could be particularly limiting for matrices. It is less limiting for data.frame objects, where each column can hold up to 2 billion elements. But much of R's analytics uses matrices under the hood, so you may not know up front where you could hit a limit. A small illustration follows below.

Jay

---- Original message ----
> My question is this - if we were to increase the RAM on the machine to
> (say) 128GB, would this become a possibility? I have read the
> documentation on memory limits and it seems so, but would like some
> additional confirmation before investing in any extra RAM.
---------------------------------------------

--
John W. Emerson (Jay)
Associate Professor of Statistics, Adjunct, and Acting Director of Graduate Studies
Department of Statistics
Yale University
http://www.stat.yale.edu/~jay
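A quick illustration of that element cap (in the R releases current as of this thread, a single vector or matrix tops out at 2^31 - 1 elements regardless of installed RAM):

    ## The per-object element cap in R:
    .Machine$integer.max             # 2147483647, i.e. 2^31 - 1
    50000 * 50000                    # 2.5e9 elements, over the cap
    ## m <- matrix(0, 50000, 50000)  # would fail: the element count exceeds
    ##                               # 2^31 - 1, even though the ~18.6GB of
    ##                               # data would fit in 128GB of RAM

    ## A data.frame is a list of column vectors, so the cap applies per
    ## column rather than to the whole table: up to 2^31 - 1 rows per
    ## column, RAM permitting.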