Neotropical bat risk assessments
2012-Mar-22 15:26 UTC
[R] Suggestions for RAM for new box
Hi all,

My main Dell melted down a month ago and I need to buy a new workstation. It will run Win7 64-bit with a fast graphics card and a large amount of RAM.

For GIS (ArcGIS) I am opting for 6 GB of RAM. Is there any advantage for R in boosting this to 8 GB, or will 6 GB be sufficient for R applications? There has been a lot of discussion about "Big Memory" issues with R.
On Mar 22, 2012, at 10:26 AM, Neotropical bat risk assessments wrote:

> Hi all,
>
> My main Dell melted down a month ago and I need to buy a new workstation. It will run Win7 64-bit with a fast graphics card and a large amount of RAM.
>
> For GIS (ArcGIS) I am opting for 6 GB of RAM. Is there any advantage for R in boosting this to 8 GB, or will 6 GB be sufficient for R applications? There has been a lot of discussion about "Big Memory" issues with R.

I might note that these days, 6 GB is not "large RAM". I have 8 GB in my now three-year-old dual-core MacBook Pro laptop running OS X Lion (64-bit).

The key point to consider is that even with a 64-bit build of R, object indexing is still limited to a 32-bit signed integer (2^31 - 1), so vectors/matrices/arrays are limited to that many elements. In addition, based on comments in the R Admin Manual (http://cran.r-project.org/doc/manuals/R-admin.html#Choosing-between-32_002d-and-64_002dbit-builds), 64-bit builds of R have a memory block allocation limit of about 8 GB; in other words, a single R object can be no larger than about 8 GB.

Within those limits, of course, if you run out of physical RAM you end up swapping to disk, which slows things down notably. So with 6 GB of RAM, after the OS and other applications consume some portion of it, you have whatever is left over for R, which may or may not be enough for your specific needs.

The "Big Memory" issues (see http://cran.r-project.org/web/views/HighPerformanceComputing.html) typically address situations where the above constraints come into play. RAM is sufficiently cheap these days that going to 8 GB or even 16 GB should not be prohibitively expensive, and if you need to deal with "large" objects in R, that is something to consider.

It is hard to provide specific guidance without a sense of the size of the datasets you will be working with, along with the nature of the operations on those datasets, which may require copying the datasets in RAM during analyses. That being said, get as much RAM as your budget will allow.

Regards,

Marc Schwartz
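For a rough sense of these limits from within R itself, something along the following lines can help. This is only a small sketch using base R functions; the exact numbers printed will vary by platform and R version, and memory.limit() applies only to Windows builds:

  ## Largest valid index for a single atomic vector: 2^31 - 1 elements
  .Machine$integer.max

  ## A full-length numeric (double) vector would need roughly
  ## (2^31 - 1) * 8 bytes, i.e. about 16 GB
  (2^31 - 1) * 8 / 1024^3

  ## Approximate memory used by an existing object
  x <- matrix(rnorm(1e6), nrow = 1000)
  print(object.size(x), units = "Mb")   # about 7.6 Mb for 1e6 doubles

  ## On Windows builds of R, query the per-session memory cap (in MB)
  memory.limit()

Checking object.size() on a representative dataset (or a scaled-down sample of it) before buying hardware gives a rough idea of how much headroom the analyses will need, remembering that many operations make temporary copies of their inputs.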