Hello,

I have several very large data sets (1-7 million observations, sometimes hundreds of variables) that I'm trying to work with in R, and memory seems to be a big issue. I'm currently using a 2 GB Windows setup, but might have the option to run R on a server remotely. Windows R seems basically limited to 2 GB memory if I'm right; is there the possibility to go much beyond that with server-based R? In other words, am I limited by R or by my hardware, and how much might R be able to handle if I get the necessary hardware?

Also, any possibility of using web-based R for this kind of thing?

Cheers,
Alan Cohen

Alan Cohen
Post-doctoral Fellow
Centre for Global Health Research
70 Richmond St. East, Suite 202A
Toronto, ON M5C 1N8
Canada
(416) 854-3121 (cell)
(416) 864-6060 ext. 3156 (office)
I believe that it depends on the operating system; see ?memory.limit. I think this is covered in the FAQ; also search the R-help list archives.

On Wed, Nov 5, 2008 at 2:53 PM, Alan Cohen <CohenA at smh.toronto.on.ca> wrote:
> Hello,
>
> I have several very large data sets (1-7 million observations, sometimes hundreds of variables) that I'm trying to work with in R, and memory seems to be a big issue. I'm currently using a 2 GB Windows setup, but might have the option to run R on a server remotely. Windows R seems basically limited to 2 GB memory if I'm right; is there the possibility to go much beyond that with server-based R?

--
Stephen Sefick
Research Scientist
Southeastern Natural Sciences Academy

Let's not spend our time and resources thinking about things that are so little or so large that all they really do for us is puff us up and make us feel like gods. We are mammals, and have not exhausted the annoying little problems of being mammals.

-K. Mullis

______________________________________________
R-help at r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
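[For reference, a minimal sketch of the Windows-only memory utilities mentioned above. The 3072 MB figure is just an illustration; whether a 32-bit Windows process can actually get more than 2 GB depends on how the OS is configured.]

```r
# Windows-only: report the current memory limit for the R process, in MB
memory.limit()

# Try to raise the limit to 3 GB (3072 MB); on 32-bit Windows this only
# helps if the OS itself allows a process more than 2 GB of address space
memory.limit(size = 3072)

# Memory currently in use by R (MB), and a manual garbage collection
memory.size()
gc()
```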
Alan Cohen wrote:
> I have several very large data sets (1-7 million observations, sometimes hundreds of variables) that I'm trying to work with in R, and memory seems to be a big issue. I'm currently using a 2 GB Windows setup, but might have the option to run R on a server remotely. Windows R seems basically limited to 2 GB memory if I'm right; is there the possibility to go much beyond that with server-based R? In other words, am I limited by R or by my hardware, and how much might R be able to handle if I get the hardware necessary?

Hi,

R under Windows can handle up to 2 GB, and under certain conditions up to 3 GB. Note that no 64-bit version of R for Windows is available yet. Under 64-bit Linux, however, there are almost no limits other than hardware restrictions.

Best wishes,
Uwe

> Also, any possibility of using web-based R for this kind of thing?
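[A rough back-of-the-envelope check of why data sets of this size run into those limits; the 7 million x 100 shape is taken from the question, and doubles are 8 bytes each. A sketch, not part of the original replies.]

```r
# A numeric data set of 7 million rows by 100 columns, stored as
# doubles (8 bytes per cell), needs roughly this many GB in memory,
# before counting any copies R makes during analysis:
gb <- 7e6 * 100 * 8 / 2^30
print(gb)  # a bit over 5 GB

# object.size() reports the footprint of an object you already have
x <- matrix(0, nrow = 1e4, ncol = 100)   # 1e6 doubles, about 8 MB
print(object.size(x), units = "Mb")
```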