davidkat at davidkatzconsulting.com
2007-Mar-06 20:43 UTC
[R] Memory Limits in Ubuntu Linux
I am an R user trying to get around the 2Gig memory limit in Windows, so here I am days later with a working Ubuntu, and R under Ubuntu. But the memory problems seem worse than ever: R code that worked under Windows fails, unable to allocate memory.

Searching around the web, it appears that the problem may be the ability to find contiguous memory for my big vectors, but a fresh boot of Ubuntu does not help either.

Which way to go?

1) Try to install the 64-bit version for a bigger address space. Would this help? Is this workable for my Athlon 64 dual-core? (The live CD seems to work, but I never got it to boot after a disk install; then again, the 386 version was no better until I learned more about GRUB. I could try again if this might solve the problem.)

2) Recompile R to get bigger memory capability? (I'll have to cross-post to some R forums too.) This will be a challenge for a Linux newbie like me.

3) Any other suggestions? My goal is to create a bigger neural network than fits in my Windows R version.

--
David Katz
www.davidkatzconsulting.com
541 482-1137
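For readers hitting the same allocation failures, R can report how large these objects actually are, which makes the contiguity problem concrete. A minimal sketch (the sizes below are illustrative, not taken from the original post):

```r
# A numeric (double) vector costs 8 bytes per element, and R must
# allocate it as ONE contiguous block of address space.
x <- numeric(1e6)                      # one million doubles
print(object.size(x))                  # about 8 MB

# Scaling up: a 20000 x 20000 numeric matrix needs
# 20000 * 20000 * 8 bytes in a single block:
gb <- 20000 * 20000 * 8 / 2^30
cat(round(gb, 2), "GB in one allocation\n")  # ~3 GB: too big for a
                                             # 32-bit address space
```

So a script can fail with "unable to allocate" even when total free RAM looks sufficient: it is the single largest contiguous chunk that matters.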
Take a look at Windows FAQ 2.9. Following the instructions there, I was able to make WinXP use at least 3GB of RAM (physical RAM installed) with Rgui.exe.

-Christos

Christos Hatzis, Ph.D.
Nuvera Biosciences, Inc.
400 West Cummings Park
Suite 5350
Woburn, MA 01801
Tel: 781-938-3830
www.nuverabio.com

> -----Original Message-----
> From: r-help-bounces at stat.math.ethz.ch
> [mailto:r-help-bounces at stat.math.ethz.ch] On Behalf Of
> davidkat at davidkatzconsulting.com
> Sent: Tuesday, March 06, 2007 3:44 PM
> To: r-help at stat.math.ethz.ch
> Subject: [R] Memory Limits in Ubuntu Linux
> [...]
______________________________________________
R-help at stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
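The FAQ entry Christos mentions pairs with R's own memory.limit() interface on Windows. A hedged sketch (memory.limit() was Windows-specific and is a defunct stub in R >= 4.2, so it is guarded here to be a no-op elsewhere):

```r
# Query / raise the Windows R memory ceiling (values are in MB).
# Historically Windows-only; the guard keeps this sketch portable.
if (.Platform$OS.type == "windows") {
  print(memory.limit())        # current cap in MB
  # With the /3GB boot switch enabled (see the R for Windows FAQ,
  # entry 2.9), the cap could then be raised towards 3 GB:
  # memory.limit(size = 3000)
}
```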
davidkat at davidkatzconsulting.com wrote:

> I am an R user trying to get around the 2Gig memory limit in Windows,
> so here I am days later with a working Ubuntu, and R under Ubuntu.
> [...]
> 1) Try to install 64-bit version for bigger address space. Would this
> help? Is this workable for my Athlon 64 Dual-core? (the live cd seems
> to work but I never got it to boot after a disk install, but then the
> 386 version was no better until I learned more about Grub...I could
> try again if this might solve the problem)

If you really have got such amounts of RAM in that machine, it should be worth trying.

Uwe Ligges
On 6 March 2007 at 12:43, davidkat at davidkatzconsulting.com wrote:
| I am an R user trying to get around the 2Gig memory limit in Windows, so

The real limit on 32-bit systems is a 3GB address space. R under Windows can get there; see the R for Windows FAQ.

| here I am days later with a working Ubuntu, and R under Ubuntu. But - the
| memory problems seem worse than ever. R code that worked under
| windows fails, unable to allocate memory.

Well, maybe you had virtual memory enabled under Windows but not under Ubuntu. Or maybe you had other memory-hungry applications up under Ubuntu. There is only so much magic the OS can do.

Your easiest remedy will be to upgrade to 4GB. Even 8GB can be useful on a 32-bit system, despite the fact that each individual address space can only max out at 3GB, as you may have multi-core / multi-CPU systems that allow you to multitask better.

| Which way to go?
|
| 1) Try to install 64-bit version for bigger address space. Would this help?

Yes, but you'd probably have to buy more RAM too. The main advantage is that your limit is now way above 3GB -- and probably set by your hardware or budget. Maybe it is as high as 16GB. But again, on the _same_ box with the _same_ amount of RAM that is already constrained under 32-bit, you will not see any improvement. Rather the opposite: as the basic building block is now 8 bytes instead of 4, you will need more memory for the same tasks. No free lunch, as they say.

| 2) Recompile R to get bigger memory capability?

Nope. It's what you give your OS in terms of RAM that's binding here.

| 3) Any other suggestions?

Different algorithms or approaches, tricks like the data.frame-in-sqlite or biglm, ...

Dirk

--
Hell, there are no rules here - we're trying to accomplish something.
                                                  -- Thomas A. Edison
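Dirk's last suggestion -- process the data in pieces rather than as one giant in-memory object -- can be sketched in base R without any extra packages. The file and chunk size here are made up for illustration:

```r
# Stream a large numeric file in fixed-size chunks, keeping only a
# running sum in memory, instead of loading one huge vector.
f <- tempfile()
writeLines(as.character(1:100000), f)   # stand-in for a huge data file

con <- file(f, open = "r")
total <- 0
n <- 0
repeat {
  chunk <- readLines(con, n = 10000)    # 10,000 lines at a time
  if (length(chunk) == 0) break
  x <- as.numeric(chunk)
  total <- total + sum(x)
  n <- n + length(x)
}
close(con)
cat("mean =", total / n, "\n")          # 50000.5
unlink(f)
```

Packages like biglm take the same idea further, updating a regression fit chunk by chunk so the full design matrix never has to exist in memory at once.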
David,

I wouldn't give up on Windows so fast. Many people have gotten the 3Gb switch to work. One used to have to modify the header of the Rgui.exe program to use the switch, but now the binary comes ready for that, so it's really quite easy. I would like to hear more about why it's not working for you.

As for Linux, I use FC5, for which there is a 64-bit binary. But there are also 64-bit binaries for other distros. The 32-bit and 64-bit binaries are in different directories, so you should have no trouble telling them apart. I have heard good things about Ubuntu -- mainly that it's very easy to use -- but FC5 has been pretty easy to learn too, and I use the KDE desktop, which gives me Kate as a text editor. You can open a terminal window in Kate to run R and set up a key like F10 to send the code from the editor to R. It's not quite as good as my Windows setup with Tinn-R, but almost as good.

Thanks,

Roger

-----Original Message-----
From: davidkat at davidkatzconsulting.com
[mailto:davidkat at davidkatzconsulting.com]
Sent: Tuesday, March 06, 2007 5:37 PM
To: Bos, Roger
Subject: RE: [R] Memory Limits in Ubuntu Linux

Thanks for your prompt reply! The Windows 3GB switch is quite problematic - it was not usable on my machine, and there are comments about these problems around the net. Thus, on to Linux.

My machine has 4Gig, and some megabytes are grabbed by my Asus motherboard, leaving some 3.56 Gig. So if I understand your suggestion: try the 64-bit version of Ubuntu (based on Debian, but I had better luck with the video part of the install) and then use the corresponding image from CRAN. My fear is that the CRAN Ubuntu version might be 32-bit - any idea how to find out before I embark on another install?

Which Linux do you have? You described some significant success with getting large jobs to run. And yes, I've worked hard to save memory by tweaking the code.

Thanks again.
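David's worry -- that the CRAN Ubuntu build might be 32-bit -- can be settled from inside R once any build is installed. A small sketch:

```r
# Confirm whether the running R build is 32-bit or 64-bit.
# Pointer size is 4 bytes on 32-bit builds and 8 on 64-bit builds.
ptr <- .Machine$sizeof.pointer
cat("Pointer size:", ptr, "bytes ->",
    if (ptr == 8) "64-bit R" else "32-bit R", "\n")

# The reported architecture string gives the same answer:
cat("Architecture:", R.version$arch, "\n")   # e.g. "x86_64" on 64-bit
```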
On 6 Mar 2007 at 16:51, Bos, Roger wrote:

> David,
>
> First of all, under Windows you can get about 3GB available to R by
> using the /3GB switch in your boot.ini file, assuming you have 4GB of
> memory installed on your Windows machine. Using that method, I have
> seen the memory usage of my R process get as big as 2.7GB in Task
> Manager. What's important, of course, is contiguous space, as you
> mentioned. There, you may want to check your code closely and make
> sure that its memory usage is as efficient as possible and you are
> storing the minimal amount you need for each run. If you don't need
> an object for a while, consider writing it to disk and reading it
> back in later.
>
> Second, AFAIK to get any benefit from more memory in Linux you have
> to go to the 64-bit version. I am a Linux newbie too, so I chose to
> use one of the pre-compiled binaries available on CRAN. In other
> words, you shouldn't have to compile anything yourself. How much
> memory do you have on your Linux box? I have 16GB and I know I have
> run stuff that wouldn't run on my 4GB Windows box.
>
> HTH,
>
> Roger
>
> -----Original Message-----
> From: davidkat at davidkatzconsulting.com
> Sent: Tuesday, March 06, 2007 3:44 PM
> To: r-help at stat.math.ethz.ch
> Subject: [R] Memory Limits in Ubuntu Linux
>
> 1) Try to install 64-bit version for bigger address space. Would this
> help? Is this workable for my Athlon 64 Dual-core?
--
David Katz
www.davidkatzconsulting.com
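Roger's "write it to disk and read it back in later" advice can be sketched with base R's save()/load(); the object and file names here are illustrative:

```r
# Spill an intermediate object to disk, free its RAM, reload it later.
big <- matrix(rnorm(1e6), nrow = 1000)   # stand-in for a large object

tmp <- tempfile(fileext = ".RData")
save(big, file = tmp)      # write the object out
rm(big)                    # drop the in-memory copy
invisible(gc())            # let R return the freed pages

# ... other memory-hungry work runs here ...

load(tmp)                  # 'big' is restored into the workspace
stopifnot(exists("big"))
unlink(tmp)
```

This trades recomputation time for peak memory, which is often the difference between a job finishing and an allocation failure.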