Hi netters,

I'm using 64-bit R 2.5.0 on an x86-64 CPU with 2 GB of RAM. The operating system is SUSE 10. The system information is:

  uname -a
  Linux someone 2.6.13-15.15-smp #1 SMP Mon Feb 26 14:11:33 UTC 2007 x86_64 x86_64 x86_64 GNU/Linux

I used heatmap to process a matrix of dim [16000, 100]. After 3 hours of desperate waiting, R told me:

  cannot allocate vector of size 896 MB

I know the matrix is very big, but since I have 2 GB of RAM on a 64-bit system, shouldn't there be no problem dealing with a vector smaller than 1 GB? (I was not running any other applications on the system.)

Does anyone know what's going on? Is this a hardware limit, meaning I have to add more RAM, or is there some way to resolve it in software? Also, is it possible to speed up the computation? (I don't want to wait another 3 hours just to get another error message.)

Thank you in advance!
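For scale, a quick back-of-the-envelope calculation (assuming heatmap's default behaviour of calling dist() and then hclust() on the rows) shows where an allocation of roughly 1 GB can come from even though the input matrix itself is small:

```r
# Where can "cannot allocate vector of size 896 MB" come from?  heatmap()
# by default computes dist() over the rows and feeds the result to hclust().
n <- 16000
input_mb <- n * 100 * 8 / 2^20          # the 16000 x 100 numeric matrix
dist_mb  <- n * (n - 1) / 2 * 8 / 2^20  # lower triangle of doubles in the dist object
cat(sprintf("input matrix: ~%.1f MB; dist object: ~%.0f MB\n",
            input_mb, dist_mb))
# input matrix: ~12.2 MB; dist object: ~977 MB
```

So the failing allocation is almost certainly the row-distance object (plus hclust's working copies), not the data matrix.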
Are you paging? That might explain the long run times.

How much space are your other objects taking up? The matrix by itself should only require about 13 MB if it is numeric, so I would guess it is some of the other objects in your workspace that are using the memory. Put some gc() calls in your loop to see how much space is being used.

Run it with a subset of the data and see how long it takes. This should give you an estimate of the time, and the space, needed for the entire dataset. Do a 'ps' to see how much memory your process is using, and do one every couple of minutes to see if it is growing. You can always use Rprof() to get an idea of where the time is being spent (use it on a small subset).

On 7/18/07, zhihua li <lzhtom at hotmail.com> wrote:
> [original message quoted in full; trimmed here]
>
> ______________________________________________
> R-help at stat.math.ethz.ch mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.

--
Jim Holtman
Cincinnati, OH
+1 513 646 9390

What is the problem you are trying to solve?
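The subset-first suggestion can be sketched like this (the 500-row subset size, the simulated data, and the use of hclust(dist(...)) as a stand-in for heatmap's row-clustering step are all illustrative assumptions):

```r
# Time and measure the row-clustering step on a small random subset first.
set.seed(1)
x <- matrix(rnorm(16000 * 100), nrow = 16000)   # stand-in for the real data

sub <- x[sample(nrow(x), 500), ]                # 500 rows runs in seconds

gc(reset = TRUE)                                # reset the "max used" counters
t_sub <- system.time(hc <- hclust(dist(sub)))
print(t_sub)
print(gc())                                     # peak memory for the subset run

# dist() storage and much of hclust()'s work grow roughly quadratically in
# the number of rows, so a crude full-size estimate is
# t_sub * (16000 / 500)^2, i.e. about 1000x the subset time.
```

Rprof() can wrap the same subset call (Rprof(); hclust(dist(sub)); Rprof(NULL); summaryRprof()) to show where the time actually goes.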
You might try running top while R runs, to get a better idea of what is happening.

64-bit R takes more memory than 32-bit (longer pointers), and for a large problem I would say that 2 GB of RAM is a minimum if you want any speed. The slowness is likely related to needing to use swap space, and the "cannot allocate" error means you have run out of both RAM and swap. If you are close to finishing your calculation you may resolve things by increasing swap, but don't expect it to be fast.

There is also a possibility that your user id is restricted, but I'm not sure how that works anymore. It used to be controlled by ulimit, but that does not seem to be the case in newer versions of Linux.

If you are still debugging your code, and there is some chance you are just gobbling up memory endlessly until it runs out, then you can speed things up (i.e. fail more quickly) by turning swap off. There are debugging situations where this turns out to be useful.

HTH,
Paul

zhihua li wrote:
> [original message quoted in full; trimmed here]
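One workaround not raised in the thread, sketched here as an assumption about the poster's needs: heatmap()'s Rowv = NA argument suppresses row clustering and reordering entirely, so the ~1 GB row-distance object is never built. The file name and matrix below are illustrative (2000 rows so the demo runs quickly; the same call works for 16000).

```r
# If the row dendrogram is not essential, skip the row dist()/hclust() step:
# Rowv = NA tells heatmap() not to cluster or reorder the rows at all.
set.seed(1)
x <- matrix(rnorm(2000 * 100), nrow = 2000)

pdf("heatmap_norowclust.pdf", width = 7, height = 9)
heatmap(x, Rowv = NA)   # the 100 columns are still clustered and reordered
dev.off()
```

With Rowv = NA the rows are drawn in input order, so presorting the matrix by some statistic of interest (e.g. row variance) is a common substitute for clustering.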