On Thu, 21 Oct 1999, Manuel wrote:

> I hope that someone has had similar trouble and will be able to
> help us:
>
> We have installed the R package on a Digital workstation with 500 MB
> of RAM, running under Unix. The package works fine, but when we try
> to start the program with more than 120 MB (--vsize 120M) the
> workstation refuses to allocate this memory. The message that we get
> is:
>     Fatal error: Could not allocate memory for vector heap.
> [...]

I have no problem allocating --vsize 250M using R 0.65.1 on either
Debian GNU/Linux or Solaris 2.7. In fact, I can allocate --vsize 1000M
under Solaris, which is substantially larger than physical memory:

    wompom% ~/Rarchive/R --vsize 1000M

then

    R> gc()
                free     total
    Ncells    128269    250000
    Vcells 131024950 131072000

Thomas Lumley
Assistant Professor, Biostatistics
University of Washington, Seattle

-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-
r-help mailing list -- Read http://www.ci.tuwien.ac.at/~hornik/R/R-FAQ.html
Send "info", "help", or "[un]subscribe" (in the "body", not the subject!)
To: r-help-request at stat.math.ethz.ch _._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._
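As a quick cross-check of the gc() figures above: in R of this vintage the vector heap is made of 8-byte Vcells, so a --vsize of 1000M should correspond to 1000 * 2^20 / 8 cells. A minimal shell-arithmetic sketch (the 8-byte Vcell size is an assumption about this R version, not stated in the message):

```shell
# --vsize sets the vector heap size; each Vcell is assumed to be 8 bytes,
# so 1000M of heap should give 1000 * 2^20 / 8 Vcells.
vsize_mb=1000
bytes_per_vcell=8
vcells=$(( vsize_mb * 1024 * 1024 / bytes_per_vcell ))
echo "$vcells"   # 131072000 -- matches the Vcells total in the gc() output
```

The result agrees with the `Vcells ... 131072000` total that gc() reports, which is a handy way to verify that the heap you asked for is the heap you got.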
I hope that someone has had similar trouble and will be able to help us:

We have installed the R package on a Digital workstation with 500 MB of
RAM, running under Unix. The package works fine, but when we try to
start the program with more than 120 MB (--vsize 120M) the workstation
refuses to allocate this memory. The message that we get is:

    Fatal error: Could not allocate memory for vector heap.

Someone told us that the solution was an appropriate ulimit call, but
when we do `ulimit -a` we get only the number 1048576, which we figure
is the data segment size. When we do

    ulimit -d unlimited
    ulimit -s unlimited
    ulimit -m unlimited
    ulimit -v unlimited

we get the following message: "Requested ulimit exceeds hard limit."
We think this means that we have no limit on the amount of memory that
can be allocated.

We have installed the same version of the program under Linux (Red Hat
6.0) and were also unable to allocate more than 120 MB.

I would be very grateful if someone could give us some new advice on
how to solve the problem.

Note: I don't know whether the R package is able to allocate more than
120 MB. We need about 250 MB of memory because we are currently dealing
with high-dimensional problems.

Manuel.

_._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._
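A note on the ulimit behaviour described above: "Requested ulimit exceeds hard limit" actually indicates the opposite of "no limit" — it means the shell's soft limit cannot be raised past a finite hard limit from an unprivileged session. In sh/ksh/bash-style shells the two can be inspected separately, which is a useful first diagnostic (a sketch, assuming a shell with the standard `-S`/`-H` flags):

```shell
# Soft limits are what a process currently gets; hard limits are the
# ceiling an unprivileged user may raise the soft limit to.
ulimit -S -d   # soft data-segment limit (usually in kB)
ulimit -H -d   # hard data-segment limit -- if this is finite,
               # "ulimit -d unlimited" will be refused
```

If the hard limit printed here is finite (e.g. 1048576 kB = 1 GB on many Digital Unix installs), only root or a kernel reconfiguration can raise it, which is what the reply below this message addresses.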
On Thu, 21 Oct 1999, you wrote:

> I hope that someone has had similar trouble and will be able to
> help us:
>
> We have installed the R package on a Digital workstation with 500 MB
> of RAM, running under Unix. The package works fine, but when we try
> to start the program with more than 120 MB (--vsize 120M) the
> workstation refuses to allocate this memory. The message that we get
> is:
>     Fatal error: Could not allocate memory for vector heap.

Manuel, you could try re-setting (read: raising into the stratosphere)
the hard limits in the kernel:

1. Create two files, vm.stanza and proc.stanza:

    -----vm.stanza------
    vm:
        ubc-maxpercent = 80
        ubc-borrowpercent = 10
        vm-maxwire = 104857600
        vm-maxvas = 2134217728
        vm-vpagemax = 16000
        vm-syswiredpercent = 90
    -----end------

    -----proc.stanza------
    proc:
        per-proc-stack-size = 222097152
        max-per-proc-stack-size = 222097152
        per-proc-data-size = 1134217728
        max-per-proc-data-size = 2134217728
        max-per-proc-address-space = 2134217728
        per-proc-address-space = 2134217728
    -----end------

2. Load these into the /etc/sysconfigtab file and the in-core copy with:

    # sysconfigdb -a -f vm.stanza vm
    # sysconfigdb -a -f proc.stanza proc
    # reboot

If you already have a vm: or proc: section in /etc/sysconfigtab, then
replace the -a flag with -u; see the sysconfigdb man page.

Make sure you have /sbin/swapdefault pointing to your /dev/rz0b swap
partition, otherwise strange things may happen :)

Hope this helps

Ian

--
Ian Thurlbeck                      http://www.stams.strath.ac.uk/
Statistics and Modelling Science, University of Strathclyde
Livingstone Tower, 26 Richmond Street, Glasgow, UK, G1 1XH
Tel: +44 (0)141 548 3667           Fax: +44 (0)141 552 2079
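The stanza values above are byte counts. A quick shell-arithmetic sanity check (no Tru64 system needed) shows that the proposed per-process address-space ceiling comfortably covers the ~250 MB Manuel asked for, while staying just under the 2^31-byte signed-32-bit boundary:

```shell
# max-per-proc-address-space from proc.stanza, in bytes
max_vas=2134217728
echo $(( max_vas / 1048576 ))       # 2035 -- i.e. about 2 GB in MB
echo $(( 2147483648 - max_vas ))    # 13265920 -- headroom below 2^31
```

Keeping the limits below 2^31 is prudent on a system where some kernel or library interfaces may still treat sizes as signed 32-bit quantities (an assumption about this era of Digital Unix, not something the post states).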