Hayes, Daniel
2008-Sep-02 13:47 UTC
[R] receiving "Error: cannot allocate vector of size 1.5 Gb"
Dear all,

In my attempt to run the modelling command below in R 2.7.0 under Windows XP (4 GB RAM with the /3GB switch set), I receive the following error:

Error: cannot allocate vector of size 1.5 Gb

I have searched a bit and have tried adding --max-mem-size=3071M to the command line (when set to 3G, I get an error saying that 3072M is too much).

I also ran:

> memory.size()
[1] 11.26125
> memory.size(max=T)
[1] 13.4375

Modelling script:

model.females <- quote(gamlss(WAZ11 ~ cs(sqrtage, df=12) + country,
                              sigma.formula = ~cs(sqrtage, df=3) + country,
                              nu.formula = ~cs(sqrtage, df=1),
                              tau.formula = ~cs(sqrtage, df=1),
                              data = females, family = BCPE, control = con))
fit.females <- eval(model.females)

The data frame females (1,654 KB) being modelled by the GAMLSS package contains 158,533 observations.

I have also installed various memory-optimization programs under XP, but to no avail. I believe I may need to set Vcells and Ncells, but I am not sure which of them, nor to what limits. Any other help in maximizing my RAM usage in R would be greatly appreciated.

I am quite a novice, so please excuse any obvious mistakes or omissions. Thank you in advance for your help.

Dr. Daniel Hayes
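For reference, a minimal sketch of the memory calls involved, assuming a stock 32-bit R 2.7.0 session on Windows XP (the returned values vary by machine; memory.limit() is the in-session counterpart of the --max-mem-size startup flag):

memory.size()              # MB of memory currently in use by R
memory.size(max = TRUE)    # most MB obtained from Windows so far
memory.limit()             # current per-session limit, in MB
memory.limit(size = 3071)  # raise the limit; per the post, 3072M is rejected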
Rory.WINSTON at rbs.com
2008-Sep-02 16:38 UTC
[R] receiving "Error: cannot allocate vector of size 1.5 Gb"
See http://cran.r-project.org/bin/windows/base/rw-FAQ.html#There-seems-to-be-a-limit-on-the-memory-it-uses_0021

Rory Winston
RBS Global Banking & Markets
Office: +44 20 7085 4476
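Beyond raising the limit, one common piece of housekeeping before a large fit is to clear objects the call does not need, so the allocator has more free address space to work with. A rough sketch, assuming the workspace holds only what the gamlss() call in the original post references (females and con):

rm(list = setdiff(ls(), c("females", "con")))  # keep only what the fit needs
gc()  # run the garbage collector and print a memory-usage summary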
Prof Brian Ripley
2008-Sep-02 17:28 UTC
[R] receiving "Error: cannot allocate vector of size 1.5 Gb"
Please study the rw-FAQ. With a 2GB address space your chance of getting a 1.5GB contiguous block is essentially zero.

--
Brian D. Ripley,                  ripley at stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel:  +44 1865 272861 (self)
1 South Parks Road,                     +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595
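The arithmetic behind that reply is worth spelling out. A 1.5 Gb numeric vector holds 1.5 * 2^30 / 8, i.e. about 201 million, doubles; set against 158,533 observations, that is a dense intermediate of roughly 1,270 columns. A sketch (illustrative only; the thread does not show which intermediate matrix gamlss is building):

elems <- 1.5 * 2^30 / 8   # doubles in a 1.5 Gb allocation: 201,326,592
elems / 158533            # ~1270 columns if it is an n-by-p model matrix
# Even with a ~3 Gb limit, this single allocation needs one contiguous
# free block of 1.5 Gb, which DLLs and earlier allocations scattered
# through a 32-bit address space almost always prevent.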