I am running a VERY LARGE regression (many factors, many rows of data) using lm.

I think I have my memory set as high as possible. (I ran "memory.limit(size = 4000)".)

Is there anything I can do? (FYI, I "think" I have removed all data I am not using, and I "think" I have only the data needed for the regression loaded.) Thanks.
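[As a quick check, not part of the original post: the objects still in the workspace and their sizes can be listed as below, to confirm that only the regression data remains loaded.]

## List the objects still in the workspace, largest first, to verify
## that only the data needed for the regression is actually loaded.
sizes <- sapply(ls(), function(nm) object.size(get(nm)))
sort(sizes, decreasing = TRUE)

## Report memory currently in use (and trigger garbage collection)
## after any rm() calls.
gc()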
Prof Brian Ripley
2005-Nov-23 09:06 UTC
[R] running out of memory while running a VERY LARGE regression
On Tue, 22 Nov 2005, t c wrote:

> I am running a VERY LARGE regression (many factors, many rows of data)
> using lm.
>
> I think I have my memory set as high as possible. (I ran
> "memory.limit(size = 4000)".)
>
> Is there anything I can do? (FYI, I "think" I have removed all data I
> am not using, and I "think" I have only the data needed for the
> regression loaded.) Thanks.

Since you mention memory.limit, I guess you are using Windows without telling us. If so, have you set up the /3GB switch (see the rw-FAQ Q2.9) and modified the R executables? (The modification is not necessary if you use the current R-patched available from CRAN.)

You will be able to save memory by using lm.fit rather than lm, perhaps running a session containing just the model matrix and the response. (Unless of course you run out of memory forming the model matrix.)

The best answer is to use a 64-bit OS and a 64-bit build of R.

> PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html

Please do as it asks: tell us your OS, do not send HTML mail, and report the exact problem with the error messages.

--
Brian D. Ripley, ripley at stats.ox.ac.uk
Professor of Applied Statistics, http://www.stats.ox.ac.uk/~ripley/
University of Oxford, Tel: +44 1865 272861 (self)
1 South Parks Road, +44 1865 272866 (PA)
Oxford OX1 3TG, UK, Fax: +44 1865 272595
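[A minimal sketch of the lm.fit route suggested in the reply above, assuming a data frame dat that holds the response y and the predictor columns; these object names are illustrative, not taken from the thread.]

## Build the model matrix and response once, then drop everything else.
X <- model.matrix(y ~ ., data = dat)
y <- dat$y
rm(dat); gc()   # free the original data frame before fitting

## lm.fit() does the QR fit without keeping the model frame, terms and
## other bookkeeping that lm() stores, so it needs less memory on large
## problems -- though, as noted above, building X itself may already
## exhaust memory.
fit <- lm.fit(X, y)
head(fit$coefficients)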
Seemingly Similar Threads
- Cannot allocate large vectors (running out of memory?)
- Easy method to set user-mode virtual memory space in Windows Vista and 7
- RODBC, optimizing memory, "Error: cannot allocate vector of size 522 Kb".
- Memory available to 32-bit R app on 64-bit machine
- Memory problems with large dataset in rpart