Hi,

I am trying to do a large glm and running into this message.

Error: cannot allocate vector of size 3725426 Kb
In addition: Warning message:
Reached total allocation of 494Mb: see help(memory.size)

Am I simply out of memory (I only have .5 GB)? Is there something I can do?

Stephen
Stephen Choularton wrote:
> Hi
>
> I am trying to do a large glm and running into this message.
>
> Error: cannot allocate vector of size 3725426 Kb
> In addition: Warning message:
> Reached total allocation of 494Mb: see help(memory.size)
>
> Am I simply out of memory (I only have .5 gig)?
>
> Is there something I can do?

You have to rethink whether the analysis you are doing is sensible this way, or whether you can respecify things. R claims to need almost 4 GB(!) for the next memory allocation step, so you will get into trouble even on huge machines...

Uwe Ligges

> Stephen
>
> ______________________________________________
> R-help at stat.math.ethz.ch mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html
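[Neither reply shows code, but two workarounds commonly suggested for this error are raising the Windows allocation cap and fitting the model in bounded memory. A sketch only, not from the thread: `mydata`, `y`, `x1`, and `x2` are placeholder names for the poster's data, and `biglm` is a CRAN package, not something the repliers mention.]

```r
## Sketch only -- not from the thread. 'mydata', 'y', 'x1', 'x2'
## are placeholder names for the poster's data.

## (1) On 32-bit Windows R (the 494 Mb default cap suggests Windows),
##     the allocation limit can be raised, hardware permitting:
# memory.limit(size = 1024)   # request up to 1 GB; see help(memory.size)

## (2) Fit the GLM in bounded memory with the CRAN package 'biglm':
##     bigglm() processes the data frame in chunks instead of all at once.
library(biglm)
fit <- bigglm(y ~ x1 + x2, data = mydata,
              family = binomial(), chunksize = 5000)
summary(fit)
```

[Even so, Uwe's point stands: if a single allocation of ~3.7 GB is needed, the model specification itself probably needs rethinking, not just the memory limit.]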
Hello,

On Wednesday, 16 February 2005 at 20:48, Stephen Choularton wrote:
> Hi
>
> I am trying to do a large glm and running into this message.
>
> Error: cannot allocate vector of size 3725426 Kb
> In addition: Warning message:
> Reached total allocation of 494Mb: see help(memory.size)
>
> Am I simply out of memory (I only have .5 gig)?
>
> Is there something I can do?

This question has been answered a hundred times on this list. The best idea is to search for "memory datasets" in the list's mail archive, which gives you 246 hits! Please give some more information on what system you are using (32- or 64-bit), etc.

Regards,
Thomas