Hello all,

I've been working with R & Fridolin Wild's lsa package a bit over the past few months, but I'm still pretty much a novice. I have a lot of files that I want to use to create a semantic space. When I run the initial textmatrix(), it runs for about 3-4 hours and eventually gives me an error. It's always "ERROR: cannot allocate vector of size xxx Kb". I imagine this might be my computer running out of memory, but I'm not sure, so I thought I would send this to the community at large for any help/thoughts.

I searched the archives and didn't really find anything that specifically speaks to my situation. So I guess I have a few questions. First, is this actually an issue with the machine running out of memory? If not, what might be the cause of the error? If so, is there a way to minimize the amount of memory used by the vector data structures (e.g., Berkeley DB)?

Thanks,
Gabe Wingfield
IT and Program Specialist I
Center for Applied Social Research
University of Oklahoma
2 Partners Place
3100 Monitor, Suite 100
Norman, OK 73072
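[For context, a minimal sketch of the kind of call described above. The directory name is hypothetical; textmatrix() from the lsa package takes a directory of plain-text files as its first argument.]

    library(lsa)

    # Hypothetical directory holding the corpus files
    # (10-50 Kb each in the poster's case).
    corpus_dir <- "corpus/"

    # textmatrix() reads every file in the directory and builds a dense
    # term-by-document matrix in memory; with enough documents this is
    # the step that fails with "cannot allocate vector of size ... Kb".
    tm <- textmatrix(corpus_dir)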
Yes, your error is due to running out of memory. This is probably one of the most frequent questions asked here, so if you search again you will find a lot of advice on how to work around it. As you learn more about R programming you will learn how to store data more efficiently, use rm() to remove variables you no longer need, and gc() to garbage-collect and free up memory. Try to open only the files you need, do some of the analysis, get rid of everything you don't need, then do some more analysis.

Thanks,
Roger

-----Original Message-----
From: r-help-bounces at stat.math.ethz.ch [mailto:r-help-bounces at stat.math.ethz.ch] On Behalf Of Wingfield, Jerad G.
Sent: Thursday, March 15, 2007 12:27 PM
To: r-help at stat.math.ethz.ch
Subject: [R] Cannot allocate vector size of... ?

[original message snipped]
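[A rough sketch of the batch-and-free workflow Roger describes. The path and the analysis step are hypothetical stand-ins; rm() and gc() are base R.]

    files   <- list.files("corpus/", full.names = TRUE)       # hypothetical path
    batches <- split(files, ceiling(seq_along(files) / 1000)) # ~1000 files per batch

    # Stand-in for whatever analysis is actually being run on each batch.
    process_batch <- function(txt) sapply(txt, length)        # e.g., line counts

    results <- list()
    for (i in seq_along(batches)) {
      batch_text   <- lapply(batches[[i]], readLines)  # load only this batch
      results[[i]] <- process_batch(batch_text)        # keep only the small result
      rm(batch_text)                                   # drop what is no longer needed
      gc()                                             # give the memory back
    }

The point is that only one batch of raw text is ever resident at a time; the accumulated results are small compared to the inputs.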
Oops. Yep, I totally forgot my specs and such. I'm currently running R-2.4.1 on a 64-bit Linux box (Fedora Core 6) with 4 GB of RAM. The files are 10-50 Kb on average, but this error came about when working with only ~16,000 of them. The final size of the corpus is ~1.7M files. So, obviously, this memory thing is going to be a large issue for me.

I'm going back through the help-list archives, and now it looks like I have S Poetry to read as well. Thanks for all the suggestions. Any others are greatly appreciated.

Gabe Wingfield
IT and Program Specialist I
Center for Applied Social Research
University of Oklahoma
2 Partners Place
3100 Monitor, Suite 100
Norman, OK 73072

-----Original Message-----
From: Patrick Burns [mailto:pburns at pburns.seanet.com]
Sent: Thursday, March 15, 2007 12:31 PM
To: Wingfield, Jerad G.
Subject: Re: [R] Cannot allocate vector size of... ?

You can find a few things not to do (things that waste memory) in S Poetry. You don't say how much memory your machine has, nor how big your objects are. However, it is possible that getting more memory for your machine might be the best thing to do.

Patrick Burns
patrick at burns-stat.com
+44 (0)20 8525 0696
http://www.burns-stat.com
(home of S Poetry and "A Guide for the Unwilling S User")

Wingfield, Jerad G. wrote:
[original message snipped]
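[To put the follow-up numbers in perspective, a back-of-envelope check. The vocabulary size here is an assumption, not from the thread; the document counts are the poster's.]

    terms <- 100000                 # assumed vocabulary size (not from the thread)
    terms * 16000   * 8 / 2^30     # ~16,000 files already tried: ~11.9 GiB dense
    terms * 1700000 * 8 / 2^30     # full 1.7M-file corpus:      ~1267 GiB dense

At 8 bytes per double, even the ~16,000-file subset would need roughly 12 GiB as a dense term-by-document matrix, already past 4 GB of RAM, and the full corpus runs to over a terabyte. Whatever the true vocabulary size, a corpus at this scale would need some form of sparse representation, out-of-core processing, or sampling rather than a single dense matrix.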