Hi Folks,

I've been bumping my head against the 4GB limit for 32-bit R. I can't go to 64-bit R due to package compatibility issues (RODBC - possible but painful, xlsReadWrite - not possible, and others). I have a number of big dataframes whose columns hold all sorts of data types - factor, character, integer, etc. I also run and save models that keep copies of the modeled data inside the model objects (mle2 objects, to be specific).

I'm searching for a way to cache some of these dataframes and objects to virtual memory (I think I'm using the right terminology...). I've read around, and while bigmemory, ff and the like would likely suit my purposes if I were just dealing with numeric matrices, I'm dealing with dataframes and objects.

Any thoughts would be greatly appreciated!

Thanks,
Allie
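For reference, a quick way to see how close a session is to the 32-bit cap (both calls are Windows-only base-R functions):

    memory.limit()            # session memory cap in MB
    memory.size()             # MB currently in use by this R session
    memory.size(max = TRUE)   # peak MB obtained from the OS so far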
System Info:
R 2.14.2
Windows 7 Pro x64 SP1
8GB RAM
I believe ff has a data frame class. As for your object data, I'm less clear; how big is it?
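If it helps answer the size question, something like this will report the in-memory footprint (fit is just a placeholder name for one of the mle2 objects, big_df for one of the dataframes):

    print(object.size(fit), units = "Mb")     # size of one model object
    print(object.size(big_df), units = "Mb")  # size of one dataframe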
Hi Allie,

When you are working with the ff package, the counterpart of a data.frame is called an ffdf (ff data frame). It can handle the types you are talking about - factor, integer, and so on - though character columns will be stored as factors. So your columns do not all have to be of one type.

Good luck trying out the package.

Jan
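P.S. A minimal sketch of what the conversion might look like (big_df is just a placeholder name; this assumes the ff package from CRAN):

    library(ff)

    # ff has no character vectors, so turn character columns into factors first
    big_df[] <- lapply(big_df, function(x) if (is.character(x)) factor(x) else x)

    # create an on-disk ffdf backed by files under the ff temp directory
    big_ffdf <- as.ffdf(big_df)

    # very large files can also be read straight into an ffdf
    # (the file name here is just an example):
    # big_ffdf <- read.csv.ffdf(file = "big_file.csv")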