Dear R users,

I have been using a strategy of dynamic data extraction from raw files, but it takes a very long time. To save time, I am planning instead to generate a data set of size 1500 x 20000, with each data point a 9-digit decimal number. I know R is limited to 2^31 - 1 elements per vector, and my data set will not exceed that limit. But my laptop has only 2 GB of RAM and runs 32-bit Windows (XP or Vista), and I have run into R memory problems before.

Please let me know your opinion based on your experience. Thanks a lot!

- John
If your 1500 x 20000 matrix is all numeric, it should take up about 240 MB of memory (1500 * 20000 * 8 bytes per double). That fits easily within your laptop's 2 GB and still leaves room for the several copies that may arise during processing. Exactly what are you going to be doing with the data? A lot will depend on the functions/procedures you will be calling and the type of transformations you might apply.

--
Jim Holtman
Cincinnati, OH
+1 513 646 9390

What is the problem that you are trying to solve?
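A minimal sketch of that arithmetic, in case it helps; object.size() confirms the estimate on a real allocation (the name m is just for illustration):

    # each numeric (double) element takes 8 bytes
    1500 * 20000 * 8 / 1024^2             # ~228.9 Mb, i.e. 240 MB in decimal units

    m <- matrix(0, nrow = 1500, ncol = 20000)
    print(object.size(m), units = "Mb")   # reports about 228.9 Mb
    rm(m); gc()                           # release the memory again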
On Tue, May 19, 2009 at 11:59 PM, tsunhin wong <thjwong@gmail.com> wrote:
> I am planning to generate a data set of size 1500 x 20000 with each
> data point a 9-digit decimal number. [...] But my laptop only has 2 GB
> and is running 32-bit Windows (XP or Vista).

32-bit R on Windows XP with 2 GB of RAM has no problem with a matrix this size (not just integers, but also numerics):

> system.time(mm <- matrix(numeric(1500 * 20000), 1500, 20000))
   user  system elapsed
   0.59    0.23    1.87
> system.time(nn <- matrix(runif(1500 * 20000), 1500, 20000))
   user  system elapsed
   2.66    0.64   13.39
> system.time(oo <- nn + 3)
   user  system elapsed
   0.24    0.17    0.41
> system.time(pp <- oo - oo)
   user  system elapsed
   0.15    0.13    0.28
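One caveat: each of those matrices is about 229 Mb, so mm, nn, oo, and pp together already account for roughly 900 Mb, approaching what a 32-bit R session can comfortably hold. A minimal sketch of how to keep an eye on this (gc() and, on Windows, memory.limit() are the standard tools):

    gc()               # report current usage and trigger a garbage collection
    memory.limit()     # Windows only: the maximum memory R may claim, in Mb
    rm(oo, pp); gc()   # drop intermediates you no longer need and reclaim the space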