If your 1500 x 20000 matrix is all numeric, it should take up about 240 MB of
memory. That should easily fit within the 2 GB on your laptop and still
leave room for several copies that might arise during processing.
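
For a quick sanity check, you can do the arithmetic in R yourself (a minimal
sketch; the zero-filled matrix below is just a stand-in for your real data):

    # a double takes 8 bytes, so the estimate is rows * cols * 8
    1500 * 20000 * 8 / 1024^2        # ~229 MiB, i.e. about 240 MB

    # or build a placeholder matrix of the same shape and measure it
    m <- matrix(0, nrow = 1500, ncol = 20000)
    object.size(m)                   # roughly 240,000,000 bytes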
Exactly what are you going to be doing with the data? A lot will depend on
the functions/procedures that you will be calling and the type of
transformations you might be doing.
On Tue, May 19, 2009 at 11:59 PM, tsunhin wong <thjwong@gmail.com> wrote:
> Dear R users,
>
> I have been using a strategy of dynamically extracting data from raw
> files, but it takes a very long time.
> To save time, I am planning to generate a data set of size 1500 x 20000,
> with each data point a 9-digit decimal number.
> I know R is limited to 2^31 - 1 and that my data set is not going to
> exceed this limit. But my laptop only has 2 GB of RAM and is running
> 32-bit Windows XP or Vista.
>
> I have run into R memory problems before. Please let me know your
> opinion based on your experience.
> Thanks a lot!
>
> - John
>
> ______________________________________________
> R-help@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide
> http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.
>
--
Jim Holtman
Cincinnati, OH
+1 513 646 9390
What is the problem that you are trying to solve?