Hello Hui,
On Tue, Jun 12, 2012 at 2:12 AM, Hui Wang <huiwang.biostats at gmail.com> wrote:
> Dear all,
>
> I've run into a problem handling large matrices in R. I'd like to
> define a 70000*70000 matrix in R on Platform:
> x86_64-apple-darwin9.8.0/x86_64 (64-bit), but it seems to run out of memory
> to handle this. Is it due to a memory limit in R or to the RAM of my laptop? If
> I use a cluster with larger RAM, will that be able to handle this large
> matrix in R? Thanks much!
Do you really mean 7e4 by 7e4? That would be 4.9e9 entries. Each
numeric entry in R is an 8-byte double, so you would need close to
40 gigabytes of storage for this matrix alone. I'm not sure there is
a laptop on the market with that amount of RAM.
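A quick back-of-the-envelope check in R (just a sketch, using the
dimensions from your mail):

  n <- 70000
  entries <- n^2          # 4.9e9 entries
  bytes <- entries * 8    # 8 bytes per numeric (double)
  bytes / 2^30            # roughly 36.5 GiB, i.e. close to 40 GB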
What do you need such a large matrix for? If most of the elements
are zero, you don't want a regular (dense) matrix holding the data;
use some sort of sparse matrix implementation instead.
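If that is your situation, a minimal sketch with the Matrix package
could look like this (the indices and values below are made up
purely for illustration):

  library(Matrix)
  # store only the non-zero entries instead of the full 70000 x 70000 grid
  i <- c(1, 50000, 70000)     # row indices of non-zero entries (illustrative)
  j <- c(2, 50001, 69999)     # column indices (illustrative)
  x <- c(1.5, -2.0, 3.7)      # the non-zero values
  m <- sparseMatrix(i = i, j = j, x = x, dims = c(70000, 70000))
  object.size(m)              # well under a megabyte instead of ~40 GB

Whether that helps depends entirely on how dense your data actually is,
of course.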
Take care
Oliver
--
Oliver Ruebenacker
Bioinformatics Consultant (http://www.knowomics.com/wiki/Oliver_Ruebenacker)
Knowomics, The Bioinformatics Network (http://www.knowomics.com)
SBPAX: Turning Bio Knowledge into Math Models (http://www.sbpax.org)