On Mon, 1 Aug 2011, Paul Rodriguez wrote:
> Hello R experts,
>
> I'm trying to test R in a shared memory environment in which
> addressable memory is aggregated to about 600G.
>
> However, I get an error of 'too many elements specified' when I try
> creating a 45K x 100K matrix.
>
> I tried running R with a --max-nsize=50000000000 option, but got the same
> message.
>
> Is there a way to create such large matrices?
No. See ?"Memory-limits" (a matrix in R is also a vector).
NB: setting a limit on Ncells (there normally is not one) isn't going
to help allocation of vectors, is it?
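
For instance, a quick check at the R prompt (a rough illustration of the
limit documented in ?"Memory-limits") makes the arithmetic clear -- the
requested matrix simply has more elements than any single vector can hold,
regardless of how much RAM is available:

  > 45000 * 100000                        # elements in the requested matrix
  [1] 4.5e+09
  > .Machine$integer.max                  # maximum length of one vector, 2^31 - 1
  [1] 2147483647
  > 45000 * 100000 > .Machine$integer.max
  [1] TRUE

(--max-nsize only caps the number of cons cells, Ncells; no command-line
memory option lifts the 2^31 - 1 element limit.)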
> thanks,
> Paul Rodriguez
--
Brian D. Ripley,                  ripley at stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel: +44 1865 272861 (self)
1 South Parks Road,                    +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax: +44 1865 272595