On Mon, Oct 08, 2001 at 06:13:05PM -0600, Marcus G. Daniels wrote:
> By my reading, R has a hard limit on INT_MAX bytes for vector
> allocations. On the sparcv9 architecture, INT_MAX is a 32-bit
> quantity, even though pointers can be 64 bits. Has any thought been
> given to use of > 2GB of virtual memory on systems like this?

Thought, yes; action, no :-)
I think we're stuck for several reasons with int for the internal R
integer type (FORTRAN compatibility is one reason, I believe), so there
will always be a 2G limit on the element count of an individual vector.
But it should be possible to allow total memory use to match the word
size. Unfortunately, in writing the GC we used int in too many places,
and a fairly careful review and rewrite is needed to replace the
appropriate int declarations (and no others) with size_t or ssize_t or
something analogous. A few other places may be affected as well, but
the code in memory.c should be the bulk of it. It's on my to-do list,
but not terribly high at the moment.
luke
--
Luke Tierney
University of Minnesota              Phone: 612-625-7843
School of Statistics                 Fax:   612-624-8868
313 Ford Hall, 224 Church St. S.E.   email: luke at stat.umn.edu
Minneapolis, MN 55455 USA            WWW:   http://www.stat.umn.edu
-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-
r-help mailing list -- Read http://www.ci.tuwien.ac.at/~hornik/R/R-FAQ.html
Send "info", "help", or "[un]subscribe"
(in the "body", not the subject!) To: r-help-request at stat.math.ethz.ch
_._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._