Point 1 is as documented: you have exceeded the maximum representable
integer, and the documentation does say that NA is given in that case.
So the only 'odd' thing here is that you reported this without reading
the documentation.
Point 2 is R not using the correct units for --max-vsize (it used the
number of Vcells, as was once documented), and I have now fixed it.
But I do wonder why you are using --max-vsize: the documentation says
it is very rarely needed, and I suspect that there are better ways to
do this.
Also, you ignored the posting guide and did not tell us the 'at a
minimum' information requested: what OS was this, and was it a 32- or
64-bit R if a 64-bit OS?
I don't find reporting values of several GB in bytes very useful, but
then mem.limits() is not useful to me either ...
On Thu, 21 Jul 2011, Christophe Rhodes wrote:
> Hi,
>
> In both R 2.13 and the SVN trunk, I observe odd behaviour with the
> --max-vsize command-line argument:
>
> 1. passing a largeish value (about 260M or greater) makes mem.limits()
> report NA for the vsize limit; gc() continues to report a value...
>
> 2. ...but that value (and the actual limit) is wrong by a factor of 8.
>
> I attach a patch for issue 2, lightly tested. I believe that fixing
> issue 1 involves changing the return convention of do_memlimits -- not
> returning a specialized integer vector, but a more general numeric; I
> wasn't confident enough to do that.
>
>
--
Brian D. Ripley, ripley at stats.ox.ac.uk
Professor of Applied Statistics, http://www.stats.ox.ac.uk/~ripley/
University of Oxford, Tel: +44 1865 272861 (self)
1 South Parks Road, +44 1865 272866 (PA)
Oxford OX1 3TG, UK Fax: +44 1865 272595