What do you think is `slightly inconsistent behavior' here? (You seem to
be quite consistent in not telling us such relevant facts, including your
OS and version of R!)
If you think that the memory usage of R should be monotone in the size of
the problem, your expectations are unfounded. Here it is likely you are
seeing memory fragmentation on a 32-bit OS: see help("Memory-limits").
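As a rough sanity check (assuming the failing request is a single pre-allocated
double column of length nrows, which is my assumption, not something your output
proves), the size in the error message is just what 99999999 doubles would need:

    99999999 * 8 / 1024   # approx. 781250 Kb, cf. "cannot allocate vector of size 781249 Kb"

A contiguous block of around 780 Mb is easily unobtainable in a fragmented
32-bit address space, even when plenty of memory appears to be free in total.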
But starting from nrows=99999999, 'extend nrows by a few more 9's' suggests to
me something like nrows=9999999999, which is invalid and gives a warning
message you did not mention.
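For the record (the exact wording will depend on your unstated version of R),
9999999999 exceeds .Machine$integer.max, so coercing it to an integer, which is
presumably what happens to the nrows argument, gives NA with a warning:

    .Machine$integer.max     # 2147483647, the largest value an R integer can hold
    as.integer(9999999999)   # NA, with a coercion warning

An invalid nrows is then simply ignored (see ?read.table), so the whole file is
read without the large up-front allocation, which would explain why the 'error
goes away'.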
On Tue, 26 Dec 2006, ivo welch wrote:
> dear R experts:
>
> This is just a minor, minor nuisance, but I thought I would point it out:
>
>> dataset <- read.table(file=pipe(cmdline), header =T,
> + na.strings=c("NaN", "C","I","M", "E"), sep=",",
> + as.is=T, nrows=99999999);
> Error: cannot allocate vector of size 781249 Kb
>
> If I extend nrows by a few more 9's, the error goes away. Similarly,
> if I use much fewer observations, the error goes away.
>
> regards,
>
> /iaw
>
> ______________________________________________
> R-help at stat.math.ethz.ch mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide
> http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.
PLEASE do, at long last.
--
Brian D. Ripley,                  ripley at stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel:  +44 1865 272861 (self)
1 South Parks Road,                     +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595