Please read ?"Memory-limits" and the R-admin manual for basic
information.
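As a quick illustration of the "objects are held in memory" point discussed below, this sketch shows how to measure what a single object and the whole session are consuming (the help page itself is opened with `?"Memory-limits"`); the object name `x` is just an example:

```r
x <- numeric(1e6)                    # a vector of one million doubles
print(object.size(x), units = "MB")  # about 8 MB: 8 bytes per double, plus a small header
gc()                                 # run the garbage collector and report memory in use
```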
On Thu, 5 Feb 2009, Tom Quarendon wrote:
> I have a general question about R's usage of memory and what limits
> exist on the size of datasets it can deal with.
> My understanding was that all objects in a session are held in memory.
> This implies that you're limited in the size of datasets that you can
> process by the amount of memory you've got access to (be it physical or
> paging). Is this true? Or does R store objects on disk and page them in
> as parts are needed, in the way that SAS does?
That's rather a false dichotomy: paging uses the disk too, so the real
distinction is whether R implements its own virtual-memory system or
relies on the OS's one (it does the latter).
There are also interfaces to DBMSs for use with large datasets: see
the R-data manual and also look at the package list in the FAQ.
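One such interface, sketched below, is the DBI front end with an SQLite back end (packages `DBI` and `RSQLite` assumed installed; the file name `big_data.sqlite` and table `measurements` are hypothetical). Rather than loading the whole table, rows are fetched in chunks so only one chunk is ever in memory:

```r
library(DBI)

con <- dbConnect(RSQLite::SQLite(), "big_data.sqlite")  # hypothetical database file
res <- dbSendQuery(con, "SELECT * FROM measurements")   # hypothetical table

while (!dbHasCompleted(res)) {
  chunk <- dbFetch(res, n = 10000)  # pull 10,000 rows at a time
  # ... process 'chunk' here ...
}

dbClearResult(res)
dbDisconnect(con)
```

The same chunked pattern works with any DBI-compliant back end (MySQL, PostgreSQL, etc.); only the `dbConnect()` call changes.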
> Are there 64 bit versions of R that can therefore deal with much larger
> objects?
Yes, there have been 64-bit versions of R for many years, and they are
in routine use on very large problems.
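To check whether the R you are running is one of those 64-bit builds, a minimal sketch:

```r
# On a 64-bit build of R, pointers are 8 bytes; on a 32-bit build, 4.
.Machine$sizeof.pointer  # 8 on a 64-bit build
R.version$arch           # e.g. "x86_64" on a 64-bit build
```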
> Many thanks.
>
> ______________________________________________
> R-help at r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide
> http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.
>
--
Brian D. Ripley,                  ripley at stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel: +44 1865 272861 (self)
1 South Parks Road,                    +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax: +44 1865 272595