On Fri, 14 Oct 2005, Wensui Liu wrote:
> Dear useRs,
>
> I am wondering what is the fastest and most stable way to read data (pretty
> large) into R. Right now, the methods I can think of are:
> 1) use read.table to read *.csv or *.txt
> 2) use RODBC to read excel or access
>
> But I don't know which is better.
Depends on the data and on how large `pretty large' is.
If your data are numeric, 2) will be faster as you will avoid
numeric->character->numeric conversions. If your data are to be factors,
1) might be as fast.
However, both are pretty much instantaneous unless you have millions
of items to read (and Excel is unlikely to cope well with such numbers).
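For route 2), a minimal RODBC sketch (Windows only; the file and sheet names here are hypothetical, not from the original question):

```r
library(RODBC)                      # assumes the RODBC package is installed
ch <- odbcConnectExcel("big.xls")   # open a DSN-less channel to the workbook
dat <- sqlFetch(ch, "Sheet1")       # read one worksheet as a data frame
close(ch)                           # release the ODBC channel
```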
I just tested reading 1 million numbers from an 18MB file in 3s with
read.table().
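To illustrate route 1) (a sketch, not from the original post; the file name and sizes are hypothetical), the usual way to speed up read.table() on a large all-numeric file is to pre-declare the column classes so no type-guessing or numeric->character->numeric conversion is needed:

```r
# Hypothetical example: reading a large all-numeric CSV quickly.
# colClasses skips per-column type detection, nrows lets R allocate
# storage once, and comment.char = "" disables comment scanning.
dat <- read.table("big.csv", sep = ",", header = TRUE,
                  colClasses = "numeric",  # every column is numeric
                  nrows = 1e6,             # (upper bound on) row count
                  comment.char = "")
```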
--
Brian D. Ripley,                  ripley at stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel: +44 1865 272861 (self)
1 South Parks Road,                    +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax: +44 1865 272595