2009 Sep 23
read.delim very slow in reading files with lots of columns
Hi,
I am trying to read a tab-delimited file into R (Ver. 2.8). The machine I am using is 64bit Linux with 16 GB.
The file is basically a matrix(~600x700000) and as large as 3GB.
The read.delim() ran extremely slow (hours) even with a subset of the file (31 MB with 6x700000)
I monitored the memory usage, and found it constantly only took less than 1% of 16GB memory.
Does read.delim() have difficulty to read files with lots of columns?
Any sugges...
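One common workaround in cases like this is to avoid building a data frame with hundreds of thousands of columns at all. Below is a minimal sketch, assuming the file is entirely numeric with no header row; the file name and dimensions are illustrative stand-ins, not from the original post.

```r
# Sketch only: assumes an all-numeric, tab-delimited file with no header.
# A small temporary file stands in for the real 600 x 700000 matrix.
tf <- tempfile(fileext = ".txt")
write.table(matrix(rnorm(6 * 5), nrow = 6), tf,
            sep = "\t", row.names = FALSE, col.names = FALSE)

# scan() reads every value into one flat numeric vector; reshaping that
# into a matrix avoids constructing a wide data frame, which is where
# read.delim() tends to spend its time with this many columns.
n_cols <- 5                                   # column count of the real file
x <- scan(tf, what = numeric(), sep = "\t")
m <- matrix(x, ncol = n_cols, byrow = TRUE)

# If a data frame is really needed, read.delim() is usually much faster
# when colClasses and nrows are supplied, so it skips per-column type
# guessing and pre-allocates storage:
df <- read.delim(tf, header = FALSE,
                 colClasses = rep("numeric", n_cols), nrows = 6)
```

If the downstream analysis treats the data as a matrix anyway, the `scan()` route also halves the memory churn, since no intermediate per-column lists are created.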