That will be R 2.10.1 if I'm correct.
For reading in csv files, there's a function read.csv() that does just that:
los <- read.csv("file.csv", header = TRUE)
But that's just a detail. You have a memory problem, and it's not
caused by the size of your data frame: on my system, a matrix with
100,000 rows and 75 columns takes only 28 Mb. So I guess your
workspace is cluttered with other stuff.
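To check that yourself, a quick sketch using the standard object.size() and ls() functions (the ~28 Mb figure assumes integer storage; a matrix of doubles of that size takes roughly twice as much):

```r
# Size of a 100,000 x 75 matrix: ~28.6 Mb for integers (4 bytes each)
m_int <- matrix(0L, nrow = 100000, ncol = 75)
print(object.size(m_int), units = "Mb")

# List every object in the workspace with its size, largest first,
# to see what is actually filling up the memory
sizes <- sapply(ls(), function(nm) object.size(get(nm)))
sort(sizes, decreasing = TRUE)
```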
Check the following help pages:
?Memory
?memory.size
?"Memory-limits"
It generally doesn't make much difference, but sometimes calling gc()
can free some memory again.
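For example, removing a large object and then collecting (big_tmp is a hypothetical object name, just for illustration):

```r
big_tmp <- matrix(0, nrow = 100000, ncol = 75)  # ~57 Mb of doubles
rm(big_tmp)  # drop the reference to the object
gc()         # run garbage collection; prints a summary of memory in use
```

Note that gc() mostly matters for returning memory to the OS; R frees unreferenced objects on its own when it needs the space.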
If none of this information helps, please provide us with a bit more
info regarding your system and the content of your current workspace.
Cheers
Joris
On Tue, Jun 8, 2010 at 8:46 AM, dhanush <dhana.sa at gmail.com>
wrote:
> I tried to read a CSV file in R. The file has about 100,000 records and 75
> columns. When used read.delim, I got this error. I am using R ver 10.1.
>
>> los<-read.delim("file.csv",header=T,sep=",")
> Warning message:
> In scan(file, what, nmax, sep, dec, quote, skip, nlines, na.strings,  ... :
>   Reached total allocation of 1535Mb: see help(memory.size)
>
> Thanks
> --
> View this message in context:
http://r.789695.n4.nabble.com/how-to-read-CSV-file-in-R-tp2246930p2246930.html
> Sent from the R help mailing list archive at Nabble.com.
>
> ______________________________________________
> R-help at r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide
http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.
>
--
Joris Meys
Statistical consultant
Ghent University
Faculty of Bioscience Engineering
Department of Applied mathematics, biometrics and process control
tel : +32 9 264 59 87
Joris.Meys at Ugent.be
-------------------------------
Disclaimer : http://helpdesk.ugent.be/e-maildisclaimer.php