2004 Mar 29
data usage
Hello,
For my present project I need to use the data stored in a ca. 100 MB
Stata dataset.
When I import the data into R using:
library("foreign")
x <- read.dta("mydata.dta")
I find that R needs a startling 665 MB of memory!
(In Stata I can simply allocate, say, 128 MB of memory and go ahead.)
Is there any way around this, or should I forget R for analysis of
datasets of this magnitude?
Thanks for your help with this, Edwin.