See help("gc") and read up about garbage collection. Memory measurements
without a preceding gc() are meaningless.
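A minimal sketch of the point (the object name `x` is just an illustration): removing an object does not return its memory immediately; a gc() call reclaims it and returns a summary matrix, after which a measurement means something.

```r
## Sketch: rm() only removes the binding; the memory is reclaimed lazily.
## Calling gc() forces the collection, so a measurement taken afterwards
## reflects memory actually in use.
x <- rnorm(1e6)   # allocate a large vector (about 8 MB)
rm(x)             # binding gone, but memory not yet reclaimed
g <- gc()         # force a garbage collection; returns a summary matrix
g                 # rows Ncells/Vcells; columns include "used" and "(Mb)"
```

(memory.size(), as used below, is Windows-only; on other platforms the gc() summary itself is the thing to look at.)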
You have not told us your version of R, but 2.1.0 and later use much
less memory in write.table. Nevertheless, they still have to convert each
column to character, and that does take some memory.
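A rough illustration of where the transient memory goes. The explicit character copy below is a sketch of the mechanism, not the exact internals of write.table: formatting every column as character means a character version of the data exists alongside the original during the export, and character representations of numerics are typically larger than the numerics themselves.

```r
## Sketch (scaled down from the poster's 70k x 30 data frame): formatting
## numeric columns as character creates a transient copy that is usually
## larger than the original numeric data.
df <- data.frame(matrix(rnorm(7e4 * 3), ncol = 3))
ch <- vapply(df, format, character(nrow(df)))  # the kind of copy made
c(original = object.size(df), as_character = object.size(ch))
```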
On Fri, 22 Jul 2005, Claude Messiaen - Urc Necker wrote:
> Hi R Users,
> After some research I haven't found what I'm looking for.
> I'm manipulating a data frame with 70k rows and 30 variables, and I run
> out of memory when exporting it to a *.txt file.
>
> After some computation I ran:
>
>> memory.size()/1048576.0
> [1] 103.7730
>
> and I make my export :
>
>> write.table(cox, "d:/tablefinal2.txt", row.names=F, sep=';')
>> memory.size()/1048576.0
> [1] 241.9730
>
> Surprised by this, I tried removing some objects:
>> rm(trait,tany,tnor,toth,suivauxdany,dnor,doth,mod1,mod2,mod3,lok1,lok2,lok3,aux,risque,risk)
> and check memory space :
>> memory.size()/1048576.0
> [1] 242.1095
>
> First, I don't understand why the memory used increases when I remove
> objects.
> Next, why does the memory used double when I make an export?
> I look forward to your reply
>
> Claude
>
>
> ______________________________________________
> R-help at stat.math.ethz.ch mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html
>
--
Brian D. Ripley, ripley at stats.ox.ac.uk
Professor of Applied Statistics, http://www.stats.ox.ac.uk/~ripley/
University of Oxford
1 South Parks Road, Oxford OX1 3TG, UK
Tel: +44 1865 272861 (self), +44 1865 272866 (PA)
Fax: +44 1865 272595