The recommended technique is to create objects
at their final size and then subscript into them
with your data.
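
Something like this, with gen() and fun() standing in for
your own functions (the definitions here are placeholders
only, so the sketch runs on its own):

gen <- function() rnorm(100)     # stand-in for your fixed-size data
fun <- function(d) summary(d)    # stand-in for your computation

n <- 1000
out <- vector("list", n)         # allocate all n slots up front
for (i in seq_len(n)) {
  data <- gen()
  out[[i]] <- fun(data)          # fill the pre-existing slot
}

Because the list already has its final length, R never has
to copy a growing list on each iteration.
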
My intuition (which is often brutally wrong) tells
me that your case should not be overly traumatic.
So I'm suspicious that you are fragmenting memory
in other ways as well.
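
One rough way to see where the memory actually goes (just a
sketch) is to compare what the results themselves occupy
with what R as a whole has allocated:

sum(sapply(out, object.size))    # bytes held by the results
gc()                             # the "(Mb)" columns show R's total usage

If the second is far larger than the first, the results are
not what is eating your memory.
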
Patrick Burns
patrick at burns-stat.com
+44 (0)20 8525 0696
http://www.burns-stat.com
(home of S Poetry and "A Guide for the Unwilling S User")
erwann rogard wrote:
> hello,
>
> I have something like:
>
> out <- list()
>
> for (i in 1:n) {
>     data <- gen(...)      # fixed-size data
>     out[[i]] <- fun(data)
> }
>
>
> > object.size(out[[1]])
> 6824
>
> In principle 1 GB should allow
>
> n = 1024^3 / 6824 ≈ 157347?
>
> I have about 2 GB not taken by other processes. However, I can see the
> memory shrinking quite rapidly on my system monitor and have to stop the
> simulation after only n = 300. Why such a discrepancy? Any remedy?
>
> x86_64-pc-linux / RKWard / R 2.8.0 / 4 GB
>
> thanks.
>