Ramiro,
I think the problem is the loop: R doesn't release memory allocated
inside an expression until the expression completes, and a for loop is
an expression, so fit and dataset get duplicated on every iteration. An
alternative approach that I have found successful in similar
circumstances is to use sapply(), like this:
fits <- list()
sapply(1:N, function(i) {
    dataset <- generateDataset(i)
    # <<- so the result lands in fits in the enclosing environment,
    # not in a local copy discarded when the function returns
    fits[[i]] <<- try(memoryHogFunction(dataset, otherParameters))
})
I'm assuming above that you want to save the result of memoryHogFunction
from each iteration.
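Along the same lines, here is a minimal sketch (assuming, as above, that
generateDataset(), memoryHogFunction(), N and otherParameters are your
own; runOne is just a hypothetical helper name) that keeps each
iteration's temporaries local to a function call, so they become garbage
as soon as the call returns:

runOne <- function(i) {
    dataset <- generateDataset(i)
    fit <- try(memoryHogFunction(dataset, otherParameters))
    rm(dataset)  # drop the large temporary before returning
    fit          # only the (comparatively small) fit survives the call
}
fits <- lapply(1:N, runOne)
gc()  # collect whatever the loop left behind

Also keep in mind that gc() reports R's own usage; the footprint you see
in the OS can stay higher, because freed pages are not always returned
to the system. Calling gc(reset = TRUE) before a run resets the "max
used" columns, which makes the true peak of each call easier to see.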
hth
Drew
On Thu, Apr 5, 2012 at 8:35 AM, Ramiro Barrantes <ramiro@precisionbioassay.com> wrote:
> Dear list,
>
> I am trying to reclaim what I think is lost memory in R. I have been
> using gc() and rm(), and also Rprof to figure out where all the memory
> is going, but I might be missing something.
>
> I have the following situation:
>
> basic loop which calls memoryHogFunction:
>
> for (i in 1:N) {
>     dataset <- generateDataset(i)
>     fit <- try(memoryHogFunction(dataset, otherParameters))
> }
>
> and within
>
> memoryHogFunction <- function(dataset, params) {
>
>     fit <- try(nlme(someinitialValues))
>     ...
>     fit <- try(updatenlme(otherInitialValues))
>     ...
>     fit <- try(updatenlme(otherInitialValues))
>     ...
>     ret <- fit  # (and other things)
>     return(ret)
> }
>
> The problem is that memoryHogFunction uses a lot of memory and at the
> end returns a result (which is not big), but the memory used by the
> computation still seems to be occupied. The original loop continues,
> but the memory used by the program grows and grows after each call to
> memoryHogFunction.
>
> I have been trying to do gc() after each run in the loop, and have even
> done:
>
> in memoryHogFunction():
>
>     ...
>     ret <- fit  # (and other things)
>     rm(list = ls()[-match("ret", ls())])  # drop everything local except ret
>     return(ret)
> }
>
> ...but the memory still isn't released.
>
> A typical result from gc() after each loop iteration says:
>            used (Mb) gc trigger (Mb) max used (Mb)
> Ncells   326953 17.5     597831 32.0   597831 32.0
> Vcells  1645892 12.6    3048985 23.3  3048985 23.3
>
> This doesn't reflect the 340 MB (and 400+ MB of virtual memory) that
> are being used right now.
>
> Even when I do:
>
> print(sapply(ls(all.names=TRUE), function(x) object.size(get(x))))
>
> the largest object is 8179808 bytes, which is what it should be.
>
> The only thing that looked suspicious was the following from Rprof
> (with the memory=stats option); could the tot.duplications be a
> problem?
>
> index: "with":"with.default"
>  vsize.small  max.vsize.small      vsize.large  max.vsize.large
>        30841            63378            20642           660787
>        nodes        max.nodes     duplications tot.duplications
>      3446132          8115016            12395         61431787
>      samples
>         4956
>
> Any suggestions? Is it something about the use of loops in R? Could it
> maybe be the try() calls?
>
> Thanks in advance for any help,
>
> Ramiro
>
--
Drew Tyre
School of Natural Resources
University of Nebraska-Lincoln
416 Hardin Hall, East Campus
3310 Holdrege Street
Lincoln, NE 68583-0974
phone: +1 402 472 4054
fax: +1 402 472 2946
email: atyre2@unl.edu
http://snr.unl.edu/tyre
http://aminpractice.blogspot.com
http://www.flickr.com/photos/atiretoo