Dear R experts,
I have been trying to run an iterative procedure in R and am having
some sort of memory build-up problem. I am using R 1.8.0 on Windows XP.
A single iteration of my procedure is coded into a function. This
function creates an extremely large matrix of simulated values (it actually
calls WinBUGS, which returns the simulations), does some calculations with it,
and returns a single number as the result. After this one step I no longer
need the large matrix, but it seems to be kept in memory anyhow. The
code is something like:
parameter <- 0  # initial value
for (i in 1:1000)
{
    parameter <- one.step(parameter, data)
    mem <- memory.size()
    cat(parameter, " ", mem, "\n")
}
I output the memory.size() at each iteration and this grows and grows
until I run out of memory and get an allocation error. When this happens,
I record the last parameter value, quit R, start R again and rerun the
procedure starting with this most recent value. I'd rather not do it this
way! I have increased the memory limit using the memory.limit() function
and this helps a bit.
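To make the structure concrete, one.step() has roughly the following shape.
This is only a simplified sketch: the matrix of random numbers stands in for
the very large matrix of simulations that WinBUGS actually returns, and the
data argument is unused here, but as in the real code the big matrix is local
to the function and only a single number comes back:

one.step <- function(parameter, data)
{
    # Stand-in for the WinBUGS call; in the real code this is an extremely
    # large matrix of simulated values returned by WinBUGS.
    sims <- matrix(rnorm(1e6), ncol = 1000)
    # Some calculation on the simulations, collapsed to a single number;
    # this is the only thing that needs to survive the iteration.
    parameter + mean(sims)
}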
My Questions:
1. Is there any way to free the memory after each iteration since
I really don't need anything other than the most recent parameter value?
2. If I run this code on the same machine but using the Linux OS, will I
have the same problem?
3. Would I be able to avoid this problem if I ran the loop in some other
language like Perl or C and called the R function to do one iteration at
a time?
I have noticed several postings about this sort of thing in the archives
but I'm still a bit unclear. Any help is greatly appreciated.
Thanks,
Farouk
Farouk Nathoo <nathoo at cs.sfu.ca> writes:

> parameter <- 0  # initial value
> for (i in 1:1000)
> {
>     parameter <- one.step(parameter, data)
>     mem <- memory.size()
>     cat(parameter, " ", mem, "\n")
> }
>
> I output the memory.size() at each iteration and this grows and grows
> until I run out of memory and get an allocation error. When this happens,
> I record the last parameter value, quit R, start R again and rerun the
> procedure starting with this most recent value. I'd rather not do it this
> way! I have increased the memory limit using the memory.limit() function
> and this helps a bit.
>
> My Questions:
>
> 1. Is there any way to free the memory after each iteration since
> I really don't need anything other than the most recent parameter value?

That generally shouldn't be necessary, unless you're doing something in
one.step() that causes R objects to hang around after each iteration. A
typical mistake is to have an attach() inside the loop and end up with
1000 copies of the entire data set on the search path...

> 2. If I run this code on the same machine but using the Linux OS, will I
> have the same problem?

Most likely, yes.

> 3. Would I be able to avoid this problem if I ran the loop in some other
> language like Perl or C and called the R function to do one iteration at
> a time?

Probably not.

-- 
   O__  ---- Peter Dalgaard             Blegdamsvej 3
  c/ /'_ --- Dept. of Biostatistics     2200 Cph. N
 (*) \(*) -- University of Copenhagen   Denmark      Ph:  (+45) 35327918
~~~~~~~~~~ - (p.dalgaard at biostat.ku.dk)             FAX: (+45) 35327907
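For illustration, the attach() mistake described in the reply looks like the
first function below, and the second shows the usual fix of referring to the
data directly so that nothing is left on the search path. Both bodies are
made-up stand-ins rather than the actual one.step(), and the column y is
assumed purely for the example:

# Problematic: every call attaches another copy of `data` to the search
# path, so after 1000 iterations there are 1000 copies held in memory.
one.step.bad <- function(parameter, data)
{
    attach(data)                    # adds a copy of data to the search path
    result <- parameter + mean(y)   # y is found via the attached copy
    result                          # no matching detach(), so the copy stays
}

# Better: refer to the data directly (or use with()), so nothing is added
# to the search path and the large local objects can be garbage collected
# as soon as the function returns.
one.step.ok <- function(parameter, data)
{
    parameter + mean(data$y)
}

If an object really does have to be released by hand inside one.step(), rm()
on that object followed by gc() will free it, but as the reply notes that
should not normally be necessary.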