Hi listers,
In light of the recent discussion on optimizing memory use in strenuous
procedures, I present my problem and hope for some additional ideas.
I'm running a simulation in which each step uses quite an amount of memory
(but not excessively). Just to give you an idea: I create a pseudo
population (n=1000, m=3), run lme and lm models, multiply impute (M=5), and
do the pooling and final statistics calculation. Each simulation has 1000
cycles. I repeat the simulation a number of times for a set of different
initial conditions. If I run them in a batch of, say, 100 initial conditions,
the whole procedure tends to run slower with each additional condition.
I should add that not much information is stored: just the pooled results
over the 1000 cycles and the final calculations for each initial set.
I avoided for loops as much as I could; I have just two, one for the
cycles and one for the initial conditions, and everything else is vectorized.
What could I do to improve the procedure?
I am thinking of additionally vectorizing the last for loop (over the
initial conditions)... would it make a difference?
(1)
for (i in 1:100)
  sim[[i]] <- Simulation.Run(r[[i]], m1, m2, n1, n2, mdm, n.iter=1000)
## that is how it was done so far
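One common cause of a loop slowing down as it progresses is a result object
that grows on each assignment. It may be worth checking whether `sim` is
pre-allocated before the loop in (1); a sketch, assuming 100 conditions and
`Simulation.Run` as above:

```r
## Pre-allocate the result list once, so sim[[i]] <- ... never has to
## enlarge (and thus re-copy) the object inside the loop.
sim <- vector("list", 100)
for (i in 1:100)
  sim[[i]] <- Simulation.Run(r[[i]], m1, m2, n1, n2, mdm, n.iter = 1000)
```

If `sim` was already created at full length, this is not the bottleneck.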
(2)
for (i in 65:100) {
  sim[[i]] <- Simulation.Run(r[[i]], m1, m2, n1, n2, mdm, n.iter=1000)
  gc()
}
## this is what I would try, based on a suggestion for freeing memory
(3)
sim <- lapply(r[1:100], Simulation.Run, m1, m2, n1, n2, mdm, n.iter=1000)
## this is the vectorized version (note that r[[1:100]] is not valid R;
## `[[` takes a single index, so lapply over r[1:100] is used instead),
## but without the gc()??
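The ideas in (2) and (3) can be combined: replace the explicit loop with
`lapply` and keep the per-iteration cleanup. A sketch, assuming the same
objects (`r`, the model parameters, and `Simulation.Run`) as above:

```r
## lapply hides the loop over conditions and collects the results in a
## list; gc() is called after each run so memory from that run's
## intermediates is returned before the next condition starts.
sim <- lapply(r[1:100], function(r.i) {
  res <- Simulation.Run(r.i, m1, m2, n1, n2, mdm, n.iter = 1000)
  gc()  # free intermediate allocations from this run
  res
})
```

Note that `lapply` is a loop internally, so on its own it is unlikely to be
much faster than (1); its main benefit here is tidier code.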
From one source I got a suggestion to use recursion. Does anybody have
experience with recursion? I haven't done much with it. Would it speed up
the process?
My problem is pressing, since one set of initial conditions takes about 20
minutes on R 1.8.0 under Windows XP (Pentium 2.4 GHz, 512 MB RAM).
Thanks for the suggestions...
Andrej
_________
Andrej Kveder, M.A.
researcher
Institute of Medical Sciences SRS SASA; Novi trg 2, SI-1000 Ljubljana,
Slovenia
phone: +386 1 47 06 440 fax: +386 1 42 61 493