As part of a simulation, I need to sample from a large vector repeatedly. For some reason sample() builds up memory usage (> 500 MB for this example) when used inside a for loop, as illustrated here:

    X <- 1:100000
    P <- runif(100000)
    for (i in 1:500) Xsamp <- sample(X, 30000, replace=TRUE, prob=P)

Even worse, I am not able to free the memory without quitting R, so I quickly run out of memory when trying to perform the full simulation. Is there any way to avoid this? The problem seems to appear only when replace=TRUE is combined with probability weights for the vector being sampled, and it happens on both Windows XP and Linux (Ubuntu).

Victor
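One possible workaround to consider (a sketch only, not verified to sidestep the memory growth described above): draw the weighted indices yourself by inverse-CDF lookup with findInterval(), bypassing the prob= path of sample() entirely. The helper name weighted_sample below is hypothetical; X, P, and Xsamp are the names from the post.

```r
# Workaround sketch: weighted sampling with replacement via inverse-CDF
# lookup, instead of sample(..., replace=TRUE, prob=...).
# weighted_sample() is a hypothetical helper, not part of base R's API.
weighted_sample <- function(x, size, prob) {
  cdf <- cumsum(prob) / sum(prob)            # normalized cumulative weights
  idx <- findInterval(runif(size), cdf) + 1  # map uniforms to indices 1..n
  x[idx]
}

X <- 1:100000
P <- runif(100000)
for (i in 1:500) Xsamp <- weighted_sample(X, 30000, P)
```

Since cdf and idx are ordinary vectors created and discarded on each call, any memory they use should be reclaimable by the garbage collector in the usual way.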