Hi - I also posted this on r-sig-ecology to little fanfare, so I'm trying here.

I've recently hit an apparent R issue that I cannot resolve (or, really, understand). I am using the quantreg package (quantile regression) to fit a vector of quantiles to a dataset of approximately 200-400 observations. To accommodate some autocorrelation issues, I have to assess significance with randomization. The problem is that I consistently observe what appears to be a memory problem that crashes R.

The crash occurs within a local function I am using to (i) randomize the data and (ii) run quantile regression on the randomized dataset. It only seems to happen when I send rq() (the quantreg workhorse function) a vector of quantiles to fit. Even when I use the same random number seed, the crash occurs on different iterations of the simulation: sometimes before rq() is called within the local function, sometimes after rq() is called, and sometimes after control has returned to the main function. It does occur at approximately (but not exactly) the same iteration each time, though, which I cannot explain.

I consider this a fairly small dataset; others use quantreg with many thousands of points. And why does the crash occur at roughly the same iteration every run? That would suggest the memory problem is cumulative - but shouldn't any memory allocated within rq(...) be freed after the function returns?

This is occurring with R 2.10.1 on a 64-bit machine running OS X 10.6.2 with 6 GB of RAM.

Thanks!
~Dan Rabosky
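
P.S. In case it helps, here is a stripped-down sketch of the structure I'm describing. The data, the model, and the shuffling scheme below are placeholders rather than my actual analysis (my real randomization is more involved), but the shape of the loop - a local function that shuffles the data and passes a vector of taus to rq(), called repeatedly - is the same:

    library(quantreg)

    set.seed(123)    # even with a fixed seed, the crash iteration varies

    ## placeholder data, roughly the size of my real dataset
    n   <- 300
    dat <- data.frame(x = runif(n), y = rnorm(n))

    ## the vector of quantiles passed to rq() - this is what seems to
    ## trigger the problem for me
    taus <- seq(0.05, 0.95, by = 0.05)

    ## local function: (i) randomize the data, (ii) run quantile
    ## regression on the randomized dataset
    randFit <- function(dat, taus) {
        dat$y <- sample(dat$y)                    # shuffle the response
        fit   <- rq(y ~ x, tau = taus, data = dat)
        coef(fit)["x", ]                          # slope at each tau
    }

    ## randomization loop; R eventually crashes somewhere in here
    nullSlopes <- replicate(1000, randFit(dat, taus))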