You haven't even told us your OS (see the posting guide).
But the usual way is to have your OS set a memory limit for the process
(usually via your shell), and to run each fit under try() or tryCatch().
The OS will then stop R from allocating more than the limit, the current
task will fail with an error, and the loop can move on to the next dataset.
I would just caution that these OS facilities do not always work as
advertised. For example, the current man pages on Fedora 16 are not
actually up to date.
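
A minimal sketch of that pattern, assuming a bash shell on Linux and using
the placeholder names (createDataset, otherParameters, N) from the quoted
loop below:

  ## In the shell, before starting R, cap the address space for the process
  ## and its children (bash syntax; exact flags and behaviour vary by OS):
  ##   ulimit -v 2000000   # roughly 2 GB of virtual memory, in kilobytes
  ##   R --vanilla

  library(nlme)

  results <- vector("list", N)
  for (i in 1:N) {
    dataset <- createDataset(i)            # placeholder from the original post
    results[[i]] <- tryCatch(
      nlme(dataset, otherParameters),      # placeholder arguments as in the post
      error = function(e) {
        ## an allocation refused by the OS limit surfaces here as an
        ## ordinary R error ("cannot allocate vector of size ...")
        message("dataset ", i, " failed: ", conditionMessage(e))
        e                                  # store the condition and move on
      }
    )
  }

This only catches failures that R itself can report; if the kernel's OOM
killer intervenes first, no tryCatch() will help, which is why the limit has
to be imposed outside R.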
On Sun, 1 Apr 2012, Ramiro Barrantes wrote:
> Hello,
>
> I have a general question about how to "catch and stop" a function when it
> uses too much memory.
>
> The problem is that some datasets, when passed to nlme (a relatively old
> version), cause the nlme function to hang indefinitely and consume more and
> more memory (this afternoon one of those calls reached about 40GB!) without
> returning an answer. Other datasets work fine.
>
> I am trying to debug nlme by varying its parameters, but I have a general
> question in the interim. I have the following situation:
>
> for (i in 1:N) {
>   dataset <- createDataset(i)
>   try(nlme(dataset, otherParameters))
> }
>
> If one of those datasets starts using, say, more than 2GB of memory, I would
> like to just stop nlme, get an error, record it, and move on to the next
> dataset. Right now with some datasets nlme takes over the machine's memory
> and the system ends up killing the entire R process.
>
> Any suggestions appreciated.
>
> Thank you,
>
> Ramiro
>
> ______________________________________________
> R-help at r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide
> http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.
>
--
Brian D. Ripley, ripley at stats.ox.ac.uk
Professor of Applied Statistics, http://www.stats.ox.ac.uk/~ripley/
University of Oxford, Tel: +44 1865 272861 (self)
1 South Parks Road, +44 1865 272866 (PA)
Oxford OX1 3TG, UK Fax: +44 1865 272595