Lukas Rode
2008-Apr-02 15:06 UTC
[R] Stopping a function execution automatically after a given time
Dear all,

I often need to execute functions repeatedly (thousands of times or more). While doing so, I encounter two types of problems:

1.) In some models, the estimation process fails due to convergence problems.
2.) Some models will run forever.

My solution to #1 is to wrap tryCatch around the function call so that my script does not stop if one of the models raises an error. This works fine.

However, with regard to #2, I am lost. I would like to set a maximum time limit (say, 1 minute), and if my procedure is still running then, I would like to move on to the next model.

I guess a brute-force solution would be to start a new R process for each model from my R script and to kill the process using system commands after a given time. However, restarting the R interpreter each time sounds very inelegant. Are there any solutions that work within a single script? Or alternative suggestions?

Note that mostly these functions are not written by me and not in R code (like nlme, for example), so it is not feasible to adapt the function itself. Rather, it needs to be a wrapper around the function, similar to tryCatch.

Any help appreciated!

Kind regards,
Lukas
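[Editor's note: a minimal sketch of the tryCatch wrapper described for problem #1. fit_model is a hypothetical stand-in for any model-fitting call (e.g. an nlme fit); the error handler records NA so the loop continues.]

```r
## Hypothetical model fit that sometimes fails to converge.
fit_model <- function(x) {
  if (x < 0) stop("convergence failure")  # simulate an estimation error
  sqrt(x)
}

## Wrapper in the spirit of the post: catch the error, record NA, move on.
safe_fit <- function(x) {
  tryCatch(fit_model(x),
           error = function(e) NA)
}

results <- sapply(c(4, -1, 9), safe_fit)
## the failing fit yields NA; the script keeps running
```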
Bert Gunter
2008-Apr-02 15:16 UTC
[R] Stopping a function execution automatically after a given time
help.search("time")
?proc.time
?system.time
You can stick these in your code appropriately to keep track of elapsed
time. You could also count iterations, of course.
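[Editor's note: a minimal sketch of the elapsed-time bookkeeping suggested above, using proc.time(). Sys.sleep stands in for one model fit. Note this only checks between iterations; it cannot interrupt a single call that hangs.]

```r
## Track elapsed wall-clock time with proc.time() and stop after a budget.
start <- proc.time()["elapsed"]
budget <- 60                           # seconds allowed for the whole loop
for (i in 1:5) {
  Sys.sleep(0.1)                       # stand-in for fitting one model
  elapsed <- proc.time()["elapsed"] - start
  if (elapsed > budget) break          # move on once the budget is spent
}
```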
-- Bert Gunter
Genentech
______________________________________________
R-help at r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
mel
2008-Apr-02 15:23 UTC
[R] Stopping a function execution automatically after a given time
Lukas Rode wrote:
> However, with regard to #2, I am lost. I would like to set a maximum time
> limit (say, 1 minute) and if my procedure is still running then, I would
> like to move on to the next model.

begin_time <- Sys.time()
for (m in models) {
  ## ... fit model m ...
  if (difftime(Sys.time(), begin_time, units = "secs") > 60) break
}

a counter may also be enough
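[Editor's note: the check above can only run between iterations; it cannot cut short a single call that hangs. Later versions of R (2.13.0 and up, so after this 2008 thread) gained setTimeLimit(), which makes the running computation raise an error once the limit is exceeded; combined with tryCatch this gives a wrapper of the kind Lukas asked for. A minimal sketch, with slow_fit as a hypothetical long-running fit; note that compiled code which never reaches an interrupt point may still not be interruptible, and R.utils::withTimeout offers a packaged version of the same idea.]

```r
## Time-limited wrapper: evaluate expr, give up after `seconds` seconds.
with_time_limit <- function(expr, seconds) {
  setTimeLimit(elapsed = seconds, transient = TRUE)  # limit the next computation
  on.exit(setTimeLimit(elapsed = Inf), add = TRUE)   # always clear the limit
  tryCatch(expr, error = function(e) NA)             # the timeout surfaces as an error
}

slow_fit <- function() { Sys.sleep(5); "done" }      # hypothetical hanging fit
with_time_limit(slow_fit(), 1)                       # cut short; NA is returned
```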