
Say my executable is `c:\my directory\myfile.exe` and my R script calls this executable with `system("myfile.exe")`.

The R script passes parameters to the executable program, which uses them to do numerical calculations. From the output of the executable, the R script then tests whether the parameters are good or not. If they are not, the parameters are changed and the executable is rerun with the updated parameters.

Now, as this executable carries out mathematical calculations and solutions may converge only slowly, I wish to be able to kill the executable once it has taken too long to carry out the calculations (say 5 seconds).

How do I do this time-dependent kill?

PS: My question is a little related to this one (time-independent kill): how to run an executable file and then later kill or terminate the same process with R in Windows

Toby

2 Answers


You can add code to the R function which issues the executable call:

setTimeLimit(elapsed = 5, transient = TRUE)

This will kill the calling function, returning control to the parent environment (which could well be a function as well). Then use the examples in the question you linked to for further work.
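
For example, here is a minimal sketch of that idea, where run_model() is a hypothetical wrapper around the executable call; note that the limit is only checked once R regains control, and, as the comments below point out, the executable itself may still need a taskkill:

run_model <- function(timeout = 5) {
  setTimeLimit(elapsed = timeout, transient = TRUE)       # limit the R-side evaluation
  on.exit(setTimeLimit(elapsed = Inf, transient = TRUE))  # clear the limit when the call exits
  system("myfile.exe")
}

res <- try(run_model(5), silent = TRUE)  # the "reached elapsed time limit" error is caught here
if (inherits(res, "try-error")) {
  # the R call was interrupted, but myfile.exe may still be running: kill it
  system('taskkill /im "myfile.exe" /f', show.output.on.console = FALSE)
}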

Alternatively, set up a loop which examines `Sys.time()`, and if the expected update to the parameter set has not taken place after 5 seconds, break the loop and issue the system kill command to terminate `myfile.exe`, as sketched below.
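
A rough sketch of that alternative, assuming the executable is launched without blocking R and, purely hypothetically, signals the update by writing a file called results.out:

system("myfile.exe", wait = FALSE)         # launch the executable without blocking R
start <- Sys.time()
repeat {
  if (file.exists("results.out")) break    # the expected update has arrived
  if (difftime(Sys.time(), start, units = "secs") > 5) {
    # took too long: kill the executable and move on
    system('taskkill /im "myfile.exe" /f', show.output.on.console = FALSE)
    break
  }
  Sys.sleep(0.1)                           # poll without busy-waiting
}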

Carl Witthoft
  • This is a valid answer, as far as I believe, but it brings up another problem: you have to wait for the time specified in `elapsed = 5` before you can carry on with the next piece of code. Only after the specified elapsed time does R cancel the connection with the executable calculation program (and its input/output files). And you need this disconnection for repeated executions. I have started to try on the MS-DOS side now. – Toby Jul 15 '13 at 12:07
  • Additional info: I just tried it with the following function from the R.utils package, `evalWithTimeout(runHYDRUS(), timeout = 0.1, onTimeout = "warning")`. I have the same problem: I cannot disconnect from the executable program, i.e. all input/output files are still "open". – Toby Jul 15 '13 at 12:25
  • @TobyElTejedor I thought you wanted to wait, so as to make sure you had valid updates before proceeding? What's the priority scheme you intend (i.e. R-code waiting for `myfile.exe` or not, or parallel R-tasks?). To your second comment: timeouts inside R only stop R-functions. You need to "observe" the timeout one way or another and issue a system command to kill the executable. – Carl Witthoft Jul 15 '13 at 12:54
  • Well exactly, the calculation program is called anew after each iteration. You have to have a killed process before each new iteration of `myfile.exe`. But how? An example: if I have a calculation process which takes 5 seconds, I can have two cases, one allowing a longer and one a shorter calculation time with respect to those 5 seconds: a) `setTimeLimit(elapsed = 10, transient = TRUE)`, b) `setTimeLimit(elapsed = 1, transient = TRUE)`. In the first case, all goes well. In the second case I then have to "taskkill" `myfile.exe`, else I get an error on the next calculation. – Toby Jul 15 '13 at 13:03

There might be nicer ways, but this is a solution.

The assumption here is that `myfile.exe` successfully does its calculation within 5 seconds.

library(R.utils)  # provides evalWithTimeout()

try.wtl <- function(timeout = 5)
{
  # evalWithTimeout() returns NULL (with a warning) if the timeout is reached
  y <- evalWithTimeout(system("myfile.exe"), timeout = timeout, onTimeout = "warning")
  if (inherits(y, "try-error")) NA else y
}

Case 1 (`myfile.exe` is closed after successful calculation):

g <- try.wtl(5)

Case 2 (`myfile.exe` is not closed after successful calculation):

g <- try.wtl(0.1)

An MS-DOS `taskkill` is required in case 2 to recommence from the beginning:

if (is.null(g)) system('taskkill /im "myfile.exe" /f', show.output.on.console = FALSE)
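
For completeness, a sketch of how this might sit in the iteration described in the question; write_input(), read_output(), parameters_ok() and update_params() are hypothetical helpers for the executable's input/output files:

params <- c(1, 1)                        # some initial parameter guess
repeat {
  write_input(params)                    # hand the current parameters to the executable
  g <- try.wtl(5)                        # run it with a 5 second cap
  if (is.null(g)) {
    # timed out: kill myfile.exe so the next iteration starts from a clean state
    system('taskkill /im "myfile.exe" /f', show.output.on.console = FALSE)
  } else {
    result <- read_output()              # read the executable's output files
    if (parameters_ok(result)) break     # parameters are good: done
  }
  params <- update_params(params)        # otherwise adjust and rerun
}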

PS: inspiration came from Time out an R command via something like try()

Toby