
I have a question which is related to lots of others, but has some key differences. I have a PHP script which needs to do some general work (delete a tuple from a database) every 15 minutes (900 seconds), over and over. So I used set_time_limit(0) to keep PHP from timing out. The application works perfectly for the first iteration, then gets killed because the server does not let a script run for more than 1000 seconds.

I conceived of a workaround, but am not sure how to actually write it. I need one script that does very little other than sleep for 900 seconds - that's no problem. Then it needs to start the work script and exit. The work script does its thing (a few seconds), then starts the sleep script again and exits. That way, no single script runs for more than 1000 seconds in any one iteration.

I've tried using includes, headers, and some other stuff, but so far am having no luck. Given how important this app is to my job, I'd really appreciate assistance - any ideas?

2 Answers


Did you know that you can run a PHP script from the command line, with no browser waiting for a page? If you run the script there instead of as CGI, it won't be subject to that 1000-second limit, and you can even have cron call it to do its work every 15 minutes.
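For example, a crontab entry along these lines would invoke the script every 15 minutes via the PHP CLI (the binary and script paths here are hypothetical; adjust them to your setup):

```
# Run the database-cleanup script every 15 minutes via the PHP CLI.
# Field order: minute hour day-of-month month day-of-week command
*/15 * * * * /usr/bin/php /path/to/cleanup.php
```

With cron driving the schedule, the script itself only needs to do its few seconds of database work and then exit, so no execution-time limit ever comes into play.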

Dan D.

Alright - figured it out myself. Here is some sample code:

chaintest.php:

<html>
<body onload="document.getElementById('form1').submit()">
<form id="form1" action="chaintest2.php" method="post">
</form>
<?php
print "Begun";
sleep(5); // 5 seconds for testing; use sleep(900) for the real 15-minute interval
?>
</body>
</html>

chaintest2.php:

<html>
<body onload="document.getElementById('form1').submit()">
<form id="form1" action="chaintest.php" method="post">
</form>
<?php
print "PHP executed";
sleep(5); // again shortened for testing
?>
</body>
</html>

The two pages just keep bouncing back and forth (printing "Begun" and "PHP executed"): each one does its work, and the onload handler submits the form to the other, so a fresh request starts before the current script ends. Neither page ever runs long enough to hit the PHP or server time limit.