
I'm building a website for my office work which dynamically loads tables from a long PHP script. To do that, I've created a PHP file that handles the AJAX GET requests and echoes the JSON produced by the long script.
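For context, the endpoint is roughly a thin wrapper like the sketch below (simplified; the file name, the `report` parameter and `build_tables()` are placeholders, not my real code):

```php
<?php
// ajax_handler.php - simplified sketch of the endpoint (names are placeholders).
// It reads the GET parameters sent by the AJAX call, runs the long processing
// script and echoes the result as JSON.
header('Content-Type: application/json');

require __DIR__ . '/long_script.php'; // placeholder for the long PHP script

$report = isset($_GET['report']) ? $_GET['report'] : 'default'; // hypothetical parameter

// build_tables() stands in for the long processing described below
$data = build_tables($report);

echo json_encode($data);
```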

When I run my code on my local machine with a WAMP server, everything works fine; of course I had to add the line `ini_set('max_execution_time', 5000);` to avoid the timeout issue.
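Just to be explicit about what that workaround looks like (a two-line sketch; as far as I know `set_time_limit()` is the equivalent function-call form):

```php
<?php
// Raise PHP's own execution limit for this request (value in seconds).
ini_set('max_execution_time', 5000);
// set_time_limit(5000); // equivalent function-call form
```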

My problem is that when I deploy the project on the real virtual machine (FreeBSD environment), it doesn't work: after 600 seconds the server kills my request (`Fatal error: Uncaught exception 'HttpException' with message 'An error occurred : [503]'`).
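As far as I understand, `max_execution_time` only covers PHP itself, so a 503 after exactly 600 seconds looks like the web server or the FastCGI layer in front of PHP giving up first. I am not sure how PHP is wired into Apache on that VM, but these are the kinds of directives I suspect (values in seconds; 1200 is only an example):

```apache
# httpd.conf - plain Apache / mod_php
Timeout 1200

# only relevant if PHP runs behind mod_proxy_fcgi
ProxyTimeout 1200
```

If PHP-FPM is used instead, the pool setting `request_terminate_timeout` would be the equivalent knob to check.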

However, I noticed that if I run the PHP file directly from the command line, everything works fine and the script echoes my JSON.

The script itself essentially consists of `foreach` processing loops and ends with an `echo` of the resulting JSON.

So my question is: how can I handle this?

I was thinking about doing something like this:

  • call script.php from the AJAX handler through the command line instead of through the web server (see the sketch after this list)
  • split script.php into several smaller requests to avoid the timeout
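For the first idea, I imagine something like the sketch below (completely untested; the file name, the `/tmp` path and the result-file convention are all made up for the example). The AJAX endpoint would only launch script.php with the CLI php binary in the background, so the web server's timeout no longer applies, and return immediately:

```php
<?php
// start_job.php - sketch of idea 1 (untested, names and paths are placeholders).
// Launch the long script with the CLI php binary, detached in the background,
// and tell the browser which file to poll for the result.
$out = '/tmp/result_' . uniqid() . '.json';                  // hypothetical result file
$cmd = 'php ' . escapeshellarg(__DIR__ . '/script.php')
     . ' > ' . escapeshellarg($out) . ' 2>&1 &';             // '&' detaches the process
exec($cmd);

header('Content-Type: application/json');
echo json_encode(array('result_file' => basename($out)));    // returned immediately
```

The page would then poll a second small endpoint that simply checks whether the result file exists yet and, once it does, reads it back as the JSON response.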

Below is a schema of the current process:

[diagram: process]

Peacefull
  • Is a user really going to wait this long? – Dagon Nov 09 '16 at 10:13
  • Try this: http://stackoverflow.com/questions/9629566/how-to-increase-apache-timeout-directive-in-htaccess – Gaurav Srivastava Nov 09 '16 at 10:14
  • @Dagon Yes, the user will wait for the final result, because it is a website for my office work. @GauravSrivastava I already tried `max_execution_time` without success, but I will test the Apache method, thanks. – Peacefull Nov 09 '16 at 11:48
  • @GauravSrivastava Well, I tried adding `Timeout` to the httpd.conf file but the result is the same :\ – Peacefull Nov 09 '16 at 13:19

0 Answers