
Situation: I have a PHP script that runs a lot of requests against the Google PageSpeed API, and these requests take a while. The script runs perfectly in my local development environment, where no max_execution_time is set. Now I want to run it on a production server where max_execution_time is fixed at 30 s and cannot be changed. How can I achieve this? Is it really so hard to implement such a function?

I'd be glad if you could give me some ideas, maybe with a little bit of code.

  • Try to split the task into smaller chunks, e.g. if you have 100 queries to execute, change the script to execute one query and then add a self-refresh header so that it can continue with the next query in another request. – Capital C Sep 06 '16 at 16:50
  • Isn't it better to use Ajax for something like that? – Jan Andrè Schlösser Sep 06 '16 at 18:10
  • @JanAndrèSchlösser Why do you propose that you should use AJAX? Is there going to be a user-interface or is this just a long-running task which you plan to run once a day? A scheduled CLI task is the BEST choice; are you allowed to set up a scheduled task on your web server? – MonkeyZeus Sep 06 '16 at 18:47
  • Actually there is a user interface, which should show the progress of the running script. Does anyone have an idea how to implement such an Ajax request? Maybe a code sample? Thanks! – Jan Andrè Schlösser Sep 06 '16 at 20:32
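The chunking idea from the first comment can be sketched as follows. This is a minimal, hypothetical example: `runPagespeedQuery()` is a stand-in for whatever does one API call, and the URL list is a placeholder. Each request processes only a small slice, then a `Refresh` header reloads the page at the next offset, so no single request comes near the 30 s limit.

```php
<?php
// Chunked processing with a self-refresh header (sketch).
// runPagespeedQuery() is a hypothetical stand-in for one API call.

function runPagespeedQuery(string $url): void {
    // ... call the Google PageSpeed API for $url and store the result ...
}

$urls      = ['https://example.com/', 'https://example.com/about']; // full work list
$chunkSize = 5;                                    // queries handled per request
$offset    = isset($_GET['offset']) ? (int) $_GET['offset'] : 0;

// Process only the current slice of the work list.
foreach (array_slice($urls, $offset, $chunkSize) as $url) {
    runPagespeedQuery($url);
}

$next = $offset + $chunkSize;
if ($next < count($urls)) {
    // Tell the browser to reload with the next offset after 1 second.
    header('Refresh: 1; URL=?offset=' . $next);
    echo 'Processed ' . $next . ' of ' . count($urls) . ' URLs…';
} else {
    echo 'All ' . count($urls) . ' URLs processed.';
}
```

Because the browser drives the loop, this also gives you a natural place to show progress in the UI between chunks.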

1 Answer


In the absence of any code, I'll assume you are using curl or file_get_contents. You could use the curl_multi_* functions, which allow you to send several HTTP requests at the same time and wait for them all to finish.
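A minimal curl_multi_* sketch, assuming the URLs below are placeholders for the real PageSpeed API endpoints: all requests are started in parallel and the bodies are collected once every transfer has finished.

```php
<?php
// Run several HTTP requests in parallel with curl_multi_* (sketch).

$urls = [
    'https://www.example.com/',   // placeholder; use real API URLs
    'https://www.example.org/',
];

$multi   = curl_multi_init();
$handles = [];

foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return body instead of printing it
    curl_setopt($ch, CURLOPT_TIMEOUT, 20);          // per-request timeout in seconds
    curl_multi_add_handle($multi, $ch);
    $handles[$url] = $ch;
}

// Drive all transfers until none are still running.
do {
    $status = curl_multi_exec($multi, $running);
    if ($running) {
        curl_multi_select($multi); // wait for socket activity instead of busy-looping
    }
} while ($running && $status === CURLM_OK);

// Collect results and clean up.
$results = [];
foreach ($handles as $url => $ch) {
    $results[$url] = curl_multi_getcontent($ch);
    curl_multi_remove_handle($multi, $ch);
    curl_close($ch);
}
curl_multi_close($multi);
```

Note that parallelism shortens the total wall-clock time, but a very large batch may still exceed 30 s, so you may need to combine this with chunking.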

I have used it several times and it greatly improved the execution time.

Hope this helps.

vincenth