I have a PHP script that does a GET request to an API and, based on the results, has to fetch over 1000 images.

My problem is that in the process of doing this I always get a timeout.

What would be the best approach for this?

I've tried looping over the response and fetching the files in smaller batches, but I'm still getting the timeout.

Should I store the progress, and do even smaller batches?

1 Answer

PHP has a configuration directive called max_execution_time, which limits how long a script may run; you should take a look at that first.
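If the host allows it, the limit can also be lifted at runtime from the top of the script. A minimal sketch, assuming the directive isn't locked by the server (which, per the comment below, is not always the case):

```php
<?php
// Remove the execution time limit for this request only (0 = unlimited).
// This does not change the server-wide setting, and on hosts that lock
// the directive the call simply has no effect.
set_time_limit(0); // equivalent to ini_set('max_execution_time', '0')

echo ini_get('max_execution_time'); // verify what actually took effect
```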

If you want to divide the work into separate small processes, try using a queue system like RabbitMQ. Your main process would publish the tasks to RabbitMQ, and one or more workers would listen on the queue and process each item.
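A minimal sketch of that setup using the php-amqplib library (an assumption on my part; any AMQP client works). The queue name image_downloads, the localhost credentials, and the save path are all placeholders:

```php
<?php
// producer.php — publish one download task per image URL
require __DIR__ . '/vendor/autoload.php'; // composer require php-amqplib/php-amqplib

use PhpAmqpLib\Connection\AMQPStreamConnection;
use PhpAmqpLib\Message\AMQPMessage;

$connection = new AMQPStreamConnection('localhost', 5672, 'guest', 'guest');
$channel = $connection->channel();

// Durable queue: pending downloads survive a broker restart.
$channel->queue_declare('image_downloads', false, true, false, false);

// $imageUrls would come from your API response.
foreach ($imageUrls as $url) {
    $msg = new AMQPMessage($url, ['delivery_mode' => AMQPMessage::DELIVERY_MODE_PERSISTENT]);
    $channel->basic_publish($msg, '', 'image_downloads');
}

$channel->close();
$connection->close();
```

Each worker then fetches one URL at a time and acknowledges the message only after the file is saved, so if a worker crashes the task is simply redelivered:

```php
<?php
// worker.php — run one or more copies of this from the CLI
require __DIR__ . '/vendor/autoload.php';

use PhpAmqpLib\Connection\AMQPStreamConnection;

$connection = new AMQPStreamConnection('localhost', 5672, 'guest', 'guest');
$channel = $connection->channel();
$channel->queue_declare('image_downloads', false, true, false, false);

// Hand this worker only one unacknowledged message at a time.
$channel->basic_qos(null, 1, null);

$channel->basic_consume('image_downloads', '', false, false, false, false, function ($msg) {
    $url = $msg->body;
    file_put_contents('images/' . basename($url), file_get_contents($url));
    $msg->ack(); // acknowledge only after the file is written
});

while ($channel->is_consuming()) {
    $channel->wait();
}
```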

solarc
  • I did check max_execution_time; the problem is that on my local machine I have control over that, but on the production server I'm not allowed to change it. Gonna check RabbitMQ, thanks. – José Moreira Jul 12 '18 at 08:40