
Currently I'm using curl_multi to execute PHP files simultaneously on my own server. It doesn't seem to be a very efficient approach, as the server gets overloaded when 200+ scripts need to be executed at the same time. I need to send variables to each PHP script, which I currently do by putting all the required variables in the POST request.
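Roughly, this is what I'm doing now (a simplified sketch; task.php and the $tasks array stand in for my real scripts and data):

```php
$mh = curl_multi_init();
$handles = [];
foreach ($tasks as $vars) {                   // $tasks: one set of variables per script run
    $ch = curl_init('http://localhost/task.php');
    curl_setopt($ch, CURLOPT_POST, true);
    curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query($vars));
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_multi_add_handle($mh, $ch);
    $handles[] = $ch;
}
do {                                          // drive all requests until they finish
    curl_multi_exec($mh, $running);
    curl_multi_select($mh);
} while ($running > 0);
foreach ($handles as $ch) {
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);
```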

I know that another option would be Gearman, but I'd have to install it, and I prefer not to install too many applications, although I have no idea whether it is a heavy program or not. I also don't know whether it would be more efficient for the server.

Would there also be an option to use a Linux shell command from within PHP?
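Something like this is what I have in mind (just a rough sketch; worker.php and the variables are placeholders):

```php
// launch a PHP script in the background and pass it variables as one JSON argument
$vars = ['id' => 42, 'mode' => 'import'];
$arg  = escapeshellarg(json_encode($vars));

// "> /dev/null 2>&1 &" detaches the process so this script does not wait for it
exec("php /path/to/worker.php {$arg} > /dev/null 2>&1 &");

// inside worker.php the variables would be read back like this:
// $vars = json_decode($argv[1], true);
```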

I have a Linux server with a standard configuration, running PHP and MySQL.

So my question is:

What would be the most efficient way to execute many PHP scripts simultaneously, while also being able to send a set of variables to each script?

BastiaanWW

1 Answer


Using Gearman is a good way, to be honest... I've used it many times to reliably marshal multiple worker processes in a queue.

Using Gearman you can then send data (via some database table is usually the best approach) that your workers can then fetch and use. Hint: you can store it as JSON or something similar if it's a complex and arbitrary set of variables.
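Something along these lines (a rough sketch only, assuming the pecl Gearman extension is installed and a hypothetical job_data table; in practice the client and worker would live in separate files):

```php
// --- client side: store the variables in the DB, then queue a background job ---
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$pdo->prepare('INSERT INTO job_data (payload) VALUES (?)')
    ->execute([json_encode(['id' => 42, 'mode' => 'import'])]);
$rowId = $pdo->lastInsertId();

$client = new GearmanClient();
$client->addServer();                          // defaults to 127.0.0.1:4730
$client->doBackground('process_task', $rowId); // only the row id travels through Gearman

// --- worker side (a long-running CLI script) ---
$worker = new GearmanWorker();
$worker->addServer();
$worker->addFunction('process_task', function (GearmanJob $job) use ($pdo) {
    $stmt = $pdo->prepare('SELECT payload FROM job_data WHERE id = ?');
    $stmt->execute([$job->workload()]);
    $vars = json_decode($stmt->fetchColumn(), true);
    // ... do the actual work with $vars here ...
});
while ($worker->work());                       // block and process jobs as they arrive
```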

Another approach is to write your own queuing mechanism... I have also done this, based on this excellent article: http://squirrelshaterobots.com/programming/php/building-a-queue-server-in-php-part-4-run-as-a-background-daemon-a-k-a-forking/
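The core building block there is pcntl_fork; very roughly, the forking part looks like this (a sketch only, assuming the pcntl extension on the CLI; do_work() is a placeholder for your own code, and the article covers the full daemon/queue handling):

```php
$jobs = [ ['id' => 1], ['id' => 2] /* ... one entry per script run ... */ ];
$maxChildren = 10;                 // cap concurrency so the server isn't overloaded
$running = 0;

foreach ($jobs as $vars) {
    if ($running >= $maxChildren) {
        pcntl_wait($status);       // wait for one child to finish before forking more
        $running--;
    }
    $pid = pcntl_fork();
    if ($pid === -1) {
        die('fork failed');
    }
    if ($pid === 0) {
        // child process: it gets its own copy of $vars, does the work, then exits
        do_work($vars);
        exit(0);
    }
    $running++;                    // parent process: just track how many children run
}
while ($running-- > 0) {
    pcntl_wait($status);           // reap the remaining children
}
```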

Brian
  • Not Gearman directly... Store a JSON string in DB, then the executed scripts can retrieve and parse it back to an array of variables/objects. – Brian Aug 07 '12 at 15:57
  • Bah :) typing too quickly as usual - that's what we get for answering on SO in-between working :) – Brian Aug 07 '12 at 15:59