I'm looking for a queuing system that could support the following scenario:
- A client adds a job: check how many Facebook likes a particular URL (URL1) has;
- A client adds another job: check the same information for URL2;
[....]
A worker picks up anywhere from 1 to 50 jobs (URLs) from the queue (e.g., if there are only 5, it picks up 5; if there are 60, it picks up 50 and leaves the rest for another worker) and issues a single request against the Facebook API (which accepts multiple URLs per request). If the request succeeds, all of those jobs are removed from the queue; if it fails, all of them stay.
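To make it concrete, here's roughly the worker loop I have in mind, sketched against Beanstalkd with the Pheanstalk client. There's no native batch-reserve command, so this just reserves jobs one at a time up to the limit; `checkFacebookLikes()` is a hypothetical helper, and method names may vary between Pheanstalk versions:

```php
<?php
// Sketch only: reserve up to BATCH_SIZE jobs, make one batched API call,
// then ack or re-queue the whole batch together.

use Pheanstalk\Pheanstalk;

const BATCH_SIZE = 50;

$pheanstalk = Pheanstalk::create('127.0.0.1');
$pheanstalk->watch('fb-likes');

$jobs = [];
// Block waiting for the first job, then grab whatever else is
// immediately available (timeout 0) until the batch is full.
while (count($jobs) < BATCH_SIZE) {
    $timeout = count($jobs) === 0 ? 60 : 0;
    $job = $pheanstalk->reserveWithTimeout($timeout);
    if ($job === null) {
        break; // queue drained, or nothing arrived within the timeout
    }
    $jobs[] = $job;
}

if (count($jobs) > 0) {
    $urls = array_map(fn ($job) => $job->getData(), $jobs);

    // checkFacebookLikes() is a hypothetical helper that issues one
    // multi-URL request against the Facebook API and returns true/false.
    if (checkFacebookLikes($urls)) {
        foreach ($jobs as $job) {
            $pheanstalk->delete($job);   // success: remove every job
        }
    } else {
        foreach ($jobs as $job) {
            $pheanstalk->release($job);  // failure: put them all back
        }
    }
}
```

The catch is that this is all client-side glue: each job is still reserved individually, so the "all succeed or all stay" behaviour only holds as long as the worker itself doesn't crash mid-batch.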
I'm working with PHP, and I've looked into Gearman and Beanstalkd but found no such functionality in either. Is there a (free) queuing system that supports this kind of "batch dequeuing"?
Or maybe someone could suggest an alternative way of dealing with this? I've considered keeping a list of "to check" URLs outside the queuing system and then processing them in bundles of at most N items with a cron job that runs every X period (roughly like the sketch below). But that amounts to building my own queue, which defeats the whole purpose, doesn't it?
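For reference, a minimal sketch of that cron workaround. The `urls_to_check` table, the `checkFacebookLikes()` helper, and the connection details are all hypothetical:

```php
<?php
// Cron script, run every minute: claim at most 50 pending URLs,
// check them in one batched API call, then delete or release them.

const BATCH_SIZE = 50;

$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// Claim a batch inside a transaction so overlapping cron runs
// don't pick up the same rows (requires InnoDB for FOR UPDATE).
$pdo->beginTransaction();
$rows = $pdo->query(
    'SELECT id, url FROM urls_to_check
     WHERE claimed_at IS NULL
     ORDER BY id
     LIMIT ' . BATCH_SIZE . '
     FOR UPDATE'
)->fetchAll(PDO::FETCH_ASSOC);

$ids = array_column($rows, 'id');
if ($ids) {
    $in = implode(',', array_fill(0, count($ids), '?'));
    $pdo->prepare("UPDATE urls_to_check SET claimed_at = NOW() WHERE id IN ($in)")
        ->execute($ids);
}
$pdo->commit();

if ($ids) {
    $urls = array_column($rows, 'url');
    if (checkFacebookLikes($urls)) {   // hypothetical helper, as above
        $pdo->prepare("DELETE FROM urls_to_check WHERE id IN ($in)")
            ->execute($ids);
    } else {
        // Failure: release the claim so the next run retries the batch.
        $pdo->prepare("UPDATE urls_to_check SET claimed_at = NULL WHERE id IN ($in)")
            ->execute($ids);
    }
}
```

It works, but at that point I'm re-implementing reserve/delete/release semantics on top of a database table, which is exactly what I was hoping a queuing system would give me.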