
For example: I have a task named "URLDownload"; the task's function is to download a large file from the internet. I have one Worker Process running, but about 1000 files to download. It is easy for a Client Process to create 1000 tasks and send them to the Gearman Server.
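For reference, something along these lines is what I have in mind for the client side (a sketch using the python-gearman package; the server address and URLs are placeholders):

```python
# client.py - submit many download jobs as background tasks
# (sketch; assumes the python-gearman package and a server on localhost:4730).
import gearman

GEARMAN_SERVERS = ['localhost:4730']                                # assumed server address
urls = ['http://example.com/file%d.bin' % i for i in range(1000)]   # placeholder URLs

client = gearman.GearmanClient(GEARMAN_SERVERS)
for url in urls:
    # background=True queues the job without waiting for a worker to finish it
    client.submit_job('URLDownload', url, background=True)
```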

My question is: will the Worker Process do the tasks one by one, or will it accept multiple tasks at a time?

If the Worker Process can accept multiple tasks, how can I limit the task-pool size in the Worker Process?

– truease.com
  • Take a look at my question (very similar) – basically, it looks like there's no built-in way to do it, but you can achieve it with beanstalkd if needed: http://stackoverflow.com/questions/9550574/a-queuing-system-which-supports-job-batching-e-g-several-jobs-for-1-worker-at – Aurimas Apr 13 '12 at 11:06

1 Answer


Workers process one request at a time. You have a few options:

1) You can run multiple workers (this is the most common method). Workers sit in poll() when they aren't processing, so this model works pretty well.
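A minimal worker along those lines might look like this (a sketch assuming the python-gearman package; the server address, task name, and download directory are assumptions). Each copy of the script handles one download at a time, so starting N copies caps concurrency at N:

```python
# worker.py - a single-job-at-a-time Gearman worker
# (sketch; assumes Python 2 and the python-gearman package).
import os
import urllib
import gearman

GEARMAN_SERVERS = ['localhost:4730']   # assumed server address
DOWNLOAD_DIR = '/tmp/downloads'        # assumed destination directory

def url_download(worker, job):
    # job.data is expected to carry the URL submitted by the client.
    url = job.data
    local_path = os.path.join(DOWNLOAD_DIR, os.path.basename(url) or 'index.html')
    urllib.urlretrieve(url, local_path)
    return local_path                  # returned to the client as the job result

if __name__ == '__main__':
    worker = gearman.GearmanWorker(GEARMAN_SERVERS)
    worker.register_task('URLDownload', url_download)
    worker.work()                      # blocks, taking one job at a time from the server
```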

2) Write a fork() implementation around the worker. This way you can fire up a set number of worker processes, but don't have to monitor multiple processes.
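A rough sketch of that fork() wrapper, again assuming the python-gearman package and the url_download handler from the previous sketch; POOL_SIZE is the task-pool-size limit asked about in the question:

```python
# pool.py - fork a fixed number of Gearman worker processes from one parent
# (sketch; Unix-only, assumes the python-gearman package).
import os
import sys
import gearman

POOL_SIZE = 4                          # the task-pool-size limit
GEARMAN_SERVERS = ['localhost:4730']   # assumed server address

def url_download(worker, job):
    # ... same download handler as in the previous sketch ...
    return job.data

def run_worker():
    worker = gearman.GearmanWorker(GEARMAN_SERVERS)
    worker.register_task('URLDownload', url_download)
    worker.work()

if __name__ == '__main__':
    for _ in range(POOL_SIZE):
        if os.fork() == 0:             # child process: become a worker and never return
            run_worker()
            sys.exit(0)
    # parent: wait on the children so the pool size stays fixed
    while True:
        try:
            os.wait()
        except OSError:
            break
```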

– Brian Aker