I have long-running tasks on my live server. The tasks involve fetching data from Facebook and generating PDFs with the ReportLab package.
For these I have 3 workers with a concurrency level of 10 each, so I can execute up to 30 PDF tasks in parallel.
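Roughly, each task looks like this (a simplified sketch -- the endpoint, file path, and function name are placeholders, not my real code):

from celery.task import task
import requests
from reportlab.pdfgen import canvas

@task
def generate_pdf(user_id, access_token):
    # Fetch data from the Facebook Graph API (placeholder endpoint).
    resp = requests.get(
        "https://graph.facebook.com/me/posts",
        params={"access_token": access_token},
    )
    posts = resp.json().get("data", [])

    # Render one line per post into a PDF with ReportLab
    # (pagination omitted for brevity).
    pdf = canvas.Canvas("/tmp/report_%s.pdf" % user_id)
    y = 800
    for post in posts:
        pdf.drawString(50, y, post.get("message", "")[:100])
        y -= 20
    pdf.save()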
But when 10 tasks are running at a time, one long-running task causes the others to hit their hard time limit (12 hours) and expire.
On my server a single PDF task takes 3 hours at most, or 4 hours in the worst case. Yet when I run all workers at this concurrency level, some of the tasks succeed while others exceed the 12-hour time limit. My target is to complete all 10 tasks within 4 or 5 hours.
Is there a good way to handle long-running tasks like these?
I am using the django-celery package.
My Celery config:
CELERYD_OPTS="--time-limit=43200 --concurrency=10"
CELERYD_CONCURRENCY = 10
CELERYD_NODES = "worker1 worker2 worker3"
Running the workers:
python manage.py celeryd_multi restart n1 n2 n3 -l info -f celery.log -c 10 --purge -Q:n1,n2,n3 backend
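For reference, the related settings.py entries (a sketch: the task path is a placeholder, and the soft-limit line is illustrative rather than something I currently set):

# settings.py -- old-style setting names used by django-celery (Celery 3.x).
CELERY_ROUTES = {
    # Task path is a placeholder for my real task module.
    "myapp.tasks.generate_pdf": {"queue": "backend"},
}
CELERYD_TASK_TIME_LIMIT = 43200        # 12 h hard limit, matching CELERYD_OPTS above
CELERYD_TASK_SOFT_TIME_LIMIT = 39600   # illustrative 11 h soft limit to allow cleanup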