How can I limit the number of parallel workers differently for different Celery task groups?
Here is a good reference on how to group Celery tasks together: https://sayari3.com/articles/18-chains-groups-and-chords-in-celery/
I need to do this dynamically in a script. Is there a way to do that? For example, in the following script I want to limit job1 to 10 parallel workers and job2 to 5 parallel workers.
# tasks.py
from celery import Celery, group

app = Celery('tasks', broker='redis://localhost:6379/0')  # broker URL is just an example

@app.task
def task1(param):
    ...  # do some work

@app.task
def task2(param):
    ...  # do some work

job1 = group([task1.s(param) for param in params])
job2 = group([task2.s(param) for param in params])

job1.apply_async()  # should use at most 10 parallel workers
job2.apply_async()  # should use at most 5 parallel workers
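One approach I have considered is routing each group to its own queue and starting a dedicated worker per queue with a fixed --concurrency. Here is a minimal sketch of that idea (the queue names q_job1 and q_job2 are just placeholders I made up):

# Route each task to its own queue (assumed queue names)
app.conf.task_routes = {
    'tasks.task1': {'queue': 'q_job1'},
    'tasks.task2': {'queue': 'q_job2'},
}

# Start one worker per queue with a hard concurrency cap, e.g.:
#   celery -A tasks worker -Q q_job1 --concurrency=10 -n worker_job1@%h
#   celery -A tasks worker -Q q_job2 --concurrency=5  -n worker_job2@%h

# The groups can also target the queues explicitly at call time:
job1.apply_async(queue='q_job1')  # capped at 10 by that worker's --concurrency
job2.apply_async(queue='q_job2')  # capped at 5

However, this fixes the concurrency when the workers start rather than setting it dynamically from the script, which is what I am really after.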