I'm having a serious problem with task publish speed and am currently debugging it; I suspect it's a problem with my Celery settings.
I'm using Django, Celery, and RabbitMQ. I have a simple task that sends emails. The task works, but it's very slow.
For example, when I publish 10,000 simple `test_print` tasks in a loop, I can't publish more than 23-24 tasks/s; before, it would easily publish 1,000+/s. This happens with completely EVERY task I have in the system. It may have started when I moved my code to djcelery (wrapping the Celery project with Django code), or something may have changed on the server (less likely). Here are my settings; the broker is a RabbitMQ server.
Settings:
CELERY_BROKER_POOL_LIMIT = 0
CELERY_CELERYD_PREFETCH_MULTIPLIER = 1
CELERY_BROKER_CONNECTION_TIMEOUT = 20
CELERY_BROKER_CONNECTION_RETRY = True
CELERY_BROKER_CONNECTION_MAX_RETRIES = 100
CELERY_BROKER_HEARTBEAT = 10
CELERY_TASK_SEND_SENT_EVENT = True
CELERY_CELERYD_SEND_EVENTS = True
CELERY_RESULT_BACKEND = 'rpc://'
CELERY_CELERYD_MAX_TASKS_PER_CHILD = 500
CELERY_ACCEPT_CONTENT = ['json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TASK_ROUTES = {
    'parse_x.*': {
        'queue': 'parse_x',
        'routing_key': 'parse_x',
    },
    ...
}
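For context, the app picks these `CELERY_`-prefixed Django settings up the standard way (a minimal sketch; the project name `proj` is a placeholder):

```python
import os

from celery import Celery

# "proj" is a placeholder for the real Django project name.
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "proj.settings")

app = Celery("proj")
# The CELERY_ prefix on the settings above matches this namespace.
app.config_from_object("django.conf:settings", namespace="CELERY")
app.autodiscover_tasks()
```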
tasks:
import time

from celery import shared_task


@shared_task(name="solicitor_tracker.test_print")
def test_print(i):
    print(i)
    time.sleep(0.1)


@shared_task(name="setup_queue.test_print_task_setup_queue", acks_late=False,
             autoretry_for=(Exception,), retry_backoff=True)
def redirection_check_setup_queue():
    for i in range(100000):
        test_print.apply_async([i], queue="lawsociety_parse")
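This is how I measure the publish rate I quoted above (a minimal sketch; `fake_publish` is a hypothetical stand-in for `test_print.apply_async`, so the snippet runs without a broker):

```python
import time


def measure_publish_rate(publish, n=10000):
    """Call `publish` n times and return the rate in calls per second."""
    start = time.perf_counter()
    for i in range(n):
        publish(i)
    elapsed = time.perf_counter() - start
    return n / elapsed


# Stand-in so the sketch runs without RabbitMQ; in the real measurement
# this would be: lambda i: test_print.apply_async([i], queue="lawsociety_parse")
def fake_publish(i):
    pass


print(f"{measure_publish_rate(fake_publish):.0f} publishes/s")
```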