I have a celery task as follows:
import time

@celery_app_site24x7.task(queue='site24x7')
def createWebsiteMonitoring(**kwargs):
    """Celery task to create or update website monitoring."""
    time.sleep(100)
    site24x7Instance = Business_api.thirdpartyFactory.instantiate("site24x7")
    # site24x7Instance.login()
    return site24x7Instance.createWebsiteMonitoring(**kwargs)
The issue is that the same task is being picked up by multiple fork-pool workers:
+++++++
[2019-01-23 05:40:59,674: INFO/ForkPoolWorker-3] Task celery_tasks.site24x7.createWebsiteMonitoring[a6eeff3d-fa01-4f2e-9921-849435c9b902] succeeded in 128.61932249739766s: '279832000003876475'
[2019-01-23 05:49:32,565: INFO/ForkPoolWorker-2] Task celery_tasks.site24x7.createWebsiteMonitoring[a6eeff3d-fa01-4f2e-9921-849435c9b902] succeeded in 127.42566008213907s: '279832000003877559'
+++++++
Notice that both log lines show the same task id (a6eeff3d-fa01-4f2e-9921-849435c9b902), so the same task is being run multiple times. I am also seeing errors in the worker logs:
redis.exceptions.ConnectionError: Error while reading from socket: ('Connection closed by server.',)
[2019-01-23 06:58:01,453: WARNING/MainProcess] Restoring 15 unacknowledged message(s)
[2019-01-23 06:58:01,641: INFO/MainProcess] Connected to redis://46.19.177.13:6379/10
And sometimes:
[2019-01-23 05:58:19,791: WARNING/MainProcess] consumer: Connection to broker lost. Trying to re-establish the connection...
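For context, my understanding is that with the Redis broker, any task message that stays unacknowledged longer than the transport's visibility timeout gets redelivered to another worker. Below is a sketch of where that would be configured; the value shown is the Redis transport default as I understand it, not something I have set explicitly:

```python
# celeryconfig.py -- sketch only, not my actual configuration.
# With the Redis broker, a task message left unacknowledged longer
# than this timeout is re-queued and may be picked up by another worker.
broker_transport_options = {
    'visibility_timeout': 3600,  # seconds (Redis transport default)
}
```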
Is there an issue with my code? Why is the same task being run twice?