I need the following workflow for my Celery tasks.

When taskA finishes successfully, I want to execute taskB.

I know there is the @task_success signal, but it only provides the task's result, and I need access to the arguments the previous task was called with. So I ended up with code like this:

@app.task
def taskA(arg):
    # not cool, but... https://github.com/celery/celery/issues/3797
    from shopify.tasks import taskA
    taskA(arg)


from celery.signals import task_postrun


@task_postrun.connect
def fetch_taskA_success_handler(sender=None, **kwargs):
    from gcp.tasks import taskB
    if kwargs.get('state') == 'SUCCESS':
        taskB.apply_async((kwargs.get('args')[0], ))
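
For comparison, this is roughly all a @task_success handler receives (a minimal sketch): the return value is there, but the arguments taskA was called with are not, which is why I went with task_postrun instead.

from celery.signals import task_success


@task_success.connect
def on_taskA_success(sender=None, result=None, **kwargs):
    # sender is the task object and result is its return value;
    # the arguments taskA was called with are not passed to this handler
    pass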

The problem is that taskB seems to be executed in an endless loop, many times over instead of only once.


1 Answer

This way it works correctly. The key difference is that the handler checks sender.name: task_postrun is dispatched for every task, including taskB itself, so without that guard the handler kept re-queuing taskB after each of its own runs, which caused the loop:

@app.task
def taskA(arg):
    # not cool, but... https://github.com/celery/celery/issues/3797
    # otherwise it won't be added to the periodic tasks
    from shopify.tasks import taskA
    return taskA(arg)


from celery.signals import task_postrun


@task_postrun.connect
def taskA_success_handler(sender=None, state=None, **kwargs):

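    # the signal's 'kwargs' entry holds the keyword arguments taskA was called with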
    resource_name = kwargs.get('kwargs', {}).get('resource_name')

    if resource_name and state == 'SUCCESS':

        if sender.name == 'shopify.tasks.taskA':
            from gcp.tasks import taskB
            taskB.apply_async(kwargs={
                'resource_name': resource_name
            })
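
A different way to get the same "run taskB only after taskA succeeds" behaviour, if importing taskB next to where taskA is dispatched is acceptable, is to chain the two tasks explicitly instead of using signals. A sketch, assuming both tasks accept a resource_name keyword argument as the handler above suggests:

from celery import chain

from gcp.tasks import taskB
from shopify.tasks import taskA

# taskB.si(...) is an immutable signature, so taskB does not receive
# taskA's return value; it is only queued once taskA has succeeded
chain(
    taskA.s(resource_name='some-resource'),   # placeholder value
    taskB.si(resource_name='some-resource'),
).apply_async()

This removes the need for the signal handler entirely, at the cost of the import.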

just for reference:

celery==4.1.0
Django==2.0
django-celery-beat==1.1.0
django-celery-results==1.0.1
flower==0.9.2
amqp==2.2.2

Python 3.6