
Update: Changing the decorator from shared_task to app.celeryd.celery.task solves the issue. Is there some additional setup needed for shared_task to work properly?

I have completely rewritten the project, but I still get the same error. I'll try to keep it relatively short. The new project structure is a bit cleaner and looks as follows:

proj
|-- app
|   |-- controller
|   |   |-- __init__.py
|   |   +-- greeting_model.py
|   |-- model
|   |   |-- __init__.py
|   |   +-- dto
|   |       |-- __init__.py
|   |       +-- greeting_dto.py
|   |-- service
|   |   |-- __init__.py
|   |   +-- greeting_service.py
|   |-- tasks
|   |   |-- __init__.py
|   |   +-- greeting_tasks.py
|   |-- __init__.py
|   |-- celeryd.py
|   +-- flaskd.py
|-- test.py
|-- worker.py
+-- ws.py

I'm initializing Celery and Flask separately and provide worker.py, which will be run on client machines, while ws.py (the Flask web service) will run on another. Celery initialization is plain and simple: it uses the rpc backend with a RabbitMQ broker. The two queues are static for now, but later they will be populated from configuration.

from kombu import Queue
from celery import Celery


celery = Celery('LdapProvider',
                broker='amqp://admin:passwd@localhost:5672/dev1',
                backend='rpc',
                include=['app.tasks.greeting_tasks'])
celery.conf.task_queues = (
    Queue("q1", routing_key="c1.q1"),
    Queue("q2", routing_key="c2.q2"),
)

worker.py (used to launch the Celery worker; overly simplified for this question):

from app.celeryd import celery
from celery.bin.worker import worker


if __name__ == '__main__':
    celeryd = worker(app=celery)

    options = {
        'broker': 'amqp://admin:passwd@localhost:5672/dev1',
        'queues': 'q1',
        'loglevel': 'info',
        'traceback': True
    }

    celeryd.run(**options)

I'll omit the Flask initialization and jump to greeting_service.py, which calls the Celery task:

# greeting_service.py:
from app.celeryd import celery
from app.tasks.greeting_tasks import say_hello


class GreetingService(object):
    def say_hello(self, name: str) -> str:
        async_result = say_hello.apply_async((name,), queue='q1')
        return async_result.get()



# greeting_tasks.py
from celery import shared_task


@shared_task(bind=True)
def say_hello(self, name: str) -> str:
    return name.capitalize()

This call fails through Flask no matter what I try. I created test.py just to check whether Celery works at all:

from app.celeryd import celery
from app.tasks.greeting_tasks import say_hello


if __name__ == '__main__':
    async_result = say_hello.apply_async(('jackie',), queue='q1')
    print(async_result.get())

It's pretty much the same as greeting_service.py, just not called from greeting_controller, which is a flask_restplus namespace. The difference is that test.py results in:

/home/pupsz/PycharmProjects/provider/venv37/bin/python /home/pupsz/PycharmProjects/provider/test.py
Jackie

Process finished with exit code 0

[2020-01-16 18:56:17,065: INFO/MainProcess] Received task: app.tasks.greeting_tasks.say_hello[bb45e271-563e-405b-8529-7335a3810976]  
[2020-01-16 18:56:17,076: INFO/ForkPoolWorker-2] Task app.tasks.greeting_tasks.say_hello[bb45e271-563e-405b-8529-7335a3810976] succeeded in 0.010257695998006966s: 'Jackie'

while through Flask all I get is the traceback below, and the worker log shows no incoming task, meaning that through Flask apply_async is not sending the task to RabbitMQ at all:

File "/home/xyz/PycharmProjects/proj/app/service/greeting_service.py", line 8, in say_hello
return async_result.get()
NotImplementedError: No result backend is configured.
Please see the documentation for more information.

I found one similar problem with Django, but it has no answer, so I'm kind of stuck and would appreciate some guidance.


1 Answer


Solution: how to get shared_task working as expected was answered here: LINK. Modify the Celery initialization as follows:

from kombu import Queue
from celery import Celery


celery = Celery('LdapProvider',
                broker='amqp://admin:passwd@localhost:5672/dev1',
                backend='rpc')
                # include=['app.tasks.greeting_tasks'])
celery.conf.task_queues = (
    Queue("q1", routing_key="c1.q1"),
    Queue("q2", routing_key="c2.q2"),
)
celery.set_default()

Even with the include line commented out, the worker successfully picks up the shared_task defined in app.tasks.greeting_tasks:

[tasks]
  . app.tasks.greeting_tasks.say_hello

After the app was made the default via set_default(), no more NotImplementedError was thrown, even when using shared_task. As for the reason... I have no idea; it was six hours of trial and error with different configs and googling. I find the official documentation lackluster in certain more complex situations.
