
I'm trying to use stream_framework in my application (NOT Django) but I'm having a problem calling the stream_framework shared tasks. Celery seems to find the tasks:

-------------- celery@M3800 v3.1.25 (Cipater)
---- **** ----- 
--- * ***  * -- Linux-4.15.0-34-generic-x86_64-with-Ubuntu-18.04-bionic
-- * - **** --- 
- ** ---------- [config]
- ** ---------- .> app:         task:0x7f8d22176dd8
- ** ---------- .> transport:   redis://localhost:6379/0
- ** ---------- .> results:     redis://localhost:6379/0
- *** --- * --- .> concurrency: 8 (prefork)
-- ******* ---- 
--- ***** ----- [queues]
 -------------- .> celery           exchange=celery(direct) key=celery


[tasks]
  . formshare.processes.feeds.tasks.test_shared_task
  . stream_framework.tasks.fanout_operation
  . stream_framework.tasks.fanout_operation_hi_priority
  . stream_framework.tasks.fanout_operation_low_priority
  . stream_framework.tasks.follow_many
  . stream_framework.tasks.unfollow_many

[2018-09-17 10:06:28,240: INFO/MainProcess] Connected to redis://localhost:6379/0
[2018-09-17 10:06:28,246: INFO/MainProcess] mingle: searching for neighbors
[2018-09-17 10:06:29,251: INFO/MainProcess] mingle: all alone

I run celery with:

celery -A formshare.processes.feeds.celery_app worker --loglevel=info

My celery_app has:

from celery import Celery

celeryApp = Celery('task', broker='redis://localhost:6379/0', backend='redis://localhost:6379/0', include='formshare.processes.feeds.tasks')

The problem is that delay() does not run the shared task. I also created a shared task within my own application, but when I call delay() on it that task is not called either. I guess I need to register the tasks as callable from my application? I can't seem to find any information about this online.
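
For reference, the shared task in my application is just a plain @shared_task that I call with delay(), roughly like this (simplified, the task body is only a placeholder):

from celery import shared_task

@shared_task
def test_shared_task():
    print("test_shared_task was executed")

and elsewhere in the application:

from formshare.processes.feeds.tasks import test_shared_task

test_shared_task.delay()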

I also tried to auto-discover the tasks, but I got the same problem:

celeryApp.autodiscover_tasks(['stream_framework', 'formshare.processes.feeds'], force=True)

Any idea is highly appreciated.

QLands
  • After hours of trial and error I noticed that it fails if I use Gunicorn to start my application but works if I use Waitress, so I posted: https://stackoverflow.com/questions/52378871/calling-a-celery-shared-task-from-a-pyramid-request-different-behavior-with-gun – QLands Sep 18 '18 at 04:20

1 Answer


Shared tasks are a specific mechanism for sharing tasks between different applications (mainly Django apps, I think, but I have used them in Flask, for example).

We had the same issue, and to get it to work we called

 celery_app.set_default()

on the Celery app right after instantiating it.
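
For reference, a minimal sketch of where that call goes, reusing the broker, backend and include from the question (everything else is assumed):

from celery import Celery

celeryApp = Celery('task',
                   broker='redis://localhost:6379/0',
                   backend='redis://localhost:6379/0',
                   include='formshare.processes.feeds.tasks')

# Make this instance the default app, so tasks declared with
# @shared_task are bound to it instead of an unconfigured default app.
celeryApp.set_default()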

Otherwise, another way of getting things right is to call the task via the app itself, with something along these lines:

from celery import current_app
.
.
.
current_app.tasks['my.tasks.to.exec'].delay(something)

This always works: since it is a shared task, it is not bound to any app when you import it, so in this case it belongs to the app configured as the current_app.
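
For example, with the task names from the worker output in the question, that would look something like this (arguments, if any, depend on the task's signature):

from celery import current_app

current_app.tasks['formshare.processes.feeds.tasks.test_shared_task'].delay()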

shipperizer
  • +1 The set_default workaround worked for me. I'm pretty sure we're using Celery incorrectly but after spending two days searching the web to figure out the right way, this really comes in handy and I'll stick with it for now. – Jérôme Jun 05 '20 at 07:48