22

I'm working on a demo and the code is simple:

# The Config
class Config:
    BROKER_URL = 'redis://127.0.0.1:6379/0'
    CELERY_RESULT_BACKEND = 'redis://127.0.0.1:6379/0'
    CELERY_ACCEPT_CONTENT = ['application/json']

# The Task
@celery_app.task()
def add(x, y):
    return x + y
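For completeness, a sketch of how a class-based config like this gets wired into the app (assuming the app lives in `appl/task.py` and is named `celery_app`; `config_from_object` is Celery's standard way to load such a class — if it is never called, the `Config` class has no effect):

```python
from celery import Celery

class Config:
    BROKER_URL = 'redis://127.0.0.1:6379/0'
    CELERY_RESULT_BACKEND = 'redis://127.0.0.1:6379/0'
    CELERY_ACCEPT_CONTENT = ['application/json']

celery_app = Celery('appl.task')
celery_app.config_from_object(Config)  # without this call, Config is never applied
```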

To start the worker:

$ celery -A appl.task.celery_app worker --loglevel=info -broker=redis://localhost:6379/0

 -------------- celery@ALBERTATMP v3.1.13 (Cipater)
 ---- **** ----- 
 --- * ***  * -- Linux-3.2.0-4-amd64-x86_64-with-debian-7.6
 -- * - **** --- 
 - ** ---------- [config]
 - ** ---------- .> app:         celery_test:0x293ffd0
 - ** ---------- .> transport:   redis://localhost:6379/0
 - ** ---------- .> results:     disabled
 - *** --- * --- .> concurrency: 2 (prefork)
 -- ******* ---- 
 --- ***** ----- [queues]
 -------------- .> celery           exchange=celery(direct) key=celery

To schedule a task:


>>> from appl.task import add
>>> r = add.delay(1, 2)
>>> r.id
'c41d4e22-ccea-408f-b48f-52e3ddd6bd66'
>>> r.task_id
'c41d4e22-ccea-408f-b48f-52e3ddd6bd66'
>>> r.status
'PENDING'
>>> r.backend
<celery.backends.redis.RedisBackend object at 0x1f35b10>

Then the worker will execute the task:

[2014-07-29 17:54:37,356: INFO/MainProcess] Received task: appl.task.add[beeef023-c582-42e1-baf7-9e19d9de32a0]
[2014-07-29 17:54:37,358: INFO/MainProcess] Task appl.task.add[beeef023-c582-42e1-baf7-9e19d9de32a0] succeeded in 0.00108124599865s: 3 

But the result remains PENDING:

>>> res = add.AsyncResult(r.id)
>>> res.status
'PENDING'
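Worth noting: in Celery, `PENDING` does not necessarily mean the task is still queued — it is also the default state reported for any task id the result backend has no record of (for instance, because results were never stored). A toy illustration of that lookup semantics, using a plain dict standing in for the backend (not real Celery internals):

```python
# A backend with no stored results, as when the result backend is disabled
stored_results = {}

def lookup_state(task_id):
    # The backend cannot tell "never stored" apart from "not finished yet",
    # so unknown ids come back with the default state, PENDING
    return stored_results.get(task_id, {"status": "PENDING"})["status"]

print(lookup_state("c41d4e22-ccea-408f-b48f-52e3ddd6bd66"))  # PENDING

# Once the backend actually stores a result, the state resolves
stored_results["c41d4e22-ccea-408f-b48f-52e3ddd6bd66"] = {"status": "SUCCESS", "result": 3}
print(lookup_state("c41d4e22-ccea-408f-b48f-52e3ddd6bd66"))  # SUCCESS
```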

I've tried the official FAQ, but it did not help.

>>> celery_app.conf['CELERY_IGNORE_RESULT']
False

What did I do wrong? Thanks!

hbrls
    Did you ever figure out what was causing this problem? I am guessing that `res.get` would hang as well? – user25064 Jan 05 '15 at 14:17
  • `-broker=redis://localhost:6379/0` will not work as `CELERY_RESULT_BACKEND`. Make sure to pass `backend='redis://127.0.0.1:6379/0'` when instantiating `appl.task.celery_app` – Orelus Sep 28 '16 at 20:52

3 Answers

14

It's been a while, but I'm leaving this here for others who come along with a similar issue:

In your worker's startup banner, you can see that results are disabled:

.> results:     disabled

When you instantiate your Celery instance, make sure that you pass the right config inputs:

from celery import Celery

# Here I'm using an AMQP broker with a memcached backend to store the results
celery = Celery('task1',
                broker='amqp://guest:guest@127.0.0.1:5672//',
                backend='cache+memcached://127.0.0.1:11211/')

For some reason, I always have trouble getting the Celery instance configured through the config file, so I explicitly pass in the broker and backend during instantiation, as shown above.

Now you'll see the results backend correctly configured as memcached in the worker banner (in my case; it should be redis in yours). Also make sure that your task shows up in the worker's list of registered tasks (task1.add).

If you still can't get it to work, try starting the worker with the debug log level:

celery worker -A task1.celery -l debug

and see if anything is going wrong in the information it prints.

In my case, this fixed the error: the result state was set to SUCCESS and I was able to recover 3 from r.get().

Shankar ARUL
1
  1. Try changing your broker to something else (like RabbitMQ) and check the status again.

  2. Make sure your redis server is up and accessible to celery. Run redis-cli keys '*' and you should see some keys related to celery; if not, there is an issue with your broker.

Ali Nikneshan
-1

This works for me:

from celery.result import AsyncResult

# task_id is the id string returned by delay(), e.g. r.id
celery_task_result = AsyncResult(task_id)
task_state = celery_task_result.state

and task_state returns any of the task statuses: 'FAILURE', 'SUCCESS', 'PENDING', etc.
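Along the same lines, rather than comparing the state against individual strings, the terminal states can be checked as a set. This mirrors celery.states.READY_STATES; the set literal is written out below so the snippet stands alone:

```python
# Terminal states: once reached, the task's state will not change again
READY_STATES = {"SUCCESS", "FAILURE", "REVOKED"}

def is_finished(task_state):
    return task_state in READY_STATES

print(is_finished("PENDING"))  # False
print(is_finished("SUCCESS"))  # True
```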

NoamG