
I cannot get Celery results to store in my django-db backend. I have tried many solutions without any luck and I'm stuck at this point; I suspect this has something to do with CockroachDB.

My setup uses Django 4.1, Postgres (CockroachDB), and Redis (Heroku).

I start Django normally, then I start the Celery worker with this command:

celery -A kmaofny worker --loglevel=info -E

My Celery worker sees the tasks and they complete, but it does not store the results in my database.

Here are my settings and configurations:

settings.py

INSTALLED_APPS = [
...
'django_celery_results',
...
]

DATABASE_URL = POSTGRES_CREDS['URI']
DATABASES = {
    'default': dj_database_url.config(default=DATABASE_URL, engine='django_cockroachdb')
}

# Celery Configuration
REDIS_URL = os.environ.get('REDIS_URL', BROKER_URL)
CELERY_BROKER_URL = REDIS_URL
CELERY_RESULT_BACKEND = 'django-db'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
DJANGO_CELERY_RESULTS_TASK_ID_MAX_LENGTH = 191
CELERY_TASK_TRACK_STARTED = True
CELERY_TASK_TIME_LIMIT = 30 * 60

celery.py

import os
from celery import Celery
# Set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'kmaofny.settings')
app = Celery('kmaofny')
app.config_from_object('django.conf:settings', namespace='CELERY')
# Load task modules from all registered Django app configs.
app.autodiscover_tasks()

__init__.py

from .celery import app as celery_app
__all__ = ('celery_app',)

Any help is appreciated; I am stuck at this point with no errors. Further, I get <django_celery_results.backends.database.DatabaseBackend object at 0x000001A9B3B64040> when I call the following in a Django view:

result = add_numbers.delay(1,5)
print(result.backend)

What I've tried: calling result.backend to check that the database backend was picked up from settings, running the Celery worker with the -E flag, and configuring CELERY_RESULT_BACKEND = 'django-db' in settings.py.
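One more thing worth checking is whether any result rows exist at all. A quick sketch to run inside `python manage.py shell` (this assumes the django_celery_results migrations have been applied with `python manage.py migrate django_celery_results`; without them the table doesn't exist):

```python
# Run inside `python manage.py shell` after the worker has processed a task.
from django_celery_results.models import TaskResult

# If the backend is writing results, completed tasks show up here.
for tr in TaskResult.objects.all():
    print(tr.task_id, tr.status, tr.result)
```

An empty queryset while tasks visibly complete in the worker log would confirm that the worker is not writing results, rather than the view failing to read them.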

chrisr

1 Answer


I'm not sure if this is the root cause, but apparently all tasks, even though completed, were still listed as status=PENDING.

I solved the issue and now receive task status updates (status = SUCCESS) in my Django db backend when I run the Celery worker with this command:

celery -A kmaofny worker --loglevel=info -E --pool=solo

instead of

celery -A kmaofny worker --loglevel=info -E

According to others, this is likely a Windows issue with the default prefork pool. See:

Celery task always PENDING
https://github.com/celery/celery/issues/2146
Celery 'Getting Started' not able to retrieve results; always pending
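Note that --pool=solo runs tasks one at a time in the worker's main process, so if concurrency matters, other pools reportedly also avoid the Windows prefork problem. A sketch of the alternatives (the threads pool ships with Celery; the gevent pool assumes `pip install gevent` has been run — I have not verified these on this project):

```shell
# Thread-based pool (no extra package needed in Celery 4.4+):
celery -A kmaofny worker --loglevel=info -E --pool=threads

# Or the gevent pool, after `pip install gevent`:
celery -A kmaofny worker --loglevel=info -E --pool=gevent
```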

chrisr