
I have spent the last couple of days trying to set this up for my website and have been failing to understand this particular error.

  • I used the Django Cookiecutter boilerplate template for my project.
  • I have added Celery locally with Redis to perform tasks; this works
    great.
  • I have deployed my project to Heroku.

I am now trying to implement the same tasks in my production environment, but instead of Redis I am now using a Heroku add-on named Redis To Go. I did so as follows:

In my settings: I copied this from the Redis To Go tutorial. No adjustments have been made other than changing the urlparse import to meet Python 3 standards.

production.py
import os
from urllib.parse import urlparse

# Parse the Redis To Go URL, falling back to local Redis on the default port
redis_url = urlparse(os.environ.get('REDISTOGO_URL', 'redis://localhost:6379'))

CACHES = {
    'default': {
        'BACKEND': 'redis_cache.RedisCache',
        'LOCATION': '%s:%s' % (redis_url.hostname, redis_url.port),
        'OPTIONS': {
            'DB': 0,
            'PASSWORD': redis_url.password,
        },
    },
}
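To make the settings above concrete, here is a small sketch of what `urlparse` extracts from a connection string of this shape. The URL used here is hypothetical, purely for illustration; a real `REDISTOGO_URL` has the same structure but different values.

```python
from urllib.parse import urlparse

# Hypothetical Redis To Go style URL (illustration only, not a real credential)
url = urlparse('redis://redistogo:secretpassword@sole.redistogo.com:10187/')

print(url.hostname)  # 'sole.redistogo.com' — fed into LOCATION
print(url.port)      # 10187 (an int)      — fed into LOCATION
print(url.password)  # 'secretpassword'    — fed into OPTIONS['PASSWORD']
```

If `REDISTOGO_URL` is unset, the fallback string is parsed the same way, yielding `localhost` and the default Redis port.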

Procfile

I've added the following line to my Procfile and added a second dyno for my worker.

worker: celery -A appname worker --beat

The worker seems to start: its state changes and then it closes. It obviously can't connect; see the error below.

Heroku Log

2020-05-16T02:07:58.647226+00:00 app[worker.1]: [2020-05-16 04:07:58,647: ERROR/MainProcess] consumer: Cannot connect to amqp://guest:**@127.0.0.1:5672//: [Errno 111] Connection refused.

What am I doing wrong?

My celery configuration:

Celery.py

from __future__ import absolute_import, unicode_literals
import os
from celery import Celery

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'config.settings.local')

app = Celery('rebanq_pro')

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Load task modules from all registered Django app configs.
app.autodiscover_tasks()

@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
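As a side note on how `config_from_object(..., namespace='CELERY')` behaves: settings keys carrying the `CELERY_` prefix are mapped onto Celery's own config names, and everything else is ignored. This is a toy stdlib-only illustration of that mapping, not Celery's actual implementation:

```python
# Simulated Django settings module, as a plain dict for illustration
settings = {
    'CELERY_BROKER_URL': 'redis://localhost:6379',
    'CELERY_RESULT_BACKEND': 'redis://localhost:6379',
    'DEBUG': True,  # no CELERY_ prefix, so the namespace skips it
}

# Conceptually, namespace='CELERY' strips the prefix and lowercases the key
celery_conf = {
    key[len('CELERY_'):].lower(): value
    for key, value in settings.items()
    if key.startswith('CELERY_')
}

print(celery_conf['broker_url'])  # redis://localhost:6379
```

The practical consequence for this question: whichever settings module `DJANGO_SETTINGS_MODULE` points at is the one whose `CELERY_BROKER_URL` the worker actually uses.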

settings.local.py

CELERY_BROKER_URL = 'redis://localhost:6379'
CELERY_RESULT_BACKEND = 'redis://localhost:6379'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TIMEZONE = TIME_ZONE

settings.production.py

CELERY_BROKER_URL = os.environ.get("REDISTOGO_URL")
CELERY_RESULT_BACKEND = os.environ.get("REDISTOGO_URL")
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TIMEZONE = TIME_ZONE
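One caveat with the production settings above: if `REDISTOGO_URL` is not set in the dyno's environment, `os.environ.get` returns `None`, and Celery falls back to its default AMQP broker on localhost, which is exactly the `amqp://guest:**@127.0.0.1:5672//` connection the log shows. A hypothetical fail-fast guard (the helper name is my own, not from the question) makes that misconfiguration loud instead of silent:

```python
import os

def get_broker_url(env=os.environ):
    """Return the Redis To Go URL, failing fast if the add-on URL is missing."""
    url = env.get('REDISTOGO_URL')
    if url is None:
        raise RuntimeError(
            'REDISTOGO_URL is not set; Celery would otherwise fall back '
            'to its default amqp:// broker on localhost.'
        )
    return url

# In settings.production.py this would replace the bare os.environ.get call:
# CELERY_BROKER_URL = get_broker_url()
```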
jackygab
  • Please include your celery configuration. That error message indicates that celery is trying to use AMQP for the broker rather than redis. – schillingt May 18 '20 at 00:21
  • Updated my question with more information. – jackygab May 18 '20 at 01:10
  • Does the url in the error message match the `REDISTOGO_URL`? If not, is it possible you're not using the correct settings? – schillingt May 18 '20 at 01:50
  • This sounds very logical. Looking at my code I've found that in celery.py it refers to 'config.settings.local'... which refers to "CELERY_BROKER_URL = redis://localhost:6379". Changing this fixed the error. [2020-05-18 03:12:51,591: INFO/MainProcess] celery@325c7732-0025-412e-81a1-6c3c37de6e7d ready. It seems my Celery worker is ready but not receiving any tasks. – jackygab May 18 '20 at 01:59

0 Answers