I'm building a small Django project with cookiecutter-django and I need to run tasks in the background. Even though I set the project up with cookiecutter, I'm running into issues with Celery.
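For context, the Celery app itself is the one cookiecutter-django generates; roughly this (a sketch, where the module path, settings module, and app name are my assumptions based on the generated layout):

import os

from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'config.settings.local')

app = Celery('metagrabber')
# pick up old-style CELERY_* options (like CELERY_ALWAYS_EAGER below) from Django settings
app.config_from_object('django.conf:settings')
app.autodiscover_tasks()

The worker starts fine and picks up tasks; the trouble only begins when a task touches the database.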
Let's say I have a model class called Job with three fields: the default auto-generated primary key, a UUID, and a date:

import uuid
from django.db import models

class Job(models.Model):
    access_id = models.UUIDField(default=uuid.uuid4, editable=False, unique=True)
    date = models.DateTimeField(auto_now_add=True)
Now, if I do the following in a Django view, everything works fine:

import logging

logger = logging.getLogger(__name__)

job1 = Job()
job1.save()
logger.info("Created job {}".format(job1.access_id))
job2 = Job.objects.get(access_id=job1.access_id)
logger.info("Retrieved job {}".format(job2.access_id))
If I create a Celery task that does exactly the same, I get an error.
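For reference, the task is just those same lines wrapped in a task function; a minimal sketch (the decorator style, function name, and the metagrabber import path are my shorthand, inferred from the table name in the error):

import logging

from celery import shared_task

from metagrabber.models import Job

logger = logging.getLogger(__name__)

@shared_task
def create_job():
    job1 = Job()
    job1.save()
    logger.info("Created job {}".format(job1.access_id))
    job2 = Job.objects.get(access_id=job1.access_id)
    logger.info("Retrieved job {}".format(job2.access_id))

When the worker executes it, I get: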
django.db.utils.ProgrammingError: relation "metagrabber_job" does not exist
LINE 1: INSERT INTO "metagrabber_job" ("access_id", "date") VALUES ('e8a2...
Similarly, this is what my Postgres Docker container logs at that moment:
postgres_1 | 2018-03-05 18:23:23.008 UTC [85] STATEMENT: INSERT INTO "metagrabber_job" ("access_id", "date") VALUES ('e8a26945-67c7-4c66-afd1-bbf77cc7ff6d'::uuid, '2018-03-05T18:23:23.008085+00:00'::timestamptz) RETURNING "metagrabber_job"."id"
Interestingly enough, if I look into the Django admin I do see that a Job object was created, but it carries a different UUID than the one in the logs.
If I then set CELERY_ALWAYS_EAGER = True to make Django execute the task itself rather than handing it to a Celery worker: voila, it works again without error. But running the tasks in the Django process isn't the point.
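Concretely, that's just this one line in my settings (with eager mode on, calling .delay() runs the task inline in the web process instead of sending it to the worker):

# settings.py
CELERY_ALWAYS_EAGER = True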
I did quite a bit of searching, and I only found similar issues where the solution was to run manage.py migrate. However, I've done that already, and it can't be the answer anyway: if the table really were missing, Django couldn't run the problematic code at all, Celery or no Celery.
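The only other thing I can think of is that the worker might somehow be talking to a different database than the web process. A sketch of what I'd log inside both the view and the task to compare (connection.settings_dict is Django's per-connection configuration dict):

from django.db import connection

logger.info("Connected to db {} on host {}".format(
    connection.settings_dict.get('NAME'),
    connection.settings_dict.get('HOST')))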
So what's going on? I'm getting this exact same behavior for all my model objects.
edit: Just in case it's relevant: I'm using Django 2.0.2 and Celery 4.1.