I deployed a Django project on Railway that uses Celery and Redis to run a scheduled task. The project is successfully online, but the Celery tasks are not being executed.
If I start the Celery worker from my computer's terminal using the Railway CLI, the tasks run as expected and the results are saved in Railway's PostgreSQL, so those results are displayed on the live site. The Redis server being used is also the one from Railway.
In other words, Celery is only running locally. This is the log from my local terminal, showing that the worker runs on my machine while the Redis broker is the one up on Railway:
-------------- celery@MacBook-Pro-de-Corey.local v5.2.7 (dawn-chorus)
--- ***** -----
-- ******* ---- macOS-13.1-arm64-arm-64bit 2023-01-11 23:08:34
- *** --- * ---
- ** ---------- [config]
- ** ---------- .> app: suii:0x1027e86a0
- ** ---------- .> transport: redis://default:**@containers-us-west-28.railway.app:7078//
- ** ---------- .> results:
- *** --- * --- .> concurrency: 10 (prefork)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** -----
-------------- [queues]
.> celery exchange=celery(direct) key=celery
[tasks]
. kansoku.tasks.suii_kakunin
I included this line in the Procfile for the worker (as I saw in another related answer):
worker: python manage.py qcluster --settings=my_app_name.settings
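For comparison, and this is just a sketch with the app name `suii` assumed from the Celery banner above, my understanding is that a Procfile would normally start Celery directly, with a separate beat process for the schedule:

```shell
# Procfile (sketch) — the app name `suii` is assumed from the banner above
worker: celery -A suii worker --loglevel=info
beat: celery -A suii beat --loglevel=info
```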
I also have an environment variable CELERY_BROKER_REDIS_URL pointing to Railway's REDIS_URL. I tried creating a 'Periodic task' from the admin of the live application as well, but it just doesn't get executed. What should I do to have the scheduled tasks run automatically, without my PC?
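For reference, the broker URL is read from that environment variable in settings.py roughly like this (a minimal sketch; the local fallback value is my own assumption):

```python
# settings.py (sketch) — CELERY_BROKER_REDIS_URL is the Railway variable mentioned above
import os

# Fall back to a local Redis URL when the variable is absent (an assumption for local dev)
CELERY_BROKER_URL = os.environ.get("CELERY_BROKER_REDIS_URL", "redis://localhost:6379/0")
```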