I am working on a project which is deployed on Docker Swarm as a service with 3 replicas. I want to run a simple management command to delete some rows from a table once their date has passed. I have written a Django command for it, but I want to automate the run using a cron job, and I need to make sure the job runs only once a day, from any one of the containers that are part of my service. On the internet I found some packages built for running cron jobs in Django applications, but none of them considers more than one container. Some packages take a lock-based approach, but the locks are file-based rather than shared across containers. I don't want to use Celery for such a simple task.
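To be clear, by a shared lock I mean something like the sketch below, where the lock lives in a cache that all replicas can see (this assumes the Django cache is backed by a shared backend such as Redis or Memcached; the key name is just an example):

from django.core.cache import cache

def run_once_per_day(task):
    """Run task only if no other container has claimed today's slot.

    cache.add() is atomic on shared backends (Redis, Memcached),
    so only the first replica to call it gets the lock.
    """
    lock_key = 'delete-inactive-instances-lock'  # illustrative key name
    if cache.add(lock_key, 'locked', timeout=60 * 60 * 23):
        task()
    # If add() returns False, another replica already ran today.

Since cache.add() only succeeds for the first caller, only one replica would actually execute the task.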
Following is a glimpse of my command:
from django.core.management.base import BaseCommand, CommandError

from myapp.models import MyModel  # adjust to the actual app path


class Command(BaseCommand):
    """Command to clear user subscription if end_date has passed"""

    def handle(self, *args, **options):
        try:
            deleted_count, relative_deleted = MyModel.delete_inactive_instances()
        except Exception:
            raise CommandError('Could Not Remove Inactive Subscriptions From DB')
        else:
            self.stdout.write(self.style.SUCCESS('Successfully Removed Inactive Subscriptions %s' % deleted_count))
I am currently running the command once a day manually, via docker exec on one of the containers:
python manage.py delete_inactive_instances
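Concretely, that looks something like this (the container id is whichever replica I pick by hand):

docker exec <container_id> python manage.py delete_inactive_instances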
Following is my docker-stack file:
services:
  production_app:
    image: {id}.dkr.ecr.{region}.amazonaws.com/xxxxxx:latest
    expose:
      - 8000
    deploy:
      replicas: 2
    command: >
      sh -c "python manage.py migrate && gunicorn app.wsgi:application --workers 3 --bind 0.0.0.0:8000"
    env_file:
      - .prod.env

  nginx:
    image: {id}.dkr.ecr.{region}.amazonaws.com/nginx:latest
    ports:
      - 80:80
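One idea I am considering is adding a separate single-replica service to the same stack that reuses the app image and just loops with a sleep, so the command runs on exactly one container (a sketch only; the service name cron_job is hypothetical and the image/env are assumed to be the same as production_app):

  cron_job:
    image: {id}.dkr.ecr.{region}.amazonaws.com/xxxxxx:latest
    deploy:
      replicas: 1
    command: >
      sh -c "while true; do python manage.py delete_inactive_instances; sleep 86400; done"
    env_file:
      - .prod.env

But I am not sure whether this is better than a shared lock, so any guidance on the standard way to do this would be appreciated.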