
All my services are running properly, but the RQ worker never picks up my scheduled job. Everything worked fine locally, so I think I just need to pass the hostname of the new Redis Docker service to the worker, but how do I do that inside docker-compose?

My docker-compose.yml

version: '3.5'

services:

  web:
    build: ./webapp
    image: webapp
    container_name: webapp
    ports:
      - "5000:5000"
    depends_on:
      - redis-server
      - mongodb

  redis-server:
    image: redis:alpine
    ports:
      - 6379:6379
    expose:
      - '6379'

  mongodb:
    image: mongo:4.2-bionic
    container_name: mongodb
    ports:
      - "27017:27017"
    deploy:
      restart_policy:
        condition: any

  rq-worker:
    image: jaredv/rq-docker:0.0.2
    container_name: rq-worker
    command: rq worker -u redis://redis-server:6379 high normal low
    deploy:
      replicas: 1
    depends_on:
      - redis-server
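One thing worth noting if the jobs are *scheduled* (via `enqueue_at`/`enqueue_in`) rather than enqueued immediately: since RQ 1.2, a plain `rq worker` ignores scheduled jobs unless it is started with the `--with-scheduler` flag. Assuming the `jaredv/rq-docker` image ships an RQ version that supports it, the worker service might look like this (a sketch, not a confirmed fix):

```yaml
  rq-worker:
    image: jaredv/rq-docker:0.0.2
    # --with-scheduler makes the worker also run RQ's built-in scheduler,
    # which is required for jobs enqueued with enqueue_at / enqueue_in
    command: rq worker --with-scheduler -u redis://redis-server:6379 high normal low
    depends_on:
      - redis-server
```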
  • That looks right. You have a couple of unnecessary extra options (`expose:`, `container_name:`, the Swarm-specific `deploy:`) but the way you're passing the Redis container name as a command-line option seems correct. – David Maze Apr 27 '20 at 12:19
  • @DavidMaze I found this question which might be helpful: https://stackoverflow.com/questions/55001202/how-to-start-a-custom-rq-worker-within-a-docker-container-python-flask-and-red but I don't really get it – mikazz Apr 27 '20 at 12:30
  • @DavidMaze To be more specific: To start a default RQ worker, the Flask Mega Tutorial uses the method of overriding the Docker entrypoint with "venv/bin/rq" and then supplying the argument "worker -u redis://redis-server:6379/0 microblog-tasks". – mikazz Apr 27 '20 at 12:31
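On the application side, a common pattern is to read the Redis URL from an environment variable (so compose can inject the service hostname) and fall back to the service name. A minimal sketch, assuming a `REDIS_URL` variable name (that name is my own choice, not from the question):

```python
import os
from urllib.parse import urlparse

# Hypothetical env var; in docker-compose you would set it under
# `environment:` for the web and rq-worker services, e.g.
#   environment:
#     - REDIS_URL=redis://redis-server:6379/0
# The default below uses the compose service name as the hostname,
# which Docker's internal DNS resolves on the shared network.
redis_url = os.environ.get("REDIS_URL", "redis://redis-server:6379/0")

parsed = urlparse(redis_url)
host = parsed.hostname  # "redis-server" inside the compose network
port = parsed.port      # 6379

print(host, port)
```

The same URL can then be handed to `redis.Redis.from_url(redis_url)` for the Flask app and to `rq worker -u "$REDIS_URL" ...` for the worker, so both sides agree on one setting.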

0 Answers