
Currently, I am creating a Celery worker + Flower monitoring setup based on

https://github.com/itsrifat/flask-celery-docker-scale

Both the Celery worker and the Flower monitor run from the same directory, flask-celery.

The reason is so that Flower has access to the Celery worker's code module, which lets the following command with the -A flag work:

entrypoint: flower
command: -A tasks --port=5555 --broker=redis://redis:6379/0
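
For reference, the tasks module that -A tasks points at defines the Celery app roughly like this (a simplified sketch; the actual code lives in the linked repo, and the add task here is only illustrative):

# tasks.py (sketch) - the Celery app that Flower loads via -A tasks,
# using the same Redis broker passed to Flower above
from celery import Celery

app = Celery('tasks', broker='redis://redis:6379/0')

@app.task
def add(x, y):
    return x + y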

This is what their docker-compose.yml looks like:

worker:
  build:
    context: ./flask-celery
    dockerfile: Dockerfile
  depends_on:
    - redis
monitor:
  build:
    context: ./flask-celery
    dockerfile: Dockerfile
  ports:
    - "5555:5555"
  entrypoint: flower
  command: -A tasks --port=5555 --broker=redis://redis:6379/0
  depends_on:
    - redis

Now, I would like to create a new worker, tasks2.py, which will sit inside a new directory called flask-celery2.

So, how should I modify the Dockerfile and docker-compose.yml so that Flower can monitor both tasks and tasks2?

Cheok Yan Cheng

2 Answers


Yes, you should run this new worker in a separate container (best practice) and configure it with its own Dockerfile in docker-compose.yml. Flower can then automatically monitor its tasks, as long as they are pushed to the same message broker.
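
A minimal sketch of the extra service in docker-compose.yml could look like this (it assumes flask-celery2 has its own Dockerfile and that its tasks2.py Celery app points at the same redis://redis:6379/0 broker):

# additional service alongside the existing worker and monitor
worker2:
  build:
    context: ./flask-celery2
    dockerfile: Dockerfile
  depends_on:
    - redis

No change to the monitor service should be needed: since Flower is attached to the same broker, events produced by this second worker show up in the same dashboard.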

Sid

Use the steps below. Go to your project folder and run:

pip install flower
flower -A proj --port=5555

Then you can see and check your Celery tasks in the browser at http://yourIP:5555.
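
In the context of the question's setup, the broker can also be passed explicitly, reusing the flags already shown in the compose file above (adjust the module name and broker URL to your project):

# point Flower at the same Redis broker the workers use
flower -A tasks --port=5555 --broker=redis://redis:6379/0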

upinder kumar