Is it possible to turn a docker run command with detach=True into an asynchronous task? I am planning to run a continuous application that reads data from Kafka/RabbitMQ and processes it in Docker containers. I can do this in a blocking way, one message at a time, but my goal is to speed things up and run up to 4 tasks simultaneously. This seems like a nice fit for coroutines and an asyncio.Semaphore. However, I am not sure how to turn a background process into something whose state I can control or read. Any ideas?

My current minimal example uses a while loop with a single container:

import docker
from time import sleep

client = docker.from_env()

# Start the container in the background; detach=True returns immediately
container = client.containers.run("my_container", "my", detach=True)

# Poll until the container stops, then print its logs
while True:
    container.reload()
    if not container.attrs['State']['Running']:
        print(container.logs())
        break
    sleep(1)
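A sketch of one way the polling loop above could be wrapped in a coroutine and throttled with an asyncio.Semaphore, as the question suggests. The blocking docker-py calls (reload, logs, containers.run) are pushed onto worker threads with asyncio.to_thread so the event loop stays free; the image name "my_container" and the main() driver are illustrative assumptions, not tested code. Note that docker-py also provides container.wait(), which blocks until the container exits and could replace the polling loop entirely inside a to_thread call.

```python
import asyncio


async def wait_for_container(container, poll_interval=1.0):
    """Poll a detached container until it exits, then return its logs.

    container.reload() and container.logs() are blocking docker-py calls,
    so they run in worker threads via asyncio.to_thread.
    """
    while True:
        await asyncio.to_thread(container.reload)
        if not container.attrs["State"]["Running"]:
            return await asyncio.to_thread(container.logs)
        await asyncio.sleep(poll_interval)


async def run_limited(semaphore, start_container):
    """Start one container under the semaphore and wait for it to finish.

    start_container is any zero-argument callable returning a container
    object; the semaphore caps how many run concurrently.
    """
    async with semaphore:
        container = await asyncio.to_thread(start_container)
        return await wait_for_container(container)


async def main(messages):
    # Imported here so the helpers above are usable without the SDK;
    # requires the Docker SDK for Python (pip install docker).
    import docker

    client = docker.from_env()
    sem = asyncio.Semaphore(4)  # at most 4 containers at a time
    tasks = [
        run_limited(sem, lambda m=msg: client.containers.run(
            "my_container", m, detach=True))
        for msg in messages
    ]
    return await asyncio.gather(*tasks)


# Usage (hypothetical):
# logs = asyncio.run(main(["msg1", "msg2", "msg3"]))
```

Each consumed message becomes one task; asyncio.gather drives them all, while the semaphore ensures no more than four containers exist at once.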
Bociek
  • Usually I’d set this up by having my worker(s) be long-running processes that did work as long as there was work to do. Then I’d start however many of them I wanted in parallel to my application. I wouldn’t try to launch a new container per request, and I’d try to avoid the dangerous path of accessing the host Docker socket. – David Maze Jul 15 '19 at 15:06
  • @David Maze For some reason I need a fresh start for each message; that is why I formulated it this way. – WoofDoggy Jul 15 '19 at 16:09

0 Answers