
I'm trying to horizontally scale a long-running processing task with Docker Compose. To simulate that task, I used the following Python code:

import time

import uvicorn
from fastapi import FastAPI

app = FastAPI()

@app.get("/test")
def test():
    start = time.time()
    print("Start time", time.asctime(time.localtime(start)))
    time.sleep(5)  # simulate a long-running task
    end = time.time()
    print("End time", time.asctime(time.localtime(end)))
    return {"start": start, "end": end}

if __name__ == "__main__":
    uvicorn.run(app, host="0.0.0.0", port=8001)

In Docker Compose I created two containers from the same code above, namely back1 and back2:

version: '3.8'
services:
  back1:
    build:
      context: ./backend
      dockerfile: DockerFile
    volumes:
      - ./backend:/usr/code
    command: python main.py
    ports: 
      - "8001"
  back2:
    build:
      context: ./backend
      dockerfile: DockerFile
    volumes:
      - ./backend:/usr/code
    command: python main.py
    ports: 
      - "8001"

  nginx:
    build:
      context: ./backend/config
      dockerfile: DockerFile
    ports:
      - "80:80"

And this is my nginx conf file:

upstream backend {
    server back1:8001;
    server back2:8001;

    keepalive 32;
}

server {
    listen 80;
    server_name api.localhost;
    location / {
        proxy_pass http://backend;
    }
}

I tried to trigger the API http://api.localhost/test with two simultaneous requests. However, it seems the second request only started after the first one had completed, even though it was processed by a different container instance.
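One way to check whether two requests really overlap is to time them from the client side. This is a sketch against a hypothetical stand-in server (a `ThreadingHTTPServer` that sleeps 1 s instead of the real 5 s endpoint), not the actual nginx/FastAPI stack; to test the real setup, point `url` at http://api.localhost/test instead:

```python
import threading
import time
import urllib.request
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

class SlowHandler(BaseHTTPRequestHandler):
    """Stand-in for the slow /test endpoint: sleeps, then returns JSON."""
    def do_GET(self):
        time.sleep(1)  # stand-in for the 5 s task
        body = b'{"ok": true}'
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request logging
        pass

def fire(url):
    with urllib.request.urlopen(url) as resp:
        return resp.read()

# Start the stand-in server on a random free port in a background thread.
server = ThreadingHTTPServer(("127.0.0.1", 0), SlowHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_address[1]}/test"

# Fire two requests at the same time and measure total wall-clock time.
start = time.time()
threads = [threading.Thread(target=fire, args=(url,)) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
elapsed = time.time() - start
print(f"elapsed: {elapsed:.1f}s")

server.shutdown()
```

If the two requests are handled concurrently, the total elapsed time stays close to one request's duration (~1 s here); if they are serialized somewhere in the chain, it roughly doubles.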

Is nginx actually blocking? I know I could use queueing (e.g. RabbitMQ), but I thought that spinning up two Docker containers and using nginx as a load balancer would achieve horizontal scaling. Am I wrong?

Arif Oyong
