
We tested a FastAPI application served by Gunicorn with 20 workers; the code is below:

import json
from typing import List, Dict
from async_lru import alru_cache
from fastapi import FastAPI
import aioredis

REDIS_HOST: str = "localhost"
app: FastAPI = FastAPI()

@alru_cache
async def get_redis() -> aioredis.Redis:
    """Create (and cache) a single Redis connection pool per worker."""
    return await aioredis.create_redis_pool((REDIS_HOST, 6379), encoding='utf8')

@app.get('/redisConnection')
async def redis_connection():
    redis_conn: aioredis.Redis = await get_redis()  # only acquires the pool, no Redis call
    return None

@app.get('/doOneRedisCall')
async def do_one_redis_call():
    redis_conn: aioredis.Redis = await get_redis()
    data: str = await redis_conn.get('random_key')
    return json.loads(data)

@app.get('/doMultipleRedisCall')
async def do_multiple_redis_call():
    redis: aioredis.Redis = await get_redis()
    redis_key_list: List[str] = ['random_key_1', 'random_key_2', 'random_key_3', 'random_key_4']
    response: List[Dict] = []
    for key in redis_key_list:  # making 4 get calls to redis
        data: str = await redis.get(key)
        response.append(json.loads(data))
    return response
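
The loop above awaits each GET one after another, so the endpoint pays four full round trips per request. One thing we considered (a sketch only, not benchmarked against this setup; `fetch_keys_concurrently` is a hypothetical helper, not part of the app above) is issuing the GETs concurrently with `asyncio.gather`:

```python
import asyncio
import json
from typing import Any, Dict, List

async def fetch_keys_concurrently(redis: Any, keys: List[str]) -> List[Dict]:
    """Issue all GETs at once so total latency is roughly one round trip.

    `redis` is assumed to be an aioredis 1.x client whose get() is awaitable.
    """
    raw = await asyncio.gather(*(redis.get(k) for k in keys))
    # Skip keys that were missing (GET returns None for those).
    return [json.loads(d) for d in raw if d is not None]
```
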

We run the application with the following command:
gunicorn -w 20 -k uvicorn.workers.UvicornWorker --bind 0.0.0.0:80 main:app

System configuration:
CPU: 8 cores
RAM: 64 GB
CPU clock: 2700.516 MHz


  • /redisConnection sustained up to 7000 requests per second
  • /doOneRedisCall sustained up to 4700 requests per second
  • /doMultipleRedisCall sustained up to 1700 requests per second

The more Redis calls an endpoint makes, the lower the request rate the machine can handle. Which Gunicorn parameters could we change so that it handles a higher request rate?
We tried increasing:
  • worker-connections from 1000 (default) to 10000
  • keep-alive from 2 (default) to 10 seconds
  • backlog from 2048 (default) to 100000
There was no improvement in request rate. Is there anything I'm missing here?

  • Have you profiled your program to see where it's spending most of its time? If you haven't, do that first. Then you can start optimizing those bottlenecks. (That said, I'm not sure it's worth optimizing a toy program like this.) – AKX Jul 30 '21 at 11:45
  • Also: sounds like `doMultipleRedisCall` should use [`MGET`](https://redis.io/commands/mget) to get multiple values. – AKX Jul 30 '21 at 11:45
  • And now that I'm at it: there are faster JSON libraries than the built-in `json` out there; see e.g. [`orjson`](https://github.com/ijl/orjson). – AKX Jul 30 '21 at 11:46
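
Following AKX's MGET suggestion, the four GETs in /doMultipleRedisCall could collapse into one round trip. A minimal sketch, assuming an aioredis 1.x client whose `mget(key, *keys)` returns the values in key order (`fetch_many` is an illustrative name, not from the code above):

```python
import asyncio
import json
from typing import Any, Dict, List

async def fetch_many(redis: Any, keys: List[str]) -> List[Dict]:
    """Fetch several keys in one MGET command instead of N GETs."""
    values = await redis.mget(*keys)  # single round trip to Redis
    # MGET yields None for missing keys; drop those before decoding.
    return [json.loads(v) for v in values if v is not None]
```
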
