
I have a Python web application that returns some data from a DB. But with 100 concurrent users it gets much slower. At first I thought it was a DB issue, but everything looks fine on the DB side. So I removed all the DB-related code; it's now just an aiohttp service by itself.

import asyncio
import asyncpg
from aiohttp import web
import os
import logging
import sys
import time

async def handle(request):
    return web.Response(text="hi")

async def init_app():
    """Initialize the application server."""
    app = web.Application()

    # Configure service routes
    app.router.add_route('GET', '/', handle)
    return app

logging.basicConfig(
    level=logging.DEBUG,
    format='%(levelname)7s: %(message)s',
    stream=sys.stderr,
)
LOG = logging.getLogger('')

loop = asyncio.get_event_loop()
app = loop.run_until_complete(init_app())
web.run_app(app)

Now, if I run one user, the response time is 2 ms. But if I put 100 users on it, the response time is 295 ms. Is this the nature of aiohttp?

rickcoup

1 Answer


I think your server spends most of its time printing incoming requests to stderr.

Logging to the console is quite expensive; logging to a file or over the network is relatively cheap.

Andrew Svetlov
  • So I removed all the logging. With one user, the average response time is still 2 ms; with 100 concurrent users it is 103 ms. That's a big improvement, but there is still about 100 ms of overhead under load. Is there anything else that can be done? – rickcoup Aug 23 '19 at 15:47