
I am working on a Flask application that talks to Microsoft SQL Server using the pypyodbc library. Several pages require a long database query before they can load, and we ran into the problem that no other requests are served while Python is waiting for the database to respond.

So far our attempt at running the queries asynchronously is captured in this test case:

from aiohttp import web
from multiprocessing.pool import ThreadPool

import asyncio
import pypyodbc

pool = ThreadPool(processes=1)

def query_db(query, args=(), one=False):
    # Blocking pypyodbc call; CONNECTION_STRING is defined elsewhere.
    conn = pypyodbc.connect(CONNECTION_STRING)
    cur = conn.cursor()
    cur.execute(query, args)
    result = cur.fetchall()
    conn.commit()
    return result

def get(data):
    # Base query
    query = "..."  # query that takes ~10 seconds

    result = query_db(query, [])

    return result

async def slow(request):
    loop = asyncio.get_event_loop()
    #result = loop.run_in_executor(None, get, [])
    result = pool.apply_async(get, (1,))
    x = result.get()  # blocks here until the query finishes

    return web.Response(text="slow")


async def fast(request):
    return web.Response(text="fast")

if __name__ == '__main__':
    app = web.Application()
    app.router.add_get('/slow', slow)
    app.router.add_get('/fast', fast)

    web.run_app(app, host='127.0.0.1', port=5561)

This did not work: requesting the slow page and then the fast page still left the fast page waiting until the slow one had finished loading.
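For reference, the commented-out run_in_executor line only helps if the handler is a coroutine that awaits the returned future instead of blocking on pool.apply_async(...).get(). A minimal sketch of that variant, reusing get from the test case above (the executor object and worker count are illustrative, not part of the original code):

import asyncio
from concurrent.futures import ThreadPoolExecutor

from aiohttp import web

executor = ThreadPoolExecutor(max_workers=4)  # illustrative pool size

async def slow(request):
    loop = asyncio.get_event_loop()
    # get() runs in a worker thread; awaiting the future hands control back
    # to the event loop, so /fast can be served while the query is running.
    result = await loop.run_in_executor(executor, get, 1)
    return web.Response(text="slow")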

I tried to find alternatives, but could not find a solution that works and fits our environment:

  • aiopg.sa can do asynchronous queries, but switching from SQL Server to PostgreSQL is not an option
  • uWSGI seems to be usable with Flask to support multiple threads, but it cannot be pip-installed on Windows
  • Celery seems similar to our current approach, but it would need a message broker, which is non-trivial to set up on our system

1 Answer


It sounds like the issue isn’t the Flask application itself but rather your WSGI server. Flask is designed to handle one request at a time. To serve the app so that multiple people can hit it simultaneously, configure the WSGI server to use more workers. If a request hits the server while your long query is running, another worker spins up an instance of the app and serves that request. This is easy to set up in IIS.
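As one concrete illustration of the multi-worker idea (waitress is not mentioned in the answer; it is simply a WSGI server that pip-installs on Windows and serves requests on a thread pool), something along these lines would let a slow query in one request proceed without blocking the others:

from waitress import serve

from myapp import app  # hypothetical module holding the Flask app object

# Each request is handled on one of the pool's threads, so a ~10 second
# query ties up only one thread instead of the whole server.
serve(app, host='127.0.0.1', port=5561, threads=8)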

Of course, if you have four workers and four clients all trigger the long query at the same time, you’re back in the same situation. If that will happen frequently, you can assign more workers or move to an async framework such as Quart or Sanic.
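For a sense of what the Quart style looks like, a sketch only: query_db is carried over from the question, everything else here is illustrative.

import asyncio

from quart import Quart

app = Quart(__name__)

@app.route('/slow')
async def slow():
    loop = asyncio.get_event_loop()
    # The blocking pypyodbc call still runs in a worker thread, but awaiting
    # it keeps the event loop free to serve other requests in the meantime.
    result = await loop.run_in_executor(None, query_db, "...")  # "..." = the long query
    return "slow"

@app.route('/fast')
async def fast():
    return "fast"

if __name__ == '__main__':
    app.run(host='127.0.0.1', port=5561)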

The approach you’ve outlined above should still help with the long-running query, though. But Flask itself is not designed to await anything: a view function holds its thread until it has finished.

More details in this answer: https://stackoverflow.com/a/19411051/5093960
