I am working on a Flask application that talks to Microsoft SQL Server using the pypyodbc library. Several pages need a long database query to load, and we ran into the problem that while Python waits for the database to respond, no other requests are served.
So far, our attempt at running the queries asynchronously is captured in this test case:
from aiohttp import web
from multiprocessing.pool import ThreadPool
import asyncio
import pypyodbc

pool = ThreadPool(processes=1)

def query_db(query, args=(), one=False):
    # CONNECTION_STRING is defined elsewhere in our config
    conn = pypyodbc.connect(CONNECTION_STRING)
    cur = conn.cursor()
    cur.execute(query, args)
    result = cur.fetchall()
    conn.commit()
    return result

def get(data):
    # Base query
    query = "..."  # query that takes ~10 seconds
    result = query_db(query, [])
    return result

async def slow(request):
    loop = asyncio.get_event_loop()
    #result = loop.run_in_executor(None, get, [])
    result = pool.apply_async(get, (1,))
    x = result.get()
    return web.Response(text="slow")

async def fast(request):
    return web.Response(text="fast")

if __name__ == '__main__':
    app = web.Application()
    app.router.add_get('/slow', slow)
    app.router.add_get('/fast', fast)
    web.run_app(app, host='127.0.0.1', port=5561)
This did not work: requesting the slow page and then the fast page still left the fast page waiting until the slow one had finished.
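For context, here is our understanding of how `run_in_executor` is supposed to behave, reduced to a minimal script outside aiohttp. The names `blocking_query`, `slow`, and `fast` and the `time.sleep` call are stand-ins for illustration, not our real code; the point is that awaiting the executor future yields to the event loop, whereas our handler calls the blocking `result.get()` on the pool:

```python
import asyncio
import time
from concurrent.futures import ThreadPoolExecutor

executor = ThreadPoolExecutor(max_workers=1)

def blocking_query():
    # Stand-in for the ~10 second pypyodbc query (shortened here).
    time.sleep(0.5)
    return "rows"

async def slow():
    loop = asyncio.get_running_loop()
    # Awaiting the executor future suspends this coroutine and lets
    # the event loop run other coroutines while the thread blocks.
    return await loop.run_in_executor(executor, blocking_query)

async def fast():
    return "fast"

async def main():
    start = time.monotonic()
    # fast() completes immediately even though slow() is still
    # waiting on its worker thread.
    results = await asyncio.gather(slow(), fast())
    return results, time.monotonic() - start

if __name__ == "__main__":
    results, elapsed = asyncio.run(main())
    print(results)  # both coroutines ran concurrently
```

If this understanding is right, the question is why the same pattern does not keep `/fast` responsive in the aiohttp test case above.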
I tried to find alternatives, but could not find a solution that works and fits our environment:
- aiopg.sa can do asynchronous queries, but switching from SQL Server to PostgreSQL is not an option
- uWSGI can apparently run Flask with multiple threads, but it cannot be pip-installed on Windows
- Celery seems similar to our current approach, but it would need a message broker, which is non-trivial to set up on our system