I am building a FastAPI server for an image classification model. The API takes in image URLs and downloads them concurrently using httpx. I want to limit the number of concurrent downloads across the whole server using an asyncio.Semaphore. I tried creating the semaphore as a classifier attribute, since I need it shared among all requests, but I get this error:

got Future <Future pending created at /usr/lib/python3.9/asyncio/base_events.py:424> attached to a different loop
Here is my code:
# classifier.py
import asyncio

import httpx


class Classifier:
    def __init__(self, concurrency_limit) -> None:
        self.client = httpx.AsyncClient()
        self.semaphore = asyncio.Semaphore(concurrency_limit)

    async def download_async(self, url):
        async with self.semaphore:
            response = await self.client.get(url)
            return await response.aread()

    async def run(self, image_urls):
        image_list = await asyncio.gather(
            *[self.download_async(url) for url in image_urls]
        )
        # Infer images
        pass
# api.py
from fastapi import FastAPI

from classifier import Classifier

server = FastAPI()
classifier = Classifier(concurrency_limit=5)


@server.post("/")
async def index(urls: list[str]):
    results = await classifier.run(urls)
    return results
Answers online suggest creating the semaphore inside a running event loop, but if I create it inside run() (roughly as in the sketch below), a new semaphore is created for every request, i.e. each request gets its own limit of 5 concurrent downloads rather than a server-wide limit. How do I use the same semaphore across all requests?
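
For reference, here is a minimal sketch of that per-request attempt, assuming concurrency_limit is also stored on the instance (that attribute isn't in the code above, it's just for illustration):

    async def run(self, image_urls):
        # Created on the running event loop, so no "different loop" error,
        # but a brand-new semaphore exists for every request
        semaphore = asyncio.Semaphore(self.concurrency_limit)

        async def download(url):
            async with semaphore:
                response = await self.client.get(url)
                return await response.aread()

        image_list = await asyncio.gather(*[download(url) for url in image_urls])
        # Infer images
        pass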