Requirements:
A web service that supports kicking off the training function of a machine learning model (training takes around 4 hours to complete) and returns success right away.
It should also support a predict function on previously trained models.
Both of the above functions should run in parallel, in a non-blocking way.
We were able to achieve this by creating a task queue with Celery and pushing the training function onto the queue, but I wanted to know if there are better methods. A rough sketch of that setup is shown below.
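This is only an illustration of what I mean by the Celery approach; the module name, broker URL, and the delay() call site are assumptions, not our exact code:

# tasks.py (hypothetical module name)
from celery import Celery

celery_app = Celery('tasks', broker='redis://localhost:6379/0')  # broker URL is an assumption

@celery_app.task
def training_job():
    # the ~4 hour model training runs inside a Celery worker process,
    # so the web process stays free to serve predict requests
    ...

# in the web handler we just enqueue the job and return success immediately:
# training_job.delay()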
I looked for async web service modules and found aiohttp. I wrote the sample code below, but it seems that if I trigger the run_job function, the predict function gets blocked.
from aiohttp import web

async def training_job():
    # stand-in for the long training loop; it never awaits anything,
    # so the event loop cannot run anything else while it executes
    for i in range(100100):
        print(i)
    return i

async def predict(request):
    ## some logic
    text = "Value after logic"
    return web.Response(text=text)

async def run_job(request):
    result = await training_job()
    return web.Response(text="Done")

app = web.Application()
app.add_routes([web.get('/', predict),
                web.get('/run_job', run_job)])

web.run_app(app)
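To show what I mean by blocked: as far as I can tell, run_job awaits training_job, which never yields control back to the event loop, so the whole server is stuck until the loop finishes. Below is a minimal sketch of the same handlers with the work handed off to a process pool instead; this is only an idea I am considering (untested beyond this toy loop, and the fire-and-forget call is an assumption about how the job would be tracked):

import asyncio
from concurrent.futures import ProcessPoolExecutor

from aiohttp import web

executor = ProcessPoolExecutor(max_workers=1)

def training_job():
    # plain (non-async) CPU-bound work, executed in a separate process
    for i in range(100100):
        print(i)
    return i

async def predict(request):
    ## some logic
    text = "Value after logic"
    return web.Response(text=text)

async def run_job(request):
    loop = asyncio.get_running_loop()
    # schedule the blocking work on the process pool and respond immediately,
    # instead of awaiting the 4-hour job inside the request handler
    loop.run_in_executor(executor, training_job)
    return web.Response(text="Job started")

app = web.Application()
app.add_routes([web.get('/', predict),
                web.get('/run_job', run_job)])

if __name__ == '__main__':
    web.run_app(app)

Here run_in_executor returns a future that I am simply discarding; in a real setup we would presumably need to keep it (or some job id) around to report status and failures, which is part of why I am asking whether the Celery queue is still the better method.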