I have set up a FastAPI REST API behind AWS API Gateway and am trying to use the BackgroundTasks feature: do some work, create a unique token, schedule a background task, and return to the client immediately. This works great locally using Postman and VS Code, but when deployed, API Gateway seems to wait for the Lambda to finish completely and returns 504 gateway timeouts even though my main thread has already returned. I'm not sure whether it's my version of FastAPI, Starlette, or Mangum, the Lambda lifecycle, or something else causing this.
I've read both (a) that it's impossible with API Gateway and (b) that it might be possible. I can't use the async invocation approach because I need to generate a unique token and do some work first. The only alternatives I can think of are calling a second Lambda with my payload, Step Functions, or SQS.
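If it comes to that, I assume the second-Lambda option would look roughly like the sketch below (the worker function name and payload shape are placeholders, not my real setup): generate the token synchronously, then fire an async invoke (InvocationType="Event") so the worker Lambda runs on its own while this request returns.

import json
import uuid

import boto3

lambda_client = boto3.client("lambda")

def kick_off_report(report_data: dict) -> str:
    token = str(uuid.uuid4())  # unique token to hand back to the client
    lambda_client.invoke(
        FunctionName="report-worker",   # placeholder worker Lambda name
        InvocationType="Event",         # async: returns once the event is queued
        Payload=json.dumps({"token": token, "report_data": report_data}),
    )
    return token

But I'd rather avoid a second Lambda if the single-Lambda approach can work.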
Has anyone tried what I'd consider the typical solution here: a single Lambda that pre-processes, spawns a thread-pool worker, and returns a token response?
For testing I put a sleep in the worker thread, so I can pass in a timer value to wait before it continues processing the report.
The FastAPI docs say a non-async task function is run with run_in_threadpool(), which is what I thought would make this work.
Locally the worker does not block the main thread, so I'm wondering whether this is a function/design of API Gateway, because in theory it should work. I just want to return the response and let the Lambda keep running (I know this is not ideal).
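For reference, this is a stripped-down sketch of the pattern I mean (the endpoint, task, and route names here are placeholders, not my actual app): the endpoint schedules a plain non-async task, returns a token immediately, and Mangum adapts the ASGI app for Lambda/API Gateway. Run locally with uvicorn, the response comes back right away and the task finishes in the threadpool afterwards.

import time
import uuid

from fastapi import BackgroundTasks, FastAPI
from mangum import Mangum

app = FastAPI()

def run_report(token: str) -> None:
    # stand-in for the real report work
    time.sleep(30)

@app.post("/report")
def create_report(background_tasks: BackgroundTasks):
    token = str(uuid.uuid4())                     # unique token for the client
    background_tasks.add_task(run_report, token)  # sync function -> run_in_threadpool
    return {"token": token}

handler = Mangum(app)  # Lambda entry point

And here is the relevant part of my actual report class: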
def do_report(self):
    if self.no_wait:
        # schedule the heavy work as a background task and return immediately
        self.background_tasks.add_task(self._execute)
    else:
        # run synchronously and return the result
        return self._execute()

def _execute(self):
    try:
        # for testing purposes only, need to remove later:
        # optional sleep so I can confirm the response comes back first
        if self.no_wait and self.murphy:
            time.sleep(self.murphy)
        v_data = self.load_report_data(self.state_object.report_data)
        ...