I have an AWS Lambda function which calls a deep learning function on Algorithmia, does some post-processing on the results and then returns some data. Algorithmia provides a Python client, which I am using; it just makes it a little easier to send requests to an algorithm on the Algorithmia platform.
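For context, my handler is roughly shaped like this (the API key, algorithm path and event fields below are placeholders, not my real ones):

```python
import Algorithmia

# Placeholder API key and algorithm path -- not the real ones.
client = Algorithmia.client("ALGORITHMIA_API_KEY")
algo = client.algo("some_user/SomeDeepLearningAlgo/1.0")

def lambda_handler(event, context):
    # Blocking call: the Lambda sits here for the full duration of the
    # Algorithmia request, including any ~30 s cold start on their side.
    result = algo.pipe(event["input"]).result

    # ... post-processing on the result happens here ...
    return {"result": result}
```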
The problem is as follows: when an Algorithmia function hasn't been called for a while it is unloaded, and the first call after that (a cold start) takes a while, possibly 30 seconds. If my Lambda function has to wait 30 seconds for a response whenever it happens to trigger the Algorithmia function from a cold start, that's going to be very expensive and wasteful.
Is there some way to send off an HTTP request in Lambda and, when the request is finished, have the results piped into a new Lambda function, so that a Lambda function isn't left waiting the entire time and wasting resources? I'd expect not, as I'm not sure how that would work in practice. Does anyone have other ideas for how to avoid waiting a long time for a response and wasting Lambda resources?
Edit: In most cases (except, obviously, the ones where the Algorithmia algorithm takes a while to load from a cold start) latency is an issue, and I can't afford to increase it with a workaround such as having the Algorithmia function write its response to S3 (for example) and then triggering a Lambda function from that.
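To be clear, this is the kind of workaround I mean and am trying to avoid: the Algorithmia output (or a wrapper around it) would be written to S3, and a second Lambda would be triggered by the S3 put event, roughly like this (bucket and key layout are just placeholders):

```python
import json
import boto3

s3 = boto3.client("s3")

def handle_algorithmia_result(event, context):
    # Triggered by an S3 ObjectCreated event on the bucket where the
    # Algorithmia output would get written (names are placeholders).
    record = event["Records"][0]
    bucket = record["s3"]["bucket"]["name"]
    key = record["s3"]["object"]["key"]

    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
    result = json.loads(body)

    # ... the post-processing that currently lives in the single Lambda ...
    return {"result": result}
```

The extra S3 write/read and event hop is exactly the added latency I can't afford in the common (warm) case.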