I am using Azure Functions, and have stumbled upon an issue.
When requesting a large amount of data from an external source, the stream appears to be shut off. I have simplified the example as far as I can below; it silently fails and returns a 500. By silent I mean there are no errors in Application Insights that I can see, and no way to diagnose the issue.
This works locally as an Azure Function.
Is there some sort of limit on the amount of data that can be read? It doesn't take much memory (70 MB or so locally). I've been banging my head against a wall for the last two days on this one. Any help appreciated!
[FunctionName("FeedDownloads")]
public HttpResponseMessage Run([HttpTrigger(AuthorizationLevel.Anonymous, "get" )]HttpRequestMessage req,
ILogger log)//, [FromQuery]string format = "", [FromQuery]bool debug = false)
{
System.Net.HttpWebRequest webRequest = (System.Net.HttpWebRequest)System.Net.WebRequest.Create("{large 1.5GB gzipped file}");
webRequest.AutomaticDecompression = System.Net.DecompressionMethods.GZip;
var webRequestResponse = webRequest.GetResponse();
var res = req.CreateResponse(HttpStatusCode.OK);
res.Content = new StreamContent(webRequestResponse.GetResponseStream());
res.Content.Headers.ContentType = new MediaTypeHeaderValue(webRequestResponse.ContentType);
return res;
}
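For reference, the same proxy can also be sketched with HttpClient, requesting HttpCompletionOption.ResponseHeadersRead so the body is streamed rather than buffered. This is only a minimal sketch (the function name, handler setup, and placeholder URL are illustrative, and I haven't verified it behaves any differently on Azure); I'm including it in case the older HttpWebRequest API turns out to be a factor.

[FunctionName("FeedDownloadsHttpClient")]
public async Task<HttpResponseMessage> Run(
    [HttpTrigger(AuthorizationLevel.Anonymous, "get")] HttpRequestMessage req,
    ILogger log)
{
    // Hypothetical alternative using HttpClient with automatic gzip decompression.
    var handler = new System.Net.Http.HttpClientHandler
    {
        AutomaticDecompression = System.Net.DecompressionMethods.GZip
    };
    var client = new System.Net.Http.HttpClient(handler);

    // ResponseHeadersRead returns as soon as the headers arrive, so the body
    // is streamed to the caller instead of being buffered in memory.
    var upstream = await client.GetAsync(
        "{large 1.5GB gzipped file}",
        System.Net.Http.HttpCompletionOption.ResponseHeadersRead);

    var res = req.CreateResponse(HttpStatusCode.OK);
    res.Content = new StreamContent(await upstream.Content.ReadAsStreamAsync());
    res.Content.Headers.ContentType = upstream.Content.Headers.ContentType;
    return res;
}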