I'm looking for suggestions on how to perform non-buffered, large-file (gigabyte+) uploads from an externally facing Angular app to an externally facing ASP.NET 5.0 Web API endpoint, and then on to an internally facing ASP.NET Web API endpoint. I have the first hop from Angular to the externally facing API working fine using guidance from Upload files in ASP.NET Core, but that only provides half of the solution. I'm able to upload a 2 GB file to the first endpoint without consuming too much memory (I am seeing ~300 MB consumed, which is somewhat concerning, but I'll deal with that later). However, when I try to stream to the internal endpoint and process the stream to write it to disk, I get "Unexpected end of Stream, the content may have already been read by another component."
Essentially, I'm looking for a way to have the DMZ act as a pass-through or proxy of sorts so that I can stream and store the file safely to an internal file store without exhausting server memory resources.
Below are the C#, HTML, and TypeScript for my Angular-to-DMZ-endpoint flow -- I'm at a loss where to go from there. Any suggestions on how I can make this work without consuming too much memory or CPU?
My client code is quite simple at the moment.
HTML
<div class="row" style="margin-bottom:15px;">
  <div class="col-md-3">
    <input type="file" #file placeholder="Choose file"
           (change)="uploadFile(file.files)"
           style="display:none;" multiple>
    <button type="button" class="btn btn-success"
            (click)="file.click()">Upload File</button>
  </div>
  <div class="col-md-4">
    <span class="upload" *ngIf="progress > 0">
      {{ progress }}%
    </span>
    <span class="upload" *ngIf="message">
      {{ message }}
    </span>
  </div>
</div>
Typescript
public uploadFile = (files) => {
  if (files.length === 0) {
    return;
  }
  const filesToUpload: File[] = files;
  const formData = new FormData();
  Array.from(filesToUpload).forEach((file, index) => {
    formData.append('file' + index, file, file.name);
  });
  this.http.post('https://localhost:5001/api/upload/uploadFileToInternalAPI', formData,
    { reportProgress: true, observe: 'events' })
    .subscribe(event => {
      if (event.type === HttpEventType.UploadProgress) {
        this.progress = Math.round(100 * event.loaded / event.total);
      } else if (event.type === HttpEventType.Response) {
        this.message = 'Upload success.';
        this.onUploadFinished.emit(event.body);
      }
    });
}
C#
Here is my DMZ endpoint C# code. Note that I'm attempting to support multiple files in a single request.
[HttpPost("uploadFileToInternalAPI")]
[DisableFormValueModelBinding] // Passing no parameters to the method essentially does the same thing as this attribute
[RequestSizeLimit(MaxFileSize)]
[RequestFormLimits(MultipartBodyLengthLimit = MaxFileSize)]
public async Task<IActionResult> UploadFileToInternalAPI()
{
    var request = HttpContext.Request;

    // Validate the Content-Type
    if (!request.HasFormContentType ||
        !MediaTypeHeaderValue.TryParse(request.ContentType, out var mediaTypeHeader) ||
        string.IsNullOrEmpty(mediaTypeHeader.Boundary.Value))
    {
        return new UnsupportedMediaTypeResult();
    }

    // Set up to read the first section (file) from the request
    var reader = new MultipartReader(mediaTypeHeader.Boundary.Value, request.Body);
    var section = await reader.ReadNextSectionAsync();

    // Set up MultipartFormDataContent to post to the internal API
    var forwardingContent = new MultipartFormDataContent();

    // Loop over each section (file) in the multipart request,
    // building up a multipart request to forward to the internal API
    while (section != null)
    {
        var hasContentDispositionHeader = ContentDispositionHeaderValue.TryParse(section.ContentDisposition,
            out var contentDisposition);
        if (hasContentDispositionHeader && contentDisposition.DispositionType.Equals("form-data") &&
            !string.IsNullOrEmpty(contentDisposition.FileName.Value))
        {
            // Get the filename from the section
            var fileName = contentDisposition.FileName.Value.Trim('"');
            // StreamContent allows passing a non-buffered stream
            forwardingContent.Add(new StreamContent(section.Body), "file", fileName);
        }
        section = await reader.ReadNextSectionAsync();
    }

    // For example only -- don't create an HttpClient like this in production
    var client = new HttpClient { BaseAddress = new Uri("http://localhost:5002") };
    client.DefaultRequestHeaders.Accept.Clear();

    // *** Here's where things break down - I'm not sure how to
    // forward the request onto the backend without consuming
    // the stream content.
    var response = await client.PostAsync("/internalUpload/upload", forwardingContent);
    if (response.IsSuccessStatusCode)
    {
        return Ok();
    }

    // If execution reaches this point, no files were saved
    return BadRequest("No files data in the request.");
}
Internal API C#
My internal API is essentially the same C#, but instead of posting to another endpoint it writes the stream to a file share on disk.
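For reference, here is a stripped-down sketch of what that internal endpoint looks like. The route, `MaxFileSize` constant, and the UNC path are placeholders; the multipart loop mirrors the DMZ code above, except that each section is copied to a `FileStream` (so nothing is buffered in memory) instead of being forwarded:

```csharp
[HttpPost("upload")]
[DisableFormValueModelBinding]
[RequestSizeLimit(MaxFileSize)]
[RequestFormLimits(MultipartBodyLengthLimit = MaxFileSize)]
public async Task<IActionResult> Upload()
{
    var request = HttpContext.Request;

    if (!request.HasFormContentType ||
        !MediaTypeHeaderValue.TryParse(request.ContentType, out var mediaTypeHeader) ||
        string.IsNullOrEmpty(mediaTypeHeader.Boundary.Value))
    {
        return new UnsupportedMediaTypeResult();
    }

    var reader = new MultipartReader(mediaTypeHeader.Boundary.Value, request.Body);
    var section = await reader.ReadNextSectionAsync();
    var savedAny = false;

    while (section != null)
    {
        if (ContentDispositionHeaderValue.TryParse(section.ContentDisposition, out var contentDisposition) &&
            contentDisposition.DispositionType.Equals("form-data") &&
            !string.IsNullOrEmpty(contentDisposition.FileName.Value))
        {
            // GetFileName strips any path the client included
            var fileName = Path.GetFileName(contentDisposition.FileName.Value.Trim('"'));

            // Placeholder path -- the real target is the internal file share
            var targetPath = Path.Combine(@"\\fileserver\uploads", fileName);

            // CopyToAsync streams the section body to disk in small
            // buffered chunks, so the file is never held in memory
            await using (var targetStream = System.IO.File.Create(targetPath))
            {
                await section.Body.CopyToAsync(targetStream);
            }
            savedAny = true;
        }
        section = await reader.ReadNextSectionAsync();
    }

    return savedAny ? Ok() : BadRequest("No files data in the request.");
}
```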