
I am writing a component to stream a large (4+ GB) file from an HTTP service. The component takes a URL and a destination stream. The destination stream could be a FileStream, a stream that POSTs to a different HTTP service, or both. As the author of the component, I need to repeat these steps until I'm done:

  1. read a reasonable-size buffer from the HTTP stream,
  2. write this buffer to the destination stream,
  3. flush the destination stream (out to disk, network, etc)

I should never have more than the size of the buffer of data in memory.
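The loop described above can be sketched as follows. This is a minimal illustration, not part of the original question: `sourceStream`, `destinationStream`, and the 80 KB buffer size are assumptions.

```csharp
// Hypothetical sketch of the read/write/flush loop described above.
// sourceStream and destinationStream are assumed to be supplied by the caller.
var buffer = new byte[81920]; // a "reasonable-size" buffer; 80 KB is the BCL's default copy size
int bytesRead;
while ((bytesRead = await sourceStream.ReadAsync(buffer, 0, buffer.Length)) > 0)
{
    // 2. write this buffer to the destination stream
    await destinationStream.WriteAsync(buffer, 0, bytesRead);
    // 3. flush the destination stream (out to disk, network, etc.)
    await destinationStream.FlushAsync();
}
```

With this pattern, at most one buffer's worth of data is held in memory at a time, provided the source stream is the actual network stream and not a pre-buffered copy.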

I am using Flurl to make my HTTP calls to the server. I've tried the following ways to make my call:

var stream = await flurlClient.GetStreamAsync();

This gives me back a MemoryStream, which doesn't work as it'll fill up and take up as much memory as the size of the file.

var response = await flurlClient.GetAsync();
var stream = await response.Content.ReadAsStreamAsync();

Again, a memory stream.

var response = await flurlClient.GetAsync();
var stream = new CustomFlushingStream();
await response.Content.CopyToAsync(stream);

This one looks promising, but alas, it tries to write the entire thing in a single Write() call, because the response body has already been buffered by the time CopyToAsync runs.

How can I accomplish this task without blowing up my memory? I'd prefer to use flurl, but I'm not tied to it.

afeygin

1 Answer


After doing some digging, I found that the following code solves my problem:

var response = await flurlClient.SendAsync(
    HttpMethod.Get, null, null, HttpCompletionOption.ResponseHeadersRead);
var stream = await response.Content.ReadAsStreamAsync();

In this case, the stream that comes back is no longer a MemoryStream but the underlying network stream: `HttpCompletionOption.ResponseHeadersRead` tells the client to return as soon as the headers arrive, rather than buffering the whole body first.
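Put together with a manual copy loop, a complete sketch might look like this. It assumes the Flurl-era `SendAsync` overload shown in the answer; `destinationStream` and the buffer size are illustrative, not from the original.

```csharp
// Sketch: stream the response body chunk-by-chunk without buffering it in memory.
// Assumes the SendAsync overload used in the answer above;
// destinationStream is assumed to be supplied by the caller.
var response = await flurlClient.SendAsync(
    HttpMethod.Get, null, null, HttpCompletionOption.ResponseHeadersRead);

using (var httpStream = await response.Content.ReadAsStreamAsync())
{
    var buffer = new byte[81920];
    int bytesRead;
    while ((bytesRead = await httpStream.ReadAsync(buffer, 0, buffer.Length)) > 0)
    {
        await destinationStream.WriteAsync(buffer, 0, bytesRead);
        await destinationStream.FlushAsync(); // push each chunk out to disk/network
    }
}
```

If per-chunk flushing isn't needed, `httpStream.CopyToAsync(destinationStream, 81920)` does the same copy, though it only flushes at the end.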

afeygin
    Yep, that's what you want to do. And it's what Flurl's [DownloadFileAsync](https://github.com/tmenier/Flurl/blob/master/src/Flurl.Http/DownloadExtensions.cs#L24) does internally. – Todd Menier Oct 21 '17 at 21:47
  • For future readers, it is the `HttpCompletionOption.ResponseHeadersRead` that gives the magic here. – Ruskin May 05 '22 at 11:33