10

I'm trying to download a 10GB file, but only 4GB get saved to disk, and memory is growing a lot.

const res = await fetch('https://speed.hetzner.de/10GB.bin');
const file = await Deno.open('./10gb.bin', { create: true, write: true });

const ab = new Uint8Array(await res.arrayBuffer());
await Deno.writeAll(file, ab);
Juan Carlos

2 Answers

19

You're buffering the entire response in memory with res.arrayBuffer(); that's why memory usage keeps growing.


Deno.open now returns an FsFile, which exposes a WritableStream via its .writable property, so you can pipe the response body straight to it.

const res = await fetch('https://speed.hetzner.de/10GB.bin');
const file = await Deno.open('./10gb.bin', { create: true, write: true })

await res.body.pipeTo(file.writable);
file.close();

If you want to do something other than write to a file, res.body is a ReadableStream, so you can iterate over it asynchronously:

for await (const chunk of res.body) {
   // do something with each chunk
}
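
For instance, here's a minimal sketch (reusing the URL and filename from the question) that counts bytes with a pass-through TransformStream while they stream to disk, so the body is never held in memory all at once:

const res = await fetch('https://speed.hetzner.de/10GB.bin');
const file = await Deno.open('./10gb.bin', { create: true, write: true });

let received = 0;
const progress = new TransformStream<Uint8Array, Uint8Array>({
  transform(chunk, controller) {
    received += chunk.byteLength; // each chunk is a Uint8Array
    controller.enqueue(chunk);    // pass it through unchanged
  },
});

// pipeTo closes file.writable (and the file) once the body ends,
// so no explicit file.close() is needed here.
await res.body!.pipeThrough(progress).pipeTo(file.writable);
console.log(`finished: ${received} bytes written`);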

As for why it stops at 4 GB, I'm not sure, but it may be related to ArrayBuffer / Uint8Array size limits: 4 GB is 2³² bytes, which is the maximum TypedArray length in most runtimes.
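
If you want to see where your own runtime draws the line, a quick probe like this (just an illustration; the exact cap varies by engine and version) will typically either allocate the buffer or throw a RangeError:

// Try to allocate exactly 2 ** 32 bytes (4 GiB); many engines cap TypedArray
// length just below this, in which case a RangeError is thrown.
try {
  const buf = new Uint8Array(2 ** 32);
  console.log('allocated', buf.byteLength, 'bytes');
} catch (err) {
  console.log('allocation failed:', String(err));
}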


Updated my answer for the latest Deno version.

Marcos Casagrande
  • Thanks for the answer, it'll take a while to download the file to confirm it's working :P – Juan Carlos May 21 '20 at 23:38
  • With this approach all the files I get are just 2 KB. Code here: https://jsfiddle.net/1t6385aw/ –  Jul 11 '21 at 07:25
1

Here is another, shorter version.

const res = await fetch('https://speed.hetzner.de/10GB.bin');
const file = await Deno.open('./10gb.bin', { create: true, write: true });

await res.body?.pipeTo(file.writable);
file.close();
bigsan