
I would assume that when reading the chunks, the output would start with 'applesauce', 'blueberry', etc., but instead the items from the iterable that was used to initially create the stream always come last rather than first, even though the pushes happen in the correct order. Can someone explain why the order is the way it is?

import stream from 'node:stream';

const a = stream.Readable.from(['applesauce', 'blueberry', 'muffins']);

a.push('one');
a.push('two');
a.push('three');
a.push('333');
a.push('555');
a.push('777');

for await (const chunk of a) {
  console.log(chunk);
}

Prints:

one
two
three
333
555
777
applesauce
blueberry
muffins

1 Answer


The way I read the implementation of stream.Readable.from(), a readable stream created from an iterable will only start reading from that iterable — and pushing its data onto its internal read queue — once the stream itself is being read from.

So any data pushed onto the queue before reading the stream will precede the items from the iterable.
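To illustrate, here is a rough sketch — not the actual Node.js source — of how a `Readable.from()`-style helper could defer consuming the iterable until a consumer pulls data, which reproduces the ordering from the question:

```javascript
import { Readable } from 'node:stream';

// Hypothetical helper sketching the deferred-read behavior described above.
function fromIterable(iterable) {
  const iterator = iterable[Symbol.iterator]();
  return new Readable({
    objectMode: true,
    // read() is only invoked once a consumer starts pulling data, so
    // anything push()ed beforehand sits ahead of the iterable's items.
    read() {
      const { value, done } = iterator.next();
      this.push(done ? null : value); // push(null) signals end-of-stream
    },
  });
}

const b = fromIterable(['applesauce', 'blueberry', 'muffins']);
b.push('one'); // buffered immediately, before read() ever runs

const chunks = [];
for await (const chunk of b) chunks.push(chunk);
console.log(chunks); // ['one', 'applesauce', 'blueberry', 'muffins']
```

Because `push()` appends directly to the internal buffer while the iterable is only drained on demand, the pushed chunk is already queued by the time the first item from the iterable arrives.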

robertklep
  • Do you know if this would also apply to streams not created using the `.from()` method? For example, with a stream obtained from a request or something else, if `.push()` is called, does it have similar behavior? What you explained makes sense, but it's not necessarily the most intuitive design on node's part :D – purple-hippo Oct 25 '22 at 18:41
  • Scratch that, I guess it's fairly common to have a function that returns a stream, but the stream isn't actually populated on return (it's async), so immediately after the function call, the stream is actually empty, so any sync pushes are actually also added at the beginning of the stream, etc. Anyways, enough rambling, appreciate the help! – purple-hippo Oct 25 '22 at 18:46
  • @purple-hippo I agree it's counter-intuitive. I don't know if it's by design or not. – robertklep Oct 26 '22 at 06:10
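The scenario described in the comments — a function that returns a stream and only fills it asynchronously, so synchronous pushes land at the front — can be sketched as follows. The `makeStream` helper is hypothetical, standing in for any API that populates a stream after returning it:

```javascript
import { Readable } from 'node:stream';
import { setTimeout as sleep } from 'node:timers/promises';

// Hypothetical helper: returns a stream whose data arrives asynchronously.
function makeStream() {
  const s = new Readable({ objectMode: true, read() {} });
  (async () => {
    await sleep(10);          // data only arrives later
    s.push('async-data');
    s.push(null);             // end the stream
  })();
  return s;
}

const s = makeStream();
s.push('sync-data'); // queued before the async fill has run

const chunks = [];
for await (const chunk of s) chunks.push(chunk);
console.log(chunks); // ['sync-data', 'async-data']
```

The synchronous `push()` runs before the async producer gets a chance to, so its chunk ends up first in the queue — the same precedence the answer describes for `Readable.from()`.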