Say we have a simple node.js transform stream:
import stream from 'stream';

export class JSONParser extends stream.Transform {
  constructor() {
    super({objectMode: true});
  }
}
I want to process items synchronously for a while and then delay the remainder until a later tick. Something like this:
import stream from 'stream';

export class JSONParser extends stream.Transform {
  count = 0;

  constructor() {
    super({objectMode: true});
  }

  _transform(chunk, encoding, cb) {
    // modify() is the application-specific per-chunk processing (not shown)
    const modifiedChunk = this.modify(chunk);
    // Roughly once every 55 chunks, defer the push (and the callback) to the next tick
    if (this.count++ % 55 === 0) {
      process.nextTick(() => {
        this.push(modifiedChunk);
        cb();
      });
      return;
    }
    this.push(modifiedChunk);
    cb();
  }
}
In theory this means that roughly once every 55 items, the stream will wait until the next tick before processing the remaining items. Questions:
Will this indeed delay processing of all remaining items, or just this one chunk? Will it preserve the order of the chunks that get pushed?
I believe a token bucket algorithm can do rate limiting, and maybe that's a better way to achieve a stream that doesn't block the event loop?
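For reference, this is the kind of thing I have in mind. It is only a sketch: the class name, the rate/capacity parameters, and the timing math are placeholders I made up, not an existing API, and the push would presumably call the same modify() step as above.

import stream from 'stream';

// Sketch of a token-bucket transform (illustrative names and defaults)
export class RateLimitedParser extends stream.Transform {
  constructor({ rate = 55, capacity = 55 } = {}) {
    super({objectMode: true});
    this.rate = rate;           // tokens added per second
    this.capacity = capacity;   // maximum bucket size
    this.tokens = capacity;
    this.lastRefill = Date.now();
  }

  _refill() {
    const now = Date.now();
    const elapsedSeconds = (now - this.lastRefill) / 1000;
    this.tokens = Math.min(this.capacity, this.tokens + elapsedSeconds * this.rate);
    this.lastRefill = now;
  }

  _transform(chunk, encoding, cb) {
    this._refill();
    if (this.tokens >= 1) {
      // Fast path: a token is available, process synchronously
      this.tokens -= 1;
      this.push(chunk);         // e.g. this.push(this.modify(chunk))
      cb();
      return;
    }
    // Bucket is empty: wait until roughly one token has accumulated.
    // Deferring cb() is what holds back the rest of the pipeline,
    // and it also keeps chunk order intact.
    const waitMs = Math.ceil(((1 - this.tokens) / this.rate) * 1000);
    setTimeout(() => {
      this._refill();
      this.tokens = Math.max(0, this.tokens - 1);
      this.push(chunk);
      cb();
    }, waitMs);
  }
}

The idea here, if I understand it correctly, is that delaying the callback itself (rather than just the push) is what spaces the work out, but I'm not sure whether this is preferable to the nextTick approach above.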