
Given the following JavaScript pseudo code (example 1):
as you can see, there are three async streams, each of which writes to the response. Because they write asynchronously, the order of the chunks is not preserved (it is, in fact, unpredictable).

import pre from './pre';
import content from './content';
import post from './post';

export function renderIndex(req, res) {
  res.writeHead(200, {
    'Content-Type': 'text/html; charset=utf-8',
    'Transfer-Encoding': 'chunked',
  });

  const onEnd = () => {
    if(!pre._readableState.ended) return;
    if(!content._readableState.ended) return;
    if(!post._readableState.ended) return;

    res.end();
  };

  pre.on('data', (chunk) => { res.write(chunk); }).on('end', onEnd);
  content.on('data', (chunk) => { res.write(chunk); }).on('end', onEnd);
  post.on('data', (chunk) => { res.write(chunk); }).on('end', onEnd);
}

Is it possible to tell the client the position of each chunk of data?

I'd like to achieve something like this:

// ---- Stream 1 keep open
<html>
  <head>
  ...
  ...
  ...
  ...


// --- Stream 2 keep open

<body>
  ...
  ...
  ...

// --- Stream 3 keep open
<script src="..."></script>
<script src="..."></script>
<script src="..."></script>


// --- Stream 1 CLOSE
  </head>

// --- Stream 2 CLOSE
  </body>

// --- Stream 3 CLOSE
  </html>


// res.end()
Approaches I've considered:
  1. Range Requests
  2. Multipart Range

Marble-like Explanation:

  • actual:  [pre] [post] [body] [pre] [body] [/pre] [/post] [/body]
  • desired: [pre] [/pre] [body] [/body] [post] [/post]
Hitmands
  • do you need serially pipe 3 read streams into one write stream? – kharandziuk May 24 '18 at 07:03
  • @kharandziuk I want those 3 streams to write on their position, but not serialise them (`res.pipe` is not an option I guess). – Hitmands May 24 '18 at 07:05
  • when you say, "their position" what do you mean? should you just check how much data did you already sent when some stream reports "end"? – kharandziuk May 24 '18 at 07:06
  • I am looking for a multipart response, by `keeping their position` I mean that `pre` should keep writing before `body` and `body` should keep writing before `post` – Hitmands May 24 '18 at 07:09
  • ok. the resulting "data" should look like p1p2p3p1p2p3... or p1p1p1p1p1p2p2p2p2p2p3p3p3p3, where pn is a chunk from stream `n`? – kharandziuk May 24 '18 at 07:12
  • It should look like `p1p1p1p1p1p2p2p2p2p2p3p3p3p3p[n]` but those streams should still be parallel – Hitmands May 24 '18 at 07:14

2 Answers


I believe you can achieve the expected behavior with a library called highland.js. It gives you a way to perform operations on top of streams.

/*
a small sample to show how sequence() works:
H([
  H([1, 2, 3]),
  H([4, 5, 6])
]).sequence().pipe(process.stdout);
*/

import pre from './pre';
import content from './content';
import post from './post';
import H from 'highland';

export function renderIndex(req, res) {
  res.writeHead(200, {
    'Content-Type': 'text/html; charset=utf-8',
    'Transfer-Encoding': 'chunked',
  });

  H([
    pre,
    content,
    post
  ]).sequence().pipe(res);
}
kharandziuk

The simplest way to solve it is to buffer the chunks into dedicated variables and, once the end event has fired for every stream, write the whole pre/body/post response to res.

hsz
  • I need to stream the response. This would send the whole data at the end. – Hitmands May 24 '18 at 06:55
  • So it's not possible: streaming means a continuous stream of data, and it's not possible to work with offsets here. You have to make it static, or split your controller into parts and fetch those parts with XHR on the frontend side. – hsz May 24 '18 at 06:57
  • this should definitely be possible, since you can send the range position to the client – Hitmands May 24 '18 at 07:03