I'm trying to send spaces at a specific interval to avoid the Heroku request timeout (30s), but I also want to support gzip encoding. So I'm trying something like the following:

const express = require('express')
const zlib = require('zlib')

const app = express()
const gzipped_space = zlib.gzipSync(' '); // a complete gzip member containing a single space

app.get('/json-chunked', function (req, res) {
  // Send a plain space every second to keep the connection alive
  let interval = setInterval(() => {
    res.write(' ');
  }, 1000);

  // After 5.5s, send the actual payload and end the response
  setTimeout(() => {
    res.write('{"hello":"world"}');
    clearInterval(interval);
    res.end();
  }, 5500);
});

app.get('/gzip-chunked', function (req, res) {
  res.writeHead(200, {
    'Content-Encoding': 'gzip'      // setting the encoding to gzip
  });
  // Send a separately gzipped space every second to keep the connection alive
  let interval = setInterval(() => {
    res.write(gzipped_space);
  }, 1000);

  // After 5.5s, send the payload as its own gzip member and end the response
  setTimeout(() => {
    res.write(zlib.gzipSync('{"hello":"world"}'));
    clearInterval(interval);
    res.end();
  }, 5500);
});

app.listen(3000, () => {
  console.log('listening on 3000');
})

http://localhost:3000/json-chunked works correctly in the browser: the whole JSON response is received, with the spaces at the start. But for http://localhost:3000/gzip-chunked, the browser seems to receive only the first space, and the request is terminated. However, the same request from Postman works correctly, and the whole response is received and decoded there.

Does the browser expect the whole response to be a single gzip body split into chunks, rather than several independently gzipped chunks? (It feels very odd that the browser wouldn't support separately gzipped chunks :( ) Are there any other options with which I can send an empty space to keep the connection alive?

EDIT: Are there any special characters in gzip that are ignored during decompression?

Riddhesh
  • Use `pipe` since `res` is a write stream. See: https://nodejs.org/api/zlib.html#zlib_zlib. You'll get something like `fs.createReadStream('/some/file').pipe(gzip).pipe(res)` – Erik Dec 13 '19 at 21:21
  • Actually I'm kind of already doing that: I'm making another request and piping that response into this one. But sometimes it takes longer than 30s to start getting the response, so Heroku times out the request. A [known workaround](https://spin.atomicobject.com/2018/05/15/extending-heroku-timeout-node/) is to send a space at a specific interval to keep the connection going. I'm trying the same, but with gzip. – Riddhesh Dec 13 '19 at 21:57

1 Answer


Here is a way to do it:

const zlib = require('zlib');
const express = require('express');

const app = express();

app.get('/gzip-chunked', function (req, res) {
  res.writeHead(200, {
    'Content-Encoding': 'gzip',      // setting the encoding to gzip
  });

  // Create a Gzip Transform Stream
  const gzip = zlib.createGzip();

  const interval = setInterval(() => {
    // Write a space character to the stream
    gzip.write(' ');

    // From Node.js docs: Calling .flush() on a compression stream will
    // make zlib return as much output as currently possible.
    gzip.flush();
  }, 1000);

  setTimeout(() => {
    gzip.write('{"hello":"world"}');
    clearInterval(interval);
    gzip.end();
  }, 5500);

  // Pipe the Gzip Transform Stream into the Response stream
  gzip.pipe(res);
});

app.listen(3000, () => {
  console.log('listening on 3000');
});
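
A quick way to verify in a terminal that the chunks really arrive incrementally (assuming curl is installed; `--compressed` sends `Accept-Encoding: gzip` and decompresses the body, and `-N` disables output buffering):

curl --compressed -N http://localhost:3000/gzip-chunked

You should see a space trickle in once per second, followed by the JSON after about 5.5 seconds.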
Mehmet Baker
  • This works as expected, but mine is a simplified example. I'm handling the chunks in middleware, and the controller makes a different request, like `request(options).pipe(res)`, and that data is already gzipped. With your solution I would need to somehow unzip it first and then pipe it through the final gzip again, which feels very suboptimal. – Riddhesh Dec 13 '19 at 23:17
  • To simplify: the response I need to send is already gzipped, so any option where I don't need to decompress it is best. – Riddhesh Dec 13 '19 at 23:28
  • You need to mangle the headers and footers of the gzip output in order to concatenate them. For instance, gzip has an 8-byte footer containing a checksum of the entire gzip content. You would have to recalculate it (spaces + response body). Check out this answer for more information: https://stackoverflow.com/a/1143455/7231278 – Mehmet Baker Dec 14 '19 at 10:02
  • There is an npm module, `compression`, which does all of this for you, including checking the request headers to make sure gzip is supported. (Just found it myself after reading this.) – taxilian Sep 29 '21 at 17:36
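
Following up on the comment thread: if the upstream response is already gzipped, one alternative to hand-editing gzip headers and trailers is to re-inflate and re-deflate on the fly. Below is a rough sketch, not a drop-in solution: `fetchUpstream()` is a hypothetical placeholder for whatever produces the gzipped upstream stream (e.g. the `request(options)` call mentioned above), and the extra decompress/recompress hop costs some CPU:

const zlib = require('zlib');
const express = require('express');

const app = express();

app.get('/proxied', function (req, res) {
  res.writeHead(200, { 'Content-Encoding': 'gzip' });

  // Single gzip stream for the whole response, as in the answer above
  const gzip = zlib.createGzip();
  gzip.pipe(res);

  // Keep the connection alive while waiting for the upstream response
  const interval = setInterval(() => {
    gzip.write(' ');
    gzip.flush();
  }, 1000);

  const upstream = fetchUpstream(); // hypothetical: a readable stream of gzipped data
  upstream.once('data', () => clearInterval(interval)); // stop the spaces once real data flows

  // Undo the upstream gzip, then let our own gzip stream recompress it
  upstream.pipe(zlib.createGunzip()).pipe(gzip);
});

app.listen(3000);

Alternatively, the `compression` middleware mentioned in the last comment adds a `res.flush()` method, so the keep-alive spaces can be sent with plain `res.write(' '); res.flush();` while the module handles the encoding negotiation and compression for you.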