
As far as I understand, most browsers support gzip and can automatically decompress a response body if the 'Content-Encoding' header is set on the response.
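
To illustrate what I mean, here is a minimal standalone sketch (not my actual server code) using Node's built-in zlib:

const http = require('http');
const zlib = require('zlib');

// The browser sees 'Content-Encoding: gzip' and should transparently
// decompress the body before using it.
http.createServer((req, res) => {
    const body = zlib.gzipSync('hello from a gzipped response');
    res.writeHead(200, {
        'Content-Type': 'text/plain',
        'Content-Encoding': 'gzip',
        'Content-Length': body.length
    });
    res.end(body);
}).listen(3000);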

My problem, however, is that when I stream content to a client, only images get decompressed successfully. I'm not sure whether this has something to do with how I am using streams in Node.js.

I won't go into detail on exactly how I upload content to the server, but in short: each file is saved as fileName.gz, and I can verify that it is compressed correctly because I can extract it manually.
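
Conceptually, the compression step amounts to something like this (a sketch, not my actual upload code; the file names are placeholders):

const fs = require('fs');
const zlib = require('zlib');
const { pipeline } = require('stream');

// Compress the raw upload once and keep only the .gz version on disk.
pipeline(
    fs.createReadStream('content.mp4'),
    zlib.createGzip(),
    fs.createWriteStream('content.mp4.gz'),
    (err) => {
        if (err) console.error('Compression failed', err);
    }
);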

So basically, only images work, while video and audio do not. Here is a snippet of my code showing how I handle the streaming:

const fs = require('fs');

exports.Stream = async (req, res) => {
    try {
        const resourcePath = 'something.gz';

        // statSync throws if the file does not exist
        const stat = fs.statSync(resourcePath);
        const fileSize = stat.size;
        const range = req.headers.range;

        if (range) {
            const parts = range.replace(/bytes=/, '').split('-');
            const start = parseInt(parts[0], 10);
            const end = parts[1]
                ? parseInt(parts[1], 10)
                : fileSize - 1;

            const chunksize = (end - start) + 1;
            const file = fs.createReadStream(resourcePath, { start, end });

            const head = {
                'Content-Range': `bytes ${start}-${end}/${fileSize}`,
                'Accept-Ranges': 'bytes',
                'Content-Length': chunksize,
                'Content-Type': 'video/mp4',
                'Content-Encoding': 'gzip'
            };
            res.writeHead(206, head);

            file.pipe(res);

        } else {
            const file = fs.createReadStream(resourcePath);

            const head = {
                'Content-Encoding': 'gzip',
                'Content-Type': 'video/mp4'
            };
            // A full (non-range) response should use status 200, not 206
            res.writeHead(200, head);

            file.pipe(res);
        }
    } catch (err) {
        console.log(err);
        const response = {
            message: 'Error'
        };
        res.status(400).json(response);
    }
};

I have read somewhere that the client-side code might read the data into a buffer and not decompress it correctly; unfortunately, I can't find the site I read that on anymore.

Furthermore, is it even possible to stream larger gzip files to a client like this? Are there better ways to do it?

It is also worthwhile to mention that I previously stored the content raw and applied gzip compression on the fly as it was streamed, but this led to excessive CPU usage. My thinking was that it should be more efficient to simply stream already-compressed content.
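
That earlier approach looked roughly like this (a sketch; StreamOnTheFly and the file name are placeholders):

const fs = require('fs');
const zlib = require('zlib');

// Old approach: store the raw file and gzip it per request.
// This worked, but compressing every stream caused excessive CPU usage.
exports.StreamOnTheFly = (req, res) => {
    res.writeHead(200, {
        'Content-Type': 'video/mp4',
        'Content-Encoding': 'gzip'
    });
    fs.createReadStream('something.mp4')
        .pipe(zlib.createGzip())
        .pipe(res);
};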

Also, my client-side code is very simple. I am using Angular 15 and simply referencing the URL within my template like this; I suspect this is not an Angular issue and only mention it to set the context of my problem.

<img src="serverURL/stream/content.gz">

<video controls>
    <source src="serverURL/stream/content.gz" type="video/mp4">
</video>

This is the current error I get in my console:

net::ERR_CONTENT_DECODING_FAILED

If I remove the Content-Encoding header, the error disappears; however, nothing is displayed, and the videos, images, and audio never load.

  • The streaming input is not in a valid gzip format; that's all the error says. Maybe the data is truncated, corrupted, or only partially compressed (if compressed at all)? https://stackoverflow.com/questions/7625251/compression-and-decompression-of-data-using-zlib-in-nodejs It fits well with your description that only the images decompress while the other data is lost. – Mabadai Apr 15 '23 at 19:55

0 Answers