
OK, so I'm running my own fork of NanoHttpd (a minimalist Java web server; the fork is quite complex, though), and I had to implement gzip compression on top of it.

It has worked fine, but it just turned out that Firefox 33.0 on Linux Mint 17.1 will not execute gzipped JS files at all, although they load just fine, the headers look OK, etc. This does not happen on the same PC with Chrome, or with any other browser I've tried, but it's still something I must get fixed.

Also, the JS resources execute just fine if I disable gzipping. I also tried removing the `Connection: keep-alive` header, but that had no effect.

Here's the code responsible for gzipping:

private void sendAsFixedLength(OutputStream outputStream) throws IOException {
    int pending = data != null ? data.available() : 0; // This is to support partial sends, see serveFile()
    headerLines.add("Content-Length: " + pending + "\r\n");
    boolean acceptEncoding = shouldAcceptEnc();

    if (acceptEncoding) {
        headerLines.add("Content-Encoding: gzip\r\n");
    }
    headerLines.add("\r\n");

    dumpHeaderLines(outputStream); // writes the headers to outputStream

    if (acceptEncoding) {
        outputStream = new java.util.zip.GZIPOutputStream(outputStream);
    }

    if (requestMethod != Method.HEAD && data != null) {
        int BUFFER_SIZE = 16 * 1024;
        byte[] buff = new byte[BUFFER_SIZE];
        while (pending > 0) {
            int read = data.read(buff, 0, Math.min(pending, BUFFER_SIZE));
            if (read <= 0) {
                break;
            }
            outputStream.write(buff, 0, read);
            pending -= read;
        }
    }
    outputStream.flush();
    outputStream.close();
}

FWIW, the example I copied this from did not close the outputStream, but without doing that the gzipped resources did not load at all, while non-gzipped resources still loaded OK. So I'm guessing that part is off in some way.

EDIT: Firefox won't give any errors; it just does not execute the script. E.g.:

index.html:

<html><head><script src="foo.js"></script></head></html>

foo.js:

alert("foo");

This does nothing, even though the resources load OK. No warnings in the console, nothing. It works fine when gzip is disabled, and in other browsers.

EDIT 2: If I request foo.js directly, it loads just fine.

EDIT 3: Tried checking the responses & headers with Tamper Data while having gzipping on/off. The only difference was that when gzipping is turned on, there is Content-Encoding: gzip in the response header, which is not very surprising. Other than that, the responses were 100% equal.

EDIT 4: Turns out that removing Content-Length from the header made it work again... Not sure of the side effects, though, but at least this pinpoints it better.
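For the record, the change boils down to only advertising a length when the body goes out uncompressed, since `pending` is the uncompressed size. A standalone sketch of just that guard (`buildHeaders` is only an illustration, not the actual method):

```java
import java.util.ArrayList;
import java.util.List;

public class HeaderGuard {

    // Stand-in for the server's header list: send Content-Length only for
    // uncompressed bodies, because `pending` is the uncompressed size and
    // would be wrong for the gzipped stream.
    static List<String> buildHeaders(boolean gzip, int pending) {
        List<String> headerLines = new ArrayList<>();
        if (gzip) {
            headerLines.add("Content-Encoding: gzip\r\n");
        } else {
            headerLines.add("Content-Length: " + pending + "\r\n");
        }
        headerLines.add("\r\n"); // blank line terminating the header block
        return headerLines;
    }

    public static void main(String[] args) {
        System.out.println(buildHeaders(true, 13));
        System.out.println(buildHeaders(false, 13));
    }
}
```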

Seppo420
  • Two things come to my mind: `InputStream.available` is not always safe: it might return 0 temporarily while the buffer is empty, but return > 0 at another moment (it would be OK if `data` is guaranteed to be 100% in memory). The other thing I suspect is that your browser might be caching an older response from `foo.js`. Better empty the cache before each test. – Little Santi Sep 15 '15 at 10:45
  • Cache issues can be ruled out safely. InputStream.available is not the issue, because content length is fine as is the entire response, absolutely nothing out of the ordinary in the response or its headers. – Seppo420 Sep 15 '15 at 10:53
  • Have you debugged how many bytes are written at `sendAsFixedLength` and compared it to the size of the actually received file? – Little Santi Sep 15 '15 at 11:05
  • Are you monitoring the network traffic to/from Firefox with some plugin? I recommend you "Tamper Data". – Little Santi Sep 15 '15 at 11:16
  • No. Installed it. No idea what I'm supposed to do with it or how to get it to do anything in the first place. Based on the dev tools headers are 100% fine as is the response body. – Seppo420 Sep 15 '15 at 11:25
  • 1
    Let us [continue this discussion in chat](http://chat.stackoverflow.com/rooms/89674/discussion-between-little-santi-and-seppo420). – Little Santi Sep 15 '15 at 11:27

1 Answer


I think the cause of your problem is that you are writing the Content-Length header before compressing the data, which sends inconsistent information to the browser: the header advertises the uncompressed size, while the body is the (shorter) gzipped stream. I guess each browser implementation handles this situation in its own way, and it seems that Firefox handles it the strict way.

If you don't know the size of the compressed data (which is understandable), you'd better avoid writing the Content-Length header, which is not mandatory.
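Alternatively, if you do want to keep Content-Length, you can compress the whole body into memory first and measure it before writing any headers. A minimal sketch of that idea (class and method names here are just for illustration, not your actual code):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.zip.GZIPOutputStream;

public class GzipLengthExample {

    // Compress the payload up front so the real compressed size is known
    // before the headers are written.
    static byte[] gzip(byte[] plain) throws IOException {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        try (GZIPOutputStream gz = new GZIPOutputStream(buf)) {
            gz.write(plain);
        } // close() finishes the stream and writes the gzip trailer
        return buf.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        byte[] plain = "alert(\"foo\");".getBytes("UTF-8");
        byte[] packed = gzip(plain);
        // The header must advertise the compressed size, not plain.length:
        System.out.println("Content-Length: " + packed.length);
    }
}
```

This trades memory for correctness, so it only suits responses that fit comfortably in RAM; for large or streamed bodies, dropping Content-Length (or using chunked transfer encoding) is the safer route.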

Little Santi