
I've got a bunch of static files (e.g. index.xhtml) in an Apache2 web root. I don't have control over the server's configuration, but I am allowed to modify .htaccess in the web root.

I would like to pre-compress the files (e.g. index.xhtml.gz) to improve load times and reduce bandwidth consumption. However, if I do this, user agents which do not support gzip content encoding will be unable to use the site.

I assume that these agents will be very rare compared to capable agents, so the content should be served decompressed only if the agent doesn't send gzip in the Accept-Encoding header. Agents which claim to support gzip but don't are of no concern.

Most articles about compression assume it's being performed on the fly, which I'd like to avoid in order to reduce CPU time.
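
For illustration, the sort of .htaccess rules I'm picturing look roughly like this, though I'm not sure they're correct (untested, and assuming mod_rewrite, mod_headers and the FileInfo override are available to .htaccess):

# Untested sketch: rewrite to the pre-compressed file when the client accepts gzip
RewriteEngine On
RewriteCond %{HTTP:Accept-Encoding} gzip
RewriteCond %{REQUEST_FILENAME}.gz -f
RewriteRule ^(.*)$ $1.gz [L]

# Label the pre-compressed files correctly
<FilesMatch "\.xhtml\.gz$">
  ForceType application/xhtml+xml
  Header set Content-Encoding gzip
  Header append Vary Accept-Encoding
</FilesMatch>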

1 Answer


AFAIK, you can only do that if you have access to run a CGI script on the box or if you hack Apache.

But the common practice is not to do what you're asking; it is to store the files uncompressed and use mod_deflate to compress them on the fly.

#
# Compress most things
#
<Location />
  SetOutputFilter DEFLATE
  BrowserMatch ^Mozilla/4 gzip-only-text/html
  BrowserMatch ^Mozilla/4\.0[678] no-gzip
  BrowserMatch \bMSIE !no-gzip !gzip-only-text/html

  # Don't compress images
  SetEnvIfNoCase Request_URI \.(?:gif|jpe?g|png)$ no-gzip dont-vary

  # Make sure proxies don't deliver the wrong content
  Header append Vary User-Agent env=!dont-vary
</Location>

That's from my httpd.conf; it will probably need some changes to work in .htaccess.
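
A rough .htaccess equivalent (untested; it assumes mod_deflate, mod_setenvif and mod_headers are loaded and that AllowOverride permits FileInfo) would simply drop the <Location> wrapper, since that container isn't allowed in .htaccess:

# Sketch of the same rules for .htaccess; <Location> is not allowed here
SetOutputFilter DEFLATE
BrowserMatch ^Mozilla/4 gzip-only-text/html
BrowserMatch ^Mozilla/4\.0[678] no-gzip
BrowserMatch \bMSIE !no-gzip !gzip-only-text/html

# Don't compress images
SetEnvIfNoCase Request_URI \.(?:gif|jpe?g|png)$ no-gzip dont-vary

# Make sure proxies don't deliver the wrong content
Header append Vary User-Agent env=!dont-vary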

Michael Graff