Questions tagged [gzip]

Gzip is any of several software applications used for file compression and decompression. The term usually refers to the GNU Project's implementation, "gzip" standing for GNU zip. It is based on the DEFLATE algorithm, which is a combination of Lempel-Ziv (LZ77) and Huffman coding.

Gzip is based on the DEFLATE algorithm, which is a combination of LZ77 and Huffman coding. DEFLATE was intended as a replacement for LZW and other patent-encumbered data compression algorithms, which, at the time, limited the usability of compress and other popular archivers.

"Gzip" is often also used to refer to the gzip file format, which is:

  • a 10-byte header, containing a magic number, a version number and a time stamp
  • optional extra headers, such as the original file name
  • a body, containing a DEFLATE-compressed payload
  • an 8-byte footer, containing a CRC-32 checksum and the length of the original uncompressed data

Source: Wikipedia
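
The container layout described above can be inspected directly with standard tools. A minimal sketch (assumes GNU gzip and od are available; the sample file is illustrative):

```shell
# Compress a small sample file and look at the gzip container around it.
printf 'hello gzip' > sample.txt
gzip -c sample.txt > sample.txt.gz

# The first bytes of the 10-byte header: magic number 1f 8b, then 08
# for the DEFLATE compression method.
od -A n -t x1 -N 3 sample.txt.gz

# The 8-byte footer: CRC-32, then the uncompressed length, little-endian.
# For this 10-byte input the length field reads 0a 00 00 00.
tail -c 8 sample.txt.gz | od -A n -t x1
```

Tools like `file` and `gzip -l` read exactly these fields to report on a `.gz` file without decompressing the payload.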

334 questions

1 vote · 2 answers

IIS6: Serve static content gzipped for clients that support it

How can I configure IIS to automatically serve static resources (pictures, JavaScript, CSS and so on) compressed for clients that support it, and serve "normal" content for clients that do not support compression? Also, can IIS6 cache the…
driis

1 vote · 2 answers

How to enable gzip for files not on your server?

I'm running a gtmetrix.com report against my site. I get an F(41) for gzip compression. I don't own the two problem files mentioned for this penalty. One file is a CSS file on Mailchimp's server. Another is a JavaScript file that can't be gzipped…
4thSpace

1 vote · 1 answer

How is Apache 1.3 compressing?

More specifically, I would like suggestions as to how my server is encoding responses in gzip compressed format. We have one OC4J container that serves gzip transfer encoding, and others do not. Trying to turn compression off in the working container, we…
Dallas

1 vote · 0 answers

nginx: how to serve files in gzip format when client accepts "Accept-Encoding: gzip" and inflated otherwise?

I have a folder full of .gz files and would like to serve them transparently inflated if requested by a client that does not send Accept-Encoding: gzip in the request and as-is (gzip'd) otherwise. I know there is the HttpGzipStaticModule module, but…
0xC0000022L

1 vote · 1 answer

find and delete files of a certain type inside a tar.gz file

Is there a way to not only find but also delete any .gz files inside a .tar.gz file? I found this link but I wouldn't know how to modify it to make it able to delete found files.
Obay Ouano
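
A common route (a sketch, not taken from an answer here; the archive name is a placeholder): members can't be deleted from a compressed tar in place, and GNU tar's --delete refuses compressed input, so unpack, delete, and re-pack:

```shell
# Sketch: strip every .gz member out of archive.tar.gz by round-tripping
# through a scratch directory. Assumes GNU tar and find -delete.
mkdir -p work
tar -xzf archive.tar.gz -C work
find work -type f -name '*.gz' -delete
tar -czf archive-clean.tar.gz -C work .
rm -rf work
```

The round trip costs disk space and time proportional to the archive, but it is the only portable way, since the tar format has no index that would allow in-place removal.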

1 vote · 1 answer

Varnish 3 performs gunzip even though entire pipeline is gzipped

I'm refactoring my Varnish VCL and can't figure out this one thing. Varnish 3.0 natively supports gzipped content, and it essentially seems to do the right thing. See also: https://stackoverflow.com/a/12962493/35434 However, Varnish still performs a…
Martijn Heemels

1 vote · 2 answers

nginx gzip did not work on php js and css processors

I have an assets manager that processes CSS and JS via PHP. I'm using nginx and php5-fpm to serve my application, but the returned CSS and JS files are not gzipped. For example, my URL is http://mysite.com/phpcssprocessor/mycssfile.css and this file is generated via…
1 vote · 1 answer

how to get nginx to gzip all files and add an expires header

I recently got into VPS systems and setting them up. I have a couple of working websites, and on one of them I'm trying to really optimize for speed. Using Yahoo's YSlow as a guide, I am still failing the gzip and…
mazing
1 vote · 2 answers

gzip compression increases performance, then eventually increases response time from server

Here is my current server scenario. I'm running on a Rackspace Cloud instance (16GB of RAM), using cPanel/WHM on a CentOS 5.5 install. I'm currently running about 10 Magento sites, all varying in size (from medium size to small) Over time I noticed…
Axel

1 vote · 1 answer

NGINX + GZIP - Altering Vary header to include User-Agent

Currently, when nginx is set to gzip outbound content as requested by the client, the "gzip_vary on" setting will set the following header: Vary: Accept-Encoding We would like to modify this to send out: Vary: Accept-Encoding, User-Agent Is this…
anonymous-one

1 vote · 2 answers

I can't get mod_gzip to function properly

The PageSpeed tool says Compressing http://72.10.33.203/workspace/js/plugins.min.js could save 95.0KiB (66% reduction), but when I test for gzip it shows that it works. I have my .htaccess and conf files set up correctly (I think). How can I…
1 vote · 1 answer

Apache LocationMatch throws 500 and AddOutputFilterByType does nothing

I need to add the directives below to Apache, but I get a 500 when I add these lines. Header unset ETag FileETag None # RFC says only cache for 1 year ExpiresActive On ExpiresDefault "access plus 1…
tackleberry

1 vote · 2 answers

Disk cloning using dd thru gzip - how effective is it?

Is gzip or gzip -9 for a disk clone worth the extra time that it takes to perform the compression? Does it yield a significant saving? There is very little information on how effective piping data from dd thru gzip really is. (Some data simply…
thinice
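
For reference, the pattern under discussion looks like this (a sketch shown on a file-backed image so it runs anywhere; for a real clone, replace disk.img with the block device, e.g. /dev/sdX, and run as root):

```shell
# Stand-in for a disk: a 1 MiB zeroed image. Zeroed or sparse regions
# compress enormously; already-compressed or random data hardly at all,
# which is why -9 rarely buys much over -1 on mixed real-world disks.
dd if=/dev/zero of=disk.img bs=1024 count=1024 2>/dev/null

# Clone through gzip (-1 fastest, -9 smallest).
dd if=disk.img bs=4M 2>/dev/null | gzip -9 > disk.img.gz

# Restore and verify the clone is bit-identical.
gunzip -c disk.img.gz | dd of=restored.img bs=4M 2>/dev/null
cmp disk.img restored.img && echo identical
```

Whether the compression time pays off therefore depends almost entirely on the data: zero-filling free space before cloning tends to help far more than raising the compression level.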

1 vote · 1 answer

Enabling gzip compression in apache2 web server

I am trying to enable gzip compression in the apache2 web server. A quick googling helped me fetch some instructions, and I got to know that I need to edit httpd.conf, which is inside /etc/apache2/, but my Ubuntu 10.04 server doesn't have any content…
Jeevan Dongre

1 vote · 1 answer

Why does tar -c dir | pigz > /mnt/nfs/dir.tgz use network, then cpu in cycles instead of both at once (with one bottlenecking)

I want to transfer a multi-terabyte directory to an NFS-mounted directory most efficiently over a 1Gbit network (probably the limiting factor). 3 options: tar and compress in place, then copy; copy, then tar and compress; tar | compress. Seems…
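
The three options can be sketched as follows (toy directory names are placeholders; gzip stands in for pigz so the sketch runs without extra packages, and pigz is a drop-in replacement in the pipe):

```shell
mkdir -p dir dest && printf 'payload' > dir/file.txt   # toy stand-ins

# 1) compress in place, then copy the archive
tar -czf dir.tgz dir && cp dir.tgz dest/

# 2) copy the tree first, then compress at the destination
cp -r dir dest/dir-copy && tar -czf dest/dir-copy.tgz -C dest dir-copy

# 3) stream: compress while writing to the destination (the question's case)
tar -c dir | gzip > dest/dir-stream.tgz
```

With option 3 the whole pipeline stalls whenever either the compressor or the NFS write blocks, which is one plausible cause of the alternating network/CPU pattern in the title; inserting a buffering stage (e.g. mbuffer) between the compressor and the mount is a common remedy.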