I would like to implement GZIP compression on my site. I've implemented it on IIS and the HTML page is compressed successfully as expected.

Now the issue is with the CSS and JS files, which I serve from Amazon S3. They are not compressed at all. I want to compress them too.

Please guide me on how to do it. Sharing links would help me a lot.

Update: I've added a metadata header on the S3 files as "Content-Encoding: gzip", and it now shows in the response headers. Still, the file size is the same, the particular CSS has no effect on the page, and I can't even open it in the browser. Here is the [link][1] to the particular CSS file.

Thanks


3 Answers


Files should be compressed before being uploaded to Amazon S3.

For some examples, see:
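As a minimal local sketch (file names here are placeholders, not from the question), pre-compressing an asset before upload can look like this:

```python
import gzip
import shutil

def gzip_file(src_path, dest_path):
    """Compress src_path into dest_path using gzip."""
    with open(src_path, "rb") as f_in, gzip.open(dest_path, "wb") as f_out:
        shutil.copyfileobj(f_in, f_out)

# e.g. gzip_file("site.css", "site.css.gz")
```

When you upload the compressed file to S3, set `Content-Encoding: gzip` (and keep the original `Content-Type`, e.g. `text/css`) in the object metadata so browsers decompress it automatically.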

  • Thanks John, I understood. Can you please share links on how to compress CSS and JS files in Windows? – RaJesh RiJo Aug 05 '15 at 05:24
  • Just search for "windows gzip" and you'll find some compression utilities. – John Rotenstein Aug 05 '15 at 12:44
  • @RaJeshRiJo I recommend using 7zip for Windows http://www.7-zip.org/download.html – Virgil Shelton Oct 09 '15 at 01:28
  • This solution is now expired. Since [Dec. 2015, Cloudfront can now compress files on the way out](http://docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide/ServingCompressedFiles.html). – J_A_X Jul 21 '16 at 03:11
  • Cloudfront won't compress S3 files unless you add a CORS configuration to the bucket to expose the Content-Length header. See http://ithoughthecamewithyou.com/post/enable-gzip-compression-for-amazon-s3-hosted-website-in-cloudfront for details. – Robert Ellison Dec 10 '16 at 02:28
  • Please include the examples inline, as links may become stale. – Daniel Goldberg Nov 21 '20 at 10:22
  • Best answer. For those people who would like to use Cloudfront's compression feature, please know that compression is done on a best-effort basis (https://docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide/ServingCompressedFiles.html), which means there is no guarantee that it will compress every time. – Vijay Purush Feb 27 '23 at 23:37

If you use CloudFront in front of your S3 bucket, there is no need to manually compress HTML resources (CloudFront will compress them on the fly). Please note CloudFront only compresses with gzip (no deflate, no brotli) and only CSS / JS / HTML (based on content type). See https://docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide/ServingCompressedFiles.html#compressed-content-cloudfront-file-types . To make it work, you have to forward some HTTP headers from CloudFront to S3 (see the docs).

If your S3 bucket has resources not supported by CloudFront (a generic "binary/octet-stream" MIME type, like an "hdr" texture or an "nds" ROM), you need to compress them yourself before uploading to S3, then set the "Content-Encoding" HTTP metadata on the resource. Note that only browsers supporting gzip encoding will be able to download and decompress the file.
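As a sketch of that compress-then-tag step (the helper name is made up for illustration), you can prepare a gzipped body plus the metadata that boto3's upload calls accept:

```python
import gzip

def gzip_payload(data, content_type):
    """Return (body, extra_args) for an S3 upload served with gzip encoding."""
    body = gzip.compress(data)
    # Content-Encoding tells the browser to decompress; Content-Type stays the real type
    extra_args = {"ContentType": content_type, "ContentEncoding": "gzip"}
    return body, extra_args
```

Then upload with something like `s3.put_object(Bucket="my-bucket", Key="textures/scene.hdr", Body=body, **extra_args)` (bucket and key are placeholders, assuming a configured boto3 client).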

If you don't want to compress the files one by one by hand, you can use a Lambda function that is:

  • triggered on each PUT of an object (a file) into the bucket
  • if the file is not already compressed and compression is useful, it replaces the original uploaded file with the compressed version
  • it sets the Content-Encoding HTTP header to gzip
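The steps above can be sketched like this (a minimal outline under assumptions: the extension filter and key handling are my own, not the gist's exact code):

```python
import gzip
import urllib.parse

COMPRESSIBLE = (".css", ".js", ".html")

def should_compress(key, content_encoding):
    """Skip objects already gzipped, or of a type not worth compressing."""
    return content_encoding != "gzip" and key.endswith(COMPRESSIBLE)

def handler(event, context):
    import boto3  # available in the Lambda runtime
    s3 = boto3.client("s3")
    record = event["Records"][0]["s3"]
    bucket = record["bucket"]["name"]
    key = urllib.parse.unquote_plus(record["object"]["key"])
    obj = s3.get_object(Bucket=bucket, Key=key)
    if not should_compress(key, obj.get("ContentEncoding")):
        return
    s3.put_object(
        Bucket=bucket,
        Key=key,
        Body=gzip.compress(obj["Body"].read()),
        ContentType=obj.get("ContentType", "application/octet-stream"),
        ContentEncoding="gzip",
    )
```

Note the design point: re-uploading the object fires the PUT trigger again, so the `ContentEncoding` check is what stops the function from looping on its own output.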

I wrote a gist for this; it can inspire you to create your own process. See https://gist.github.com/psa-jforestier/1c74330df8e0d1fd6028e75e210e5042

And don't forget to invalidate (= purge) CloudFront to apply your changes.

  • Do .json files count as JS? – JeanAlesi Dec 12 '20 at 11:29
  • Quick update and clarification - CloudFront now supports and prioritises brotli over gzip. It also compresses a lot more than just html, css, js... as long as it's between 1MB and 10MB (found this out when my 20MB geojson file was not being compressed and I had to compress before upload) – misteraidan Jun 23 '21 at 06:32
  • FYI file types (by content-type header): https://docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide/ServingCompressedFiles.html#compressed-content-cloudfront-file-types – misteraidan Jun 23 '21 at 07:07
  • Do NOT use this feature on CloudFront. Reason: CloudFront only compresses the resources when it has available CPU. Seems fine right? WRONG. If a file gets requested and CPU is not available then it gets served uncompressed and cached uncompressed and it remains there on that edge location until expiring or aging out from lack of access. Empirical results show that this happens often - random tests with web.dev/measure will show it. – huntharo Sep 23 '22 at 21:51

If you simply want to gzip the existing files in your S3 bucket, you can write a Lambda function for it. Read the files into a buffer and then use the gzip library to compress them and re-upload to S3. Something like this should work:

    import gzip
    import io

    gzipped_content = gzip.compress(f_in.read())
    destinationbucket.upload_fileobj(
        io.BytesIO(gzipped_content),
        final_file_path,
        # ContentEncoding is needed so browsers decompress the file on download
        ExtraArgs={"ContentType": "text/plain", "ContentEncoding": "gzip"},
    )

There's a full tutorial here: https://medium.com/p/f7bccf0099c9