37

AWS now supports gzipping files through CloudFront

I've followed all of the instructions in *Serving Compressed Files*, and yet gzipping is not working.

I have an S3 bucket set up as a website that CloudFront is using as the origin.

  • Compress Objects Automatically is enabled
  • I am serving files with the correct content types, such as `application/javascript` and `text/css`
  • The files are between 1,000 and 10,000,000 bytes
  • As far as I know, the S3 website serves the files with a `Content-Length` header
  • To be extra sure nothing was cached, I both invalidated the entire CloudFront distribution and uploaded newer versions of the files to S3
  • The web browser I'm using, Chrome, accepts gzipped files

Despite all this, I can't get gzipping to work. I have gotten everything else, including SSL, working perfectly, and you can visit the site here: https://formulagrid.com/

If you open up the Chrome console, you'll notice that none of the files served from S3 are being gzipped. The only gzipped files, such as the Google font, are the ones I'm grabbing from other CDNs.
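For reference, here's the kind of check I've been running from the command line (the path is a placeholder; any static file on the site behaves the same way):

# GET the file, discard the body, and print the response headers;
# with compression working, they should include "Content-Encoding: gzip"
curl -s -o /dev/null -D - -H "Accept-Encoding: gzip" https://formulagrid.com/bundle.js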

m0meni
  • Thanks for sharing the actual link -- that should make whatever is happening easier for somebody to spot. Your objects are stored in S3 *without* a `Content-Encoding:` header, right? That is how they should be. If you have access to a system in another part of the world, you might try `curl -v` and see if requests through a different edge behave any differently. – Michael - sqlbot Feb 24 '16 at 03:27
  • @Michael-sqlbot yeah, they only have `Content-Type`. Another thing I've noticed is that my index.html actually does seem to get gzipped, but only inconsistently, which makes me even more confused. – m0meni Feb 24 '16 at 03:51
  • @Michael-sqlbot it started working...guess it takes over a day or something... – m0meni Feb 24 '16 at 19:14
  • *"In rare cases, when a CloudFront edge location is unusually busy, some files might not be compressed."* This is a new feature, and probably a pretty popular one... they may have hit some temporary, unexpected demand at your edge location (that is, the one you're fetching through for your testing). – Michael - sqlbot Feb 24 '16 at 22:00
  • @Michael-sqlbot haha yeah. I guess I was hoping it would actually be rare :'(. For the time being I've set up gzipping as part of my deployment so it shouldn't be so big of a deal anyways. – m0meni Feb 24 '16 at 22:02
  • "Summoning" people like this is generally frowned on by the community. If you tag a question appropriately, experts *will* see your question. – Michael - sqlbot Feb 25 '16 at 00:28
  • This seems resolved, so consider closing the question yourself. – Raniz Jan 03 '17 at 14:22
  • @Raniz unfortunately not...I just ended up manually gzipping the files myself, but that's not a solution – m0meni Jan 03 '17 at 14:23
  • @m0meni Did you make an awesome realization yet? Are there some undocumented conditions? – Daniel Birowsky Popeski Jan 09 '18 at 14:12
  • shit. I just ran `/*` invalidation on the cloudfront distro and it worked. doesn't make sense – Daniel Birowsky Popeski Jan 09 '18 at 14:22

4 Answers

53

I hit the same problem today and solved it by adding a CORS rule to the S3 bucket. This rule ensures that the Content-Length header is sent to CloudFront so the content can be gzipped:

S3 > Bucket > Permissions > CORS Configuration

<?xml version="1.0" encoding="UTF-8"?>
<CORSConfiguration xmlns="http://s3.amazonaws.com/doc/2006-03-01/">
    <CORSRule>
        <AllowedOrigin>*</AllowedOrigin>
        <AllowedMethod>GET</AllowedMethod>
        <MaxAgeSeconds>3000</MaxAgeSeconds>
        <AllowedHeader>Authorization</AllowedHeader>
        <AllowedHeader>Content-Length</AllowedHeader>
    </CORSRule>
</CORSConfiguration>

Credit goes to Robert Ellison: http://ithoughthecamewithyou.com/post/enable-gzip-compression-for-amazon-s3-hosted-website-in-cloudfront

As far as I know, this seems to be an undocumented requirement.

Rodrigo
17

Since the S3 console now only accepts CORS configuration as JSON, you have to paste this instead:

[
    {
        "AllowedHeaders": [
            "Authorization",
            "Content-Length"
        ],
        "AllowedMethods": [
            "GET"
        ],
        "AllowedOrigins": [
            "*"
        ],
        "ExposeHeaders": [],
        "MaxAgeSeconds": 3000
    }
]
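If you'd rather apply the same rule from the AWS CLI, note that `put-bucket-cors` wants the rules wrapped in a top-level `CORSRules` key rather than the bare array the console takes. A rough sketch (the bucket name is a placeholder):

# Apply the CORS rule above to the bucket in one shot
aws s3api put-bucket-cors --bucket my-bucket --cors-configuration '{
    "CORSRules": [
        {
            "AllowedHeaders": ["Authorization", "Content-Length"],
            "AllowedMethods": ["GET"],
            "AllowedOrigins": ["*"],
            "MaxAgeSeconds": 3000
        }
    ]
}'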
Leon Braun
7

My problem was that I had uploaded the files with a Content-Encoding header set (to utf-8), which fails the last condition below. From the documentation:

CloudFront determines whether the file is compressible:

  • The file must be of a type that CloudFront compresses.
  • The file size must be between 1,000 and 10,000,000 bytes.
  • The response must include a Content-Length header so CloudFront can determine whether the size of the file is in the range that CloudFront compresses. If the Content-Length header is missing, CloudFront won't compress the file.
  • The response must not include a Content-Encoding header.
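If you're not sure whether an object carries that header, here's a rough sketch with the AWS CLI to inspect it and, if needed, strip it by copying the object over itself (bucket, key, and content type below are placeholders):

# Look for a "ContentEncoding" field in the output; ideally it's absent
aws s3api head-object --bucket my-bucket --key js/app.js

# Copy the object over itself with fresh metadata; omitting
# --content-encoding here drops the stored Content-Encoding header
aws s3 cp s3://my-bucket/js/app.js s3://my-bucket/js/app.js \
    --content-type application/javascript --metadata-directive REPLACE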

dom
0

I had tried all of the above, but CloudFront was still not compressing my S3 bucket content.

My problem was that I already had an existing CloudFront distribution with compression disabled, and enabling compression later somehow never took effect. After a lot of unsuccessful workarounds, I deleted the CloudFront distribution and recreated it with compression on from the get-go. That solved my problem.
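Before going as far as deleting the distribution, it may be worth trying a full invalidation first; a commenter above reported that a `/*` invalidation alone fixed it for them. A sketch with a placeholder distribution ID:

# Invalidate every cached object so the edges re-fetch from the origin
aws cloudfront create-invalidation --distribution-id E1234EXAMPLE --paths "/*"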

CefBoud