4

We recently moved to Heroku and decided to store our assets on Amazon S3 with a CloudFront distribution.

I use Django Pipeline to compress / compile my assets, but I didn't manage to make it point to the correct (hashed) version. When I run the "collectstatic" management command, it works well:

Post-processed 'css/compress_profile_school.css' as 'css/compress_profile_school.82973855aca5.css'
Post-processed 'css/compress_profile.css' as 'css/compress_profile.d120536e24f9.css'
Post-processed 'css/compress_document.css' as 'css/compress_document.864dd7603769.css'
...

But when I run the app, it doesn't point to the correct version (it uses the one with no hash).

The application is running here: http://dev.unishared.com/

It seems that the Django bundled staticfiles app can't point to the correct version either.

Each time I push a new version of the assets, I have to invalidate my CloudFront distribution, which takes time.

Thanks for your help.

arnaud.breton
  • Can you add your STATIC_* and PIPELINE_* settings? – cyberdelia Jan 26 '13 at 10:56
  • Here it is:
    class S3PipelineStorage(PipelineMixin, CachedFilesMixin, StaticStorage): pass
    PIPELINE_STORAGE = 'UniShared_python.website.helpers.amazons3.S3PipelineStorage'
    STATICFILES_STORAGE = 'UniShared_python.website.helpers.amazons3.S3PipelineStorage'
    DEFAULT_FILE_STORAGE = 's3_folder_storage.s3.DefaultStorage'
    – arnaud.breton Jan 26 '13 at 18:54

3 Answers

1

Don't set up PIPELINE_STORAGE unless you really know what you are doing; just set up STATICFILES_STORAGE. See the storages documentation.
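For example, with the storage class shown in the comments above, something like this should be enough (a sketch; adjust the module path to your own project):

    # settings.py -- point only STATICFILES_STORAGE at the combined backend
    STATICFILES_STORAGE = 'UniShared_python.website.helpers.amazons3.S3PipelineStorage'
    # no PIPELINE_STORAGE setting: leave it at its default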

cyberdelia
1

Thanks to cyberdelia I managed to make it work.

First, "CachedFilesMixin" only puts the hash in the filenames if your DEBUG setting is turned off (= False). It works fine on my production server.
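Just to make that explicit (a minimal sketch of what I mean):

    # settings.py (production)
    DEBUG = False  # with DEBUG = True, CachedFilesMixin keeps serving the un-hashed names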

From there, the "collectstatic" command uploads the right files (with the hash in the name) to S3. I then hit a second problem: the cached URLs point to the S3 bucket instead of the CloudFront domain defined in the "STATIC_URL" setting. I think it's related to django-storages / boto, used by my custom storage, which works with S3 and not CloudFront:

class S3PipelineStorage(PipelineMixin, CachedFilesMixin, StaticStorage):
    pass

(StaticStorage is an S3BotoStorage subclass with its location set to "static".)
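For reference, the whole storage module looks roughly like this (a sketch; I believe these are the right import paths for django-pipeline / django-storages with boto, but double-check them against your versions):

    # helpers/amazons3.py -- rough sketch of my storage module
    from django.contrib.staticfiles.storage import CachedFilesMixin
    from pipeline.storage import PipelineMixin
    from storages.backends.s3boto import S3BotoStorage

    class StaticStorage(S3BotoStorage):
        # collected assets live under the "static" prefix in the bucket
        location = 'static'

    class S3PipelineStorage(PipelineMixin, CachedFilesMixin, StaticStorage):
        pass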

Now I have to find a way to make it work properly with CloudFront rather than S3.

Thanks for your help!

EDIT:

I figured out how to make it work thanks to this post: Django-compressor: how to write to S3, read from CloudFront?

When you define the "custom domain" key, it uses the CloudFront domain instead of the Amazon S3 one.

I forgot to mention that I also had to set AWS_QUERYSTRING_AUTH to False to make it work.
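Concretely, the settings ended up looking something like this (the domain is a placeholder; I believe the "custom domain" key is called AWS_S3_CUSTOM_DOMAIN in django-storages):

    # settings.py -- make django-storages build URLs on the CloudFront domain
    AWS_S3_CUSTOM_DOMAIN = 'dxxxxxxxxxxxx.cloudfront.net'  # placeholder: your distribution's domain
    AWS_QUERYSTRING_AUTH = False  # no signed querystrings on the generated URLs
    STATIC_URL = 'https://dxxxxxxxxxxxx.cloudfront.net/static/'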

arnaud.breton
0

Your situation may vary, but generally it's best not to serve this content from S3.

Excerpt from the WhiteNoise docs:

Shouldn’t I be pushing my static files to S3 using something like Django-Storages?

No, you shouldn’t. The main problem with this approach is that Amazon S3 cannot currently selectively serve gzipped content to your users. Gzipping can make dramatic reductions in the bandwidth required for your CSS and JavaScript. But while all browsers in use today can decode gzipped content, your users may be behind crappy corporate proxies or anti-virus scanners which don’t handle gzipped content properly. Amazon S3 forces you to choose whether to serve gzipped content to no-one (wasting bandwidth) or everyone (running the risk of your site breaking for certain users).

...snip...

The second problem with a push-based approach to handling static files is that it adds complexity and fragility to your deployment process: extra libraries specific to your storage backend, extra configuration and authentication keys, and extra tasks that must be run at specific points in the deployment in order for everything to work.

I use django-pipeline to join and minify my content at deploy time, WhiteNoise to serve static content, and the CloudFront CDN for caching.

I followed this how-to to get it going.

It also means I'm only paying for CDN hits, not CDN + S3.
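The WSGI side is tiny; here is a minimal sketch using WhiteNoise's generic WSGI wrapper (the directory names are assumptions about STATIC_ROOT and STATIC_URL, not the exact configuration from the how-to):

    # wsgi.py -- serve collected assets from the app itself, with CloudFront caching in front
    from django.core.wsgi import get_wsgi_application
    from whitenoise import WhiteNoise

    application = get_wsgi_application()
    # root: wherever collectstatic writes (STATIC_ROOT); prefix: the path part of STATIC_URL
    application = WhiteNoise(application, root='staticfiles', prefix='static/')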

Rebs