
I'm using django-pipeline with S3. I'm successfully using collectstatic to combine my JavaScript files and store them in my S3 bucket, but they are not getting compressed for some reason (verified by looking at the file, its size, and its Content-Encoding header). Otherwise things are working correctly with the combined scripts.js that is produced.

Here are the changes I made to use django-pipeline:

  1. Added 'pipeline' to INSTALLED_APPS.
  2. Added 'pipeline.finders.PipelineFinder' to STATICFILES_FINDERS.
  3. Set STATICFILES_STORAGE = 'mysite.custom_storages.S3PipelineManifestStorage', where this class is defined per the documentation, as seen below.
  4. Set PIPELINE_JS as seen below, which works but just isn't compressed.
  5. Set PIPELINE_ENABLED = True, since DEBUG = True and I'm running locally.
  6. Set PIPELINE_JS_COMPRESSOR = 'pipeline.compressors.yuglify.YuglifyCompressor', even though this should be the default.
  7. Installed the Yuglify compressor with npm install -g yuglify.
  8. Set PIPELINE_YUGLIFY_BINARY = '/usr/local/bin/yuglify', even though the default found via env should work.
  9. Used the {% load pipeline %} and {% javascript 'scripts' %} template tags, which work.
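Taken together, the relevant settings from the steps above look roughly like this (a condensed sketch; the app list and bucket-specific values are placeholders, and only the pipeline-related lines come from the question):

```python
# settings.py -- condensed sketch of the setup steps above
INSTALLED_APPS = (
    # ... your other apps ...
    'pipeline',
)

STATICFILES_FINDERS = (
    'django.contrib.staticfiles.finders.FileSystemFinder',
    'django.contrib.staticfiles.finders.AppDirectoriesFinder',
    'pipeline.finders.PipelineFinder',
)

STATICFILES_STORAGE = 'mysite.custom_storages.S3PipelineManifestStorage'

PIPELINE_ENABLED = True  # force pipeline on even though DEBUG = True locally
PIPELINE_JS_COMPRESSOR = 'pipeline.compressors.yuglify.YuglifyCompressor'
PIPELINE_YUGLIFY_BINARY = '/usr/local/bin/yuglify'
```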

More detail:

PIPELINE_JS = {
    'scripts': {
        'source_filenames': (
            'lib/jquery-1.11.1.min.js',
            ...            
        ),
        'output_filename': 'lib/scripts.js',
    }
}

class S3PipelineManifestStorage(PipelineMixin, ManifestFilesMixin, S3BotoStorage):
    location = settings.STATICFILES_LOCATION

As mentioned, collectstatic does produce scripts.js just not compressed. The output of that command includes:

Post-processed 'lib/scripts.js' as 'lib/scripts.js'

I'm using Django 1.8, django-pipeline 1.5.2, and django-storages 1.1.8.

John Lehmann
    Have you tried with `gzip` compression? We had gone through a similar error; full discussion at https://github.com/cyberdelia/django-pipeline/issues/312 – Shaikhul Jul 28 '15 at 15:55
  • I've found Pipeline is difficult to setup vs. django compressor. If you aren't using a task manager like gulp or grunt, I'd suggest trying that instead. – YPCrumble Jul 28 '15 at 19:23
  • @Shaikhul thank you, I had actually already found 312 and was trying to decipher. Can you provide an example for "you would need to change the staticfiles storage url method to return .gz urls (and staticfiles/pipeline template tags depending if you care for clients that don't support gzip). https://github.com/cyberdelia/django-pipeline/issues/312#issuecomment-33755035 Also don't forget to setup the proper header on s3 to serve theses assets as being gzipped." – John Lehmann Jul 29 '15 at 01:20

1 Answer


The missing step was to also extend GZIPMixin, and it has to be first in the list of parent classes:

from django.conf import settings
from django.contrib.staticfiles.storage import ManifestFilesMixin
from pipeline.storage import GZIPMixin, PipelineMixin
from storages.backends.s3boto import S3BotoStorage

class S3PipelineManifestStorage(GZIPMixin, PipelineMixin, ManifestFilesMixin, S3BotoStorage):
    location = settings.STATICFILES_LOCATION

Now collectstatic produces a .gz version of each file as well, but my templates still weren't referencing the .gz version.

To address this, the author says:

To make it work with S3, you would need to change the staticfiles storage url method to return .gz urls (and staticfiles/pipeline template tags depending if you care for clients that don't support gzip). Also don't forget to setup the proper header on s3 to serve theses assets as being gzipped.

I adapted an example he provided elsewhere, which overrides the url method:

from django.conf import settings
from django.contrib.staticfiles.storage import ManifestFilesMixin
from django.contrib.staticfiles.utils import matches_patterns
from pipeline.storage import GZIPMixin, PipelineMixin
from storages.backends.s3boto import S3BotoStorage

class S3PipelineManifestStorage(GZIPMixin, PipelineMixin, ManifestFilesMixin, S3BotoStorage):
    location = settings.STATICFILES_LOCATION

    def url(self, name, force=False):
        # Add *.css if you are compressing those as well.
        gzip_patterns = ("*.js",)
        # Skip GZIPMixin's own url() so we control the suffix ourselves.
        url = super(GZIPMixin, self).url(name, force)
        if matches_patterns(name, gzip_patterns):
            return "{0}.gz".format(url)
        return url
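The pattern check itself is simple: `matches_patterns` (from `django.contrib.staticfiles.utils`) is essentially `fnmatch` applied to each glob pattern. The rewriting logic can be sketched in isolation like this (the helper names here are illustrative, not part of django-pipeline):

```python
from fnmatch import fnmatch

def matches_patterns(path, patterns):
    # Same idea as django.contrib.staticfiles.utils.matches_patterns:
    # True if the path matches any of the glob patterns.
    return any(fnmatch(path, pattern) for pattern in patterns)

def gzipped_url(url, name, gzip_patterns=("*.js",)):
    # Illustrative helper: append .gz only for files we actually compressed.
    if matches_patterns(name, gzip_patterns):
        return "{0}.gz".format(url)
    return url
```

So `gzipped_url("https://bucket.s3.amazonaws.com/lib/scripts.js", "lib/scripts.js")` returns the `.gz` URL, while a `.css` file passes through unchanged unless you add `"*.css"` to the patterns.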

This still doesn't handle setting the Content-Encoding header.

A simpler alternative is to use the django-storages S3Boto option AWS_IS_GZIPPED, which performs the gzipping AND sets the appropriate Content-Encoding header.
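In settings terms, that alternative is roughly the following (a sketch: AWS_IS_GZIPPED and GZIP_CONTENT_TYPES are django-storages boto-backend settings, and the content-type tuple below mirrors what I understand the library's defaults to be; verify against your django-storages version):

```python
# settings.py -- hedged sketch of the django-storages gzip option
AWS_IS_GZIPPED = True  # gzip matching files on upload and set Content-Encoding
GZIP_CONTENT_TYPES = (
    'text/css',
    'application/javascript',
    'application/x-javascript',
)
```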

More is required to support clients without gzip, however.

Also useful are these instructions from Amazon on serving compressed files from S3.

John Lehmann