
I have set up my Django REST API to use local storage in DEBUG mode and S3 storage in production. This works well for public files, because I override DEFAULT_FILE_STORAGE like so:

if IS_DEBUG:
    DEFAULT_FILE_STORAGE = 'api.storage_backends.PublicMediaStorage'

and every FileField uses it automatically. Now I want to use private S3 storage the same way, but because I have to define the storage explicitly (FileField(storage=PrivateMediaStorage())), the S3 storage is always used.

How can I use the local storage instead of S3 storage when in DEBUG mode?

PS: I have already thought about changing the model to either use a FileField with or without an explicit storage depending on the DEBUG mode. This did not fully solve my problem, because my migrations are created in DEBUG mode and thus always contain the model without the private storage class.

UPDATE: I am looking for a solution that shares the same migrations in both environments and only lazily instantiates the actual storage class at runtime, just like Django already handles DEFAULT_FILE_STORAGE.

finngu
    Why not create a subclass of `PrivateMediaStorage` that does your `if IS_DEBUG:` and then do `FileField(storage=MyPrivateMediaStorage())`. – Red Cricket Dec 23 '19 at 18:45
  • Would that class need to subclass the boto3 class as well as the default storage? – finngu Dec 23 '19 at 20:35
  • Upgrading to Django 3.1+ fixes this. See answer here: https://stackoverflow.com/questions/32349635/django-migrations-and-filesystemstorage-depending-on-settings/68383051#68383051 – swinters Jul 14 '21 at 18:05

4 Answers


The best solution is to use FileField without explicit storage class.

# settings.py

if DEBUG:
    DEFAULT_FILE_STORAGE = 'api.storage_backends.PublicMediaStorage'
else:
    DEFAULT_FILE_STORAGE = 'api.storage_backends.PrivateMediaStorage'


# models.py
class Foo(models.Model):
    file = models.FileField() # without storage

During the file upload process, Django will call the DEFAULT_FILE_STORAGE class in a lazy fashion.
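To make the "lazy fashion" concrete, here is a framework-free sketch of how a dotted-path setting can be resolved only on first use, similar in spirit to what Django does with DEFAULT_FILE_STORAGE (`load_class` is an illustrative name, not a Django API):

```python
# Nothing is imported at definition time; the module behind the dotted
# path is only imported (and then cached) on the first call.
from functools import lru_cache
from importlib import import_module

@lru_cache(maxsize=None)
def load_class(dotted_path):
    """Import 'pkg.module.ClassName' on first use and return the class."""
    module_path, _, class_name = dotted_path.rpartition('.')
    return getattr(import_module(module_path), class_name)
```

Because resolution happens at call time, changing the setting per environment changes which class is used without the model definition ever naming it.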

Note: with these settings, the generated migrations will not include a storage argument on the FileField.


UPDATE-1

If you want more control over the storage, create your own custom file field and wire it up in the models:

def get_storage():
    """
    Change this function to select the storage however you need
    """
    from django.conf import settings
    from api.storage_backends import PublicMediaStorage, PrivateMediaStorage
    if settings.DEBUG:
        return PublicMediaStorage()
    else:
        return PrivateMediaStorage()


class CustomFileField(models.FileField):
    def __init__(self, *args, **kwargs):
        kwargs['storage'] = get_storage() # calling external function
        super().__init__(*args, **kwargs)


class Foo(models.Model):
    file = CustomFileField() # use custom filefield here
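One caveat: because the storage instance is assigned in `__init__`, Django will still serialize it into migrations whenever it differs from the default storage, which recreates the original migration problem. A sketch of a `deconstruct()` override that drops the kwarg; a minimal stand-in base class is used here so the idea runs outside a Django project, but in practice you would mix this into models.FileField (all names below are illustrative):

```python
class FakeFileField:
    """Stand-in for models.FileField, for illustration only."""
    def __init__(self, **kwargs):
        self.kwargs = kwargs

    def deconstruct(self):
        # Django's real deconstruct() also returns (name, path, args, kwargs)
        return ('file', 'demo.FakeFileField', [], dict(self.kwargs))

class MigrationSafeFieldMixin:
    """Keep the storage instance out of serialized migrations."""
    def deconstruct(self):
        name, path, args, kwargs = super().deconstruct()
        kwargs.pop('storage', None)  # never serialize the storage
        return name, path, args, kwargs

class MigrationSafeFileField(MigrationSafeFieldMixin, FakeFileField):
    pass

field = MigrationSafeFileField(storage=object(), upload_to='docs/')
name, path, args, kwargs = field.deconstruct()
```

With the storage kwarg stripped, the migration file looks the same in every environment.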
JPG
  • Checking if in DEBUG mode would be a great solution, I would also consider adding another variable like DEBUG_STORAGE so you can also debug just your storage if ever needed... – Jody Fitzpatrick Dec 23 '19 at 20:18
  • This is exactly what I am doing right now for public S3 storage, but this does not solve my problem. I need public as well as private S3 storage in production and cannot set two default storage classes. – finngu Dec 23 '19 at 20:33
  • @finngu I don't understand. Do you need to use `PublicMediaStorage` and `PrivateMediaStorage` in the same environment? – JPG Dec 24 '19 at 03:02
  • 2
    @finngu check the ***`UPDATE-1`*** section – JPG Dec 24 '19 at 03:45
  • Thank you for the answer. I have accepted Thomas Matecki's answer though because it allows me to use the default FileField. I know this is a small detail for most people, but when working with multiple developers, things like this can be forgotten easily. – finngu Dec 30 '19 at 09:15

It sounds like the tricky part here is having both public and private media storage in a single project.

The example below assumes you are using django-storages, but the technique should work regardless.

Define a private storage by extending the S3BotoStorage class.

If using S3, it is probably prudent to store private and public files in different S3 buckets. This custom storage lets you specify the bucket via settings.

# yourapp/custom_storage.py

from django.conf import settings
from django.core.files.storage import get_storage_class
from storages.backends.s3boto import S3BotoStorage

class S3PrivateStorage(S3BotoStorage):
    """
    Optional   
    """
    default_acl = "private"               # this does the trick

    def __init__(self):
        super(S3PrivateStorage, self).__init__()
        self.bucket_name = settings.S3_PRIVATE_STORAGE_BUCKET_NAME


# important
private_storage_class = get_storage_class(settings.PRIVATE_FILE_STORAGE)

private_storage = private_storage_class() # instantiate the storage

The important part is the last two lines of this file: they declare private_storage for use in your FileField:

from yourapp.custom_storage import private_storage
...
class YourModel(Model):

    the_file = models.FileField(
                   upload_to=..., 
                   storage=private_storage)
...

Finally, in your settings file, something like this should do:

# settings.py

if DEBUG:
    # In debug mode, store everything on the filesystem
    DEFAULT_FILE_STORAGE = 'django.core.files.storage.FileSystemStorage'
    PRIVATE_FILE_STORAGE = 'django.core.files.storage.FileSystemStorage'
else:
    # In production store public things using S3BotoStorage and private things
    # in a custom storage
    DEFAULT_FILE_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
    PRIVATE_FILE_STORAGE = 'yourapp.custom_storage.S3PrivateStorage'

As a last piece of unsolicited advice: it is often useful to decouple the storage settings from DEBUG mode and allow all of the parameters above to be specified in environment variables. It is likely that at some point you will want to run your app in debug mode using a production-like storage configuration.
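A sketch of that decoupling, reading the backend choice from an environment variable instead of DEBUG (the FILE_STORAGE_BACKEND variable and helper name are made up for illustration):

```python
# settings.py sketch: the environment, not DEBUG, decides the backends.
import os

def storage_settings_from_env():
    """Return (DEFAULT_FILE_STORAGE, PRIVATE_FILE_STORAGE) dotted paths."""
    backend = os.environ.get('FILE_STORAGE_BACKEND', 'local')
    if backend == 's3':
        return ('storages.backends.s3boto.S3BotoStorage',
                'yourapp.custom_storage.S3PrivateStorage')
    return ('django.core.files.storage.FileSystemStorage',
            'django.core.files.storage.FileSystemStorage')

DEFAULT_FILE_STORAGE, PRIVATE_FILE_STORAGE = storage_settings_from_env()
```

This way a staging box, or a developer debugging an S3 issue, can run with production-like storage by exporting one variable.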

Thomas Matecki
  • Thank you, this lazy instantiation is exactly what I was looking for. I already have the storage mode decoupled from DEBUG, just wanted to make the question a bit more general. Still good advice to other people looking this up :) – finngu Dec 30 '19 at 09:06
  • 3
    Two more things: 1. I had to replace `'django.files.storage.FileSystemStorage'` with `'django.core.files.storage.FileSystemStorage'`. 2. Apparently the lazy instantiation does not solve the problem with migrations. When I create migrations, the storageclass is either `FileSystemStorage` or `S3PrivateStorage`, but never `private_storage`. Thus I cannot apply my local migrations on the remote system. – finngu Jan 03 '20 at 09:39

Thomas's accepted answer is almost perfect. It has a small migration problem when you work with different settings for local development and production.

Suppose you set storage to FileSystemStorage in local environment and S3PrivateStorage in production. If you run makemigrations in the local environment, the migration file will set the storage field for your FileField to a different value than if you run makemigrations in the production environment.

Fortunately, a feature added in Django 3.1 solves this with a slight change to Thomas's answer. Instead of using private_storage, which is an instance of a storage class, we can exploit the fact that storage may now be a callable, and write a function that returns the proper storage.

Then, the code (adapted from Thomas's answer) would be:

# yourapp/custom_storage.py

from django.conf import settings
from django.core.files.storage import get_storage_class
from storages.backends.s3boto import S3BotoStorage

class S3PrivateStorage(S3BotoStorage):
    """
    Optional   
    """
    default_acl = "private"               # this does the trick

    def __init__(self):
        super(S3PrivateStorage, self).__init__()
        self.bucket_name = settings.S3_PRIVATE_STORAGE_BUCKET_NAME

def select_private_storage():
    # important
    private_storage_class = get_storage_class(settings.PRIVATE_FILE_STORAGE)
    return private_storage_class() # instantiate the storage
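The reason this fixes the migration problem, sketched without Django: the migration writer records a callable by its import path, which is identical in every environment, whereas an instance would be serialized as whatever concrete class the current environment happened to produce (`serialize_for_migration` below is a simplified stand-in, not Django's real serializer API):

```python
DEBUG = True  # stands in for settings.DEBUG

class LocalStorage: pass
class S3PrivateStorage: pass

def select_private_storage():
    return LocalStorage() if DEBUG else S3PrivateStorage()

def serialize_for_migration(storage):
    """Simplified: functions are recorded by import path, instances baked in."""
    if callable(storage) and not isinstance(storage, type):
        return f'{storage.__module__}.{storage.__qualname__}'
    return f'{type(storage).__module__}.{type(storage).__qualname__}()'

# The recorded form names the function, never the environment's class:
recorded = serialize_for_migration(select_private_storage)
```

So both environments write the same reference into the migration, and the actual storage is only chosen when the callable runs.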

and then in your field set the storage accordingly

from yourapp.custom_storage import select_private_storage
...
class YourModel(Model):

    the_file = models.FileField(
        upload_to=..., 
        storage=select_private_storage # notice we're using the callable
    )
...
Felipe Ferri

Django 4.2 introduced the STORAGES setting, along with a storages object for looking backends up by alias.

This removes the need for the workarounds above by making each storage a reference-able object:

# settings.py
STORAGES = {
    "default": {
        "BACKEND": "django.core.files.storage.FileSystemStorage",
    },
    "custom_storage": {
        "BACKEND": "django.core.files.storage.FileSystemStorage",
    },
    "staticfiles": {
        "BACKEND": "django.contrib.staticfiles.storage.StaticFilesStorage",
    },
}

# example_app/models.py

from django.core.files.storage import storages

...
    avatar = models.FileField(
        blank=True,
        null=True,
        storage=storages["custom_storage"]
    )
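To get the same per-environment switch with the new setting, the dict can be built conditionally. A sketch, where the "custom_storage" alias and `build_storages` helper are illustrative and the S3 backend assumes django-storages:

```python
def build_storages(debug: bool) -> dict:
    """Return a Django 4.2+ STORAGES dict for the given environment."""
    if debug:
        default = private = "django.core.files.storage.FileSystemStorage"
    else:
        default = "storages.backends.s3boto3.S3Boto3Storage"
        private = "yourapp.custom_storage.S3PrivateStorage"
    return {
        "default": {"BACKEND": default},
        "custom_storage": {"BACKEND": private},
        "staticfiles": {
            "BACKEND": "django.contrib.staticfiles.storage.StaticFilesStorage"
        },
    }

STORAGES = build_storages(debug=True)
```

The model code keeps referring to `storages["custom_storage"]` and never needs to know which backend the alias resolves to.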

Also, note that get_storage_class is deprecated as of Django 4.2.

nitsujri