
I wish to allow registered users of a Django app to upload files to an S3 bucket and view them.

With the help of the first commenters on this question and this answer on Stack Overflow, I was able to get this working using generated presigned URLs, with no need to allow public access or attach any policies to my S3 bucket.
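
For context, django-storages produces these presigned URLs through boto3. A minimal sketch of the underlying call (the bucket and key below are placeholders, not my real values):

import boto3

# Sketch of the call django-storages makes under the hood via boto3;
# 'my-bucket' and the key are placeholders.
s3 = boto3.client('s3')

url = s3.generate_presigned_url(
    ClientMethod='get_object',
    Params={'Bucket': 'my-bucket', 'Key': 'media/private/example.pdf'},
    ExpiresIn=3600,  # link expires after one hour
)
print(url)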

Can anyone help with what my bucket policy and settings should be to allow a custom domain and avoid pre-signed URLs for static files?

Many thanks. -- R

Settings.py

AWS_ACCESS_KEY_ID = env('AWS_ACCESS_KEY_ID')
AWS_SECRET_ACCESS_KEY = env('AWS_SECRET_ACCESS_KEY')
AWS_STORAGE_BUCKET_NAME = env('AWS_STORAGE_BUCKET_NAME')
AWS_S3_OBJECT_PARAMETERS = {'CacheControl': 'max-age=86400',}
AWS_STATIC_LOCATION = 'static'
STATICFILES_STORAGE = f'{ROOT_NAME}.storage_backends.StaticStorage'
AWS_PUBLIC_MEDIA_LOCATION = 'media/public'
DEFAULT_FILE_STORAGE = f'{ROOT_NAME}.storage_backends.PublicMediaStorage'
AWS_PRIVATE_MEDIA_LOCATION = 'media/private'
PRIVATE_FILE_STORAGE = f'{ROOT_NAME}.storage_backends.PrivateMediaStorage'

storage_backends.py

from django.conf import settings
from storages.backends.s3boto3 import S3Boto3Storage

class StaticStorage(S3Boto3Storage):
    """Static assets, stored under the 'static/' prefix."""
    location = settings.AWS_STATIC_LOCATION

class PublicMediaStorage(S3Boto3Storage):
    """Public uploads; existing files are kept rather than overwritten."""
    location = settings.AWS_PUBLIC_MEDIA_LOCATION
    file_overwrite = False

class PrivateMediaStorage(S3Boto3Storage):
    """Private uploads: private ACL, served via presigned URLs."""
    location = settings.AWS_PRIVATE_MEDIA_LOCATION
    default_acl = 'private'
    file_overwrite = False
    custom_domain = False  # disable custom-domain URLs so .url returns a presigned link

views.py


from django.urls import reverse_lazy
from django.views.generic.edit import CreateView

from .models import Document  # adjust to your app's import path

class DocumentCreateView(CreateView):
    model = Document
    fields = ['upload']
    success_url = reverse_lazy('home')

    def get_context_data(self, **kwargs):
        # Expose the existing documents to the template alongside the upload form.
        context = super().get_context_data(**kwargs)
        context['documents'] = Document.objects.all()
        return context

models.py

from django.db import models
from django.contrib.auth.models import User

from mysite.storage_backends import PrivateMediaStorage

class Document(models.Model):
    # Plain FileField: stored via DEFAULT_FILE_STORAGE (PublicMediaStorage).
    uploaded_at = models.DateTimeField(auto_now_add=True)
    upload = models.FileField()

class PrivateDocument(models.Model):
    # Explicit private storage: objects get a private ACL and are served
    # through presigned URLs rather than a public or custom domain.
    uploaded_at = models.DateTimeField(auto_now_add=True)
    upload = models.FileField(storage=PrivateMediaStorage())
    user = models.ForeignKey(User, related_name='documents', on_delete=models.CASCADE)
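
Because PrivateMediaStorage sets custom_domain = False (and django-storages' querystring auth is on by default), a fresh presigned URL is generated each time the FileField's .url is read. A minimal illustration, with a hypothetical lookup:

# Hypothetical lookup: any PrivateDocument instance works the same way.
doc = PrivateDocument.objects.first()
if doc is not None:
    # django-storages signs the URL on every access; the expiring
    # signature is carried in the query string.
    signed_url = doc.upload.url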

Comments:

  • You already have `s3:*` for all principals (not best practice, but OK for debugging), so the other parts of your policy are not needed. Can you show the part of the code where you are actually doing the upload? Since you've put `s3:*`, the only other thing that comes to mind is that you're trying to upload the file to some other bucket (or you have an explicit deny for the IAM user you use for the upload). – Caldazar May 12 '22 at 13:32
  • @Caldazar much appreciated. Until I get things working I've tried to avoid any complications, so there's public access and "block all public access" is off, but I'm not clear how to troubleshoot further. The same bucket is working to view images on the site. I've added the relevant code; let me know if it would be useful to see more. Thanks. – Roger May 12 '22 at 14:26
  • The two policies shown appear to be a) an IAM policy associated with your IAM user allowing all S3 permissions on the bucket and b) an S3 bucket policy allowing unauthenticated download of files from the bucket. I would first use another client, such as the awscli, with the same IAM user credentials, to verify if you can upload to your bucket. – jarmod May 12 '22 at 14:32
  • @jarmod top tip, thanks... "aws s3 cp static/img/roger2022.jpg s3://mybucket/images/" worked fine. So I can definitely upload. Any other tips gratefully received... – Roger May 12 '22 at 14:47
  • Is there anything unusual about the key of the object that you are trying to upload to? Does it contain any unusual characters, e.g. brackets, parentheses, etc.? Does it start with a /? – jarmod May 12 '22 at 15:00
  • The bucket name is just four lower case letters and 3 numbers. The fact that I can upload from the cli leads me to think maybe it’s the code, which I took from a tutorial. But the lack of info from aws is frustrating. – Roger May 12 '22 at 15:07
  • That's the bucket name, but what about the key? Also, double-check that the AWS creds your Python app is actually using with boto3 storages are what you expected. – jarmod May 12 '22 at 15:39
  • Key is upper case letters then 2 numbers. I'm not sure what I changed, but now collectstatic is not uploading to my s3 bucket. – Roger May 12 '22 at 16:07
  • FYI, after checking everything carefully I was able to get this working with pre-signed URLs, as now shown in the edited question above, with no need for a policy or public access. I was hoping to use specific URLs for at least the static assets though, so I've updated my question. Thanks for your help. – Roger May 16 '22 at 08:28
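
Following jarmod's suggestion in the comments, a minimal sketch for checking, outside Django, that the same credentials can upload. The bucket, key, and credential values are placeholders:

import boto3

# Placeholders throughout: substitute the same values your Django
# settings read from the environment.
session = boto3.Session(
    aws_access_key_id='AKIA...',
    aws_secret_access_key='...',
)

# Confirm which IAM identity these credentials resolve to.
print(session.client('sts').get_caller_identity()['Arn'])

# Attempt a small upload to the same prefix the app writes to.
session.client('s3').put_object(
    Bucket='my-bucket',
    Key='media/private/test.txt',
    Body=b'hello',
)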

1 Answer


I got everything working as needed with these settings and the help of this tutorial. I hope this is useful to someone.

AWS_STATIC_LOCATION = 'static'
STATICFILES_DIRS = [os.path.join(BASE_DIR, 'static')]
STATICFILES_STORAGE = f'{ROOT_NAME}.storage_backends.StaticStorage'
AWS_PUBLIC_MEDIA_LOCATION = 'media/public'
DEFAULT_FILE_STORAGE = f'{ROOT_NAME}.storage_backends.PublicMediaStorage'
AWS_PRIVATE_MEDIA_LOCATION = 'media/private'
PRIVATE_FILE_STORAGE = f'{ROOT_NAME}.storage_backends.PrivateMediaStorage'
AWS_DEFAULT_ACL = None  # 'public-read'
# AWS_QUERYSTRING_AUTH = False
MEDIA_URL = "/media/"
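
On the custom-domain part of the question: django-storages also supports serving a backend from a custom domain without signed query strings, via per-class attributes. A minimal sketch, assuming the static/ prefix is publicly readable (e.g. through a bucket policy or a CloudFront distribution) and using a placeholder domain:

class StaticStorage(S3Boto3Storage):
    location = settings.AWS_STATIC_LOCATION
    custom_domain = 'cdn.example.com'  # placeholder: your CloudFront/CNAME domain
    querystring_auth = False           # plain public URLs, no expiring signature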