
I am working on a REST API (using Django REST Framework). I am trying to upload a video by sending a POST request to an endpoint I made.

Issue

The video does upload to the S3 bucket, but the upload progress jumps to 100% within a couple of seconds, no matter how large the file is.

Why is this happening, and how can I solve it?

PS: Previously I was uploading to local storage, and the upload progress was working fine. I am using React.

Harshit Gangwar
  • Where is your upload progress showing 100%? Is it on the React side? What are you using to upload there? – monkut Oct 24 '20 at 08:33
  • I am using Axios to send the request and calculate the progress. Yes, the progress I am seeing is on the React side. The terminal running the Django server responds to the POST request just fine after the file upload. The progress on the React side finishes in a blink. – Harshit Gangwar Oct 24 '20 at 16:40
  • I figured that we can't show progress unless we are using WebSockets to connect to the frontend. However, I still don't know how to get the progress from S3 on the backend. – Harshit Gangwar Aug 10 '23 at 05:10
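
[Editor's note] The last comment points at the root cause: Axios's onUploadProgress only measures the browser-to-Django leg of the transfer, while the Django-to-S3 leg is invisible to the client. For the backend half, boto3's transfer calls accept a Callback that is invoked periodically with the number of bytes sent, so the server can at least track its own S3 progress (relaying that to the browser would still need polling or WebSockets). A minimal sketch, assuming a plain boto3 client; the ProgressPercentage class and the file/bucket names are illustrative:

import os
import threading

import boto3


class ProgressPercentage:
    """Callable handed to boto3; it is invoked with the bytes sent so far."""

    def __init__(self, total_size):
        self._total_size = total_size
        self._seen_so_far = 0
        self._lock = threading.Lock()  # boto3 may invoke this from worker threads

    def __call__(self, bytes_amount):
        with self._lock:
            self._seen_so_far += bytes_amount
            percent = (self._seen_so_far / self._total_size) * 100
            print(f"Uploaded {self._seen_so_far}/{self._total_size} bytes ({percent:.1f}%)")


s3 = boto3.client('s3')
with open('video.mp4', 'rb') as f:
    size = os.fstat(f.fileno()).st_size
    s3.upload_fileobj(f, 'your-bucket-name', 'videos/video.mp4',
                      Callback=ProgressPercentage(size))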

1 Answer


First of all, make sure you've installed these libraries: boto3==1.14.53, botocore==1.17.53, s3transfer==0.3.3, django-storages==1.10.
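
For example, with pip (using the pinned versions above):

pip install boto3==1.14.53 botocore==1.17.53 s3transfer==0.3.3 django-storages==1.10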

settings.py:

INSTALLED_APPS = [
    # ... your other apps ...
    'storages',
]


AWS_ACCESS_KEY_ID = 'your-key-id'
AWS_SECRET_ACCESS_KEY = 'your-secret-key'
AWS_STORAGE_BUCKET_NAME = 'your-bucket-name'
AWS_S3_CUSTOM_DOMAIN = '%s.s3.amazonaws.com' % AWS_STORAGE_BUCKET_NAME

AWS_S3_OBJECT_PARAMETERS = {
    'CacheControl': 'max-age=86400',
}
DEFAULT_FILE_STORAGE = 'your_project_name.storage_backends.MediaStorage'  # hyphens are not valid in module paths

MEDIA_URL = "https://%s/" % AWS_S3_CUSTOM_DOMAIN

# File upload settings
BASE_URL = 'http://example.com'
FILE_UPLOAD_PERMISSIONS = 0o640
DATA_UPLOAD_MAX_MEMORY_SIZE = 500024288000
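
To confirm the backend is actually picked up (a quick check, assuming the DEFAULT_FILE_STORAGE path above matches your project package), you can inspect the default storage from python manage.py shell:

from django.core.files.storage import default_storage

# Should print the MediaStorage class defined in storage_backends.py
print(default_storage.__class__)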

Then create a storage_backends.py file inside your project folder (the one containing settings.py).

storage_backends.py:

import os
from tempfile import SpooledTemporaryFile

from storages.backends.s3boto3 import S3Boto3Storage


class MediaStorage(S3Boto3Storage):
    bucket_name = 'your-bucket-name'
    file_overwrite = False

    def _save(self, name, content):
        """
        Save via a clone of the content file: when the file is passed to
        boto3 it is wrongly closed after the upload, whereas the storage
        backend expects it to still be open.
        """
        # Seek our content back to the start
        content.seek(0, os.SEEK_SET)

        # Create a temporary file that spills to disk above a certain size.
        # It is deleted automatically when boto3 closes it, or at the latest
        # when the `with` block exits.
        with SpooledTemporaryFile() as content_autoclose:
            # Copy the original content into the clone that boto3 may close
            content_autoclose.write(content.read())

            # Upload the clone; boto3 closes content_autoclose, not content
            return super()._save(name, content_autoclose)
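
With DEFAULT_FILE_STORAGE pointing at this backend, any FileField is saved to S3 automatically. A minimal sketch of a model using it; the Video model and its field name are hypothetical:

from django.db import models


class Video(models.Model):
    # Saved through MediaStorage, i.e. uploaded to the S3 bucket
    file = models.FileField(upload_to='videos/')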


satyajit
  • I did exactly what you told me @xxnora. The file is uploading, but the progress is finishing in just a blink. I put some print statements in the file; storage_backends is also running. What could be the reason it is still not fixed? – Harshit Gangwar Oct 24 '20 at 17:06