
I have a Django web application that is deployed in production with Caddy. I use Caddy as a reverse proxy pointing to Daphne, which in turn serves my Django app. However, when I try to upload a 5MB file to the Django admin in production, I get a 413 error. In debug mode, when I am just using Django (without Caddy or Daphne), I do not get this error. Anyone have any ideas? Here are my Caddyfile and related files:

Caddyfile

0.0.0.0:2015
on startup daphne peptidedb.asgi:application &

header / {
  -Server

    # be sure to plan & test before enabling
    # Strict-Transport-Security "max-age=63072000; includeSubDomains; preload"

    Referrer-Policy "same-origin"
    X-XSS-Protection "1; mode=block"
    X-Content-Type-Options "nosniff"

    # customize for your app
    #Content-Security-Policy "connect-src 'self'; default-src 'none'; font-src 'self'; form-action 'self'; frame-ancestors 'none'; img-src data: 'self'; object-src 'self'; style-src 'self'; script-src 'self';"
    X-Frame-Options "DENY"
}

proxy / localhost:8000 {
    transparent
    websocket
    except /static
}

limits 750000000

log / stdout "{combined}"

errors stdout

asgi.py

import os

from channels.routing import get_default_application
import django

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "peptidedb.settings")
django.setup()
application = get_default_application()

wsgi.py

import os

from django.core.wsgi import get_wsgi_application

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "peptidedb.settings")

application = get_wsgi_application()
  • I have used Django Channels in production, but with NGINX, supervisor, Daphne, asgi.py, and Redis. In that setup, large files don't have any issues. – Ishwar Jangid Feb 05 '19 at 00:54
  • Interesting. I am not sure where my 413 error is coming from then. Somewhere there is an arbitrary file limit set. I have ruled out Django and Caddy because of the above. – RubyJ Feb 05 '19 at 16:08

1 Answer


It looks like when the Django app is deployed with Channels, Daphne, and Caddy, this setting in settings.py takes effect:

DATA_UPLOAD_MAX_MEMORY_SIZE = 1024 # value in bytes; Django's default is 2621440 (2.5 MB)

I had to add this setting to my settings file, and then my larger file upload works. The weird part is that I did not need this setting when the app was deployed with only Django in debug mode. I wonder if my app, when running inside the Docker container, is not able (permissions? size?) to write/stream the big file to disk, as that is normal Django behavior for large uploads (instead of holding them in memory).
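
The arithmetic lines up: Django's built-in default for DATA_UPLOAD_MAX_MEMORY_SIZE is 2,621,440 bytes (2.5 MB), so a 5 MB upload exceeds it. A minimal sketch of the settings.py change (the 10 MB figure here is an example value I chose, not from the original answer):

```python
# Example settings.py override (10 MB is an assumed example value):
DATA_UPLOAD_MAX_MEMORY_SIZE = 10 * 1024 * 1024  # in bytes

DJANGO_DEFAULT = 2_621_440   # Django's built-in default: 2.5 MB in bytes
FIVE_MB = 5 * 1024 * 1024    # the upload size that triggered the 413

print(FIVE_MB > DJANGO_DEFAULT)                # the 5 MB file exceeds the default
print(FIVE_MB <= DATA_UPLOAD_MAX_MEMORY_SIZE)  # but fits under the raised limit
```

Any value larger than the biggest expected upload works; the setting is a DoS guard, so it shouldn't be raised further than needed.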
