
I am using Daphne for both WebSocket and HTTP connections. I am running 4 worker containers, and everything is running locally in Docker right now.

My Daphne server fails if I try to upload a file that is 400 MB. It works fine for smaller files, up to 15 MB.

My Docker container quits with error code 137. I don't get any error in the Daphne logs; the Daphne container just dies, but the worker containers keep running.

Does anyone know if there is a way to increase the upload limit on Daphne, or am I missing something else?

I start the Daphne server with `daphne -b 0.0.0.0 -p 8001 project.asgi:channel_layer --access-log=${LOGS}/daphne.access.log`.

tinyhook
  • Nothing being logged by Daphne suggests that it might be running out of memory... – solarissmoke Oct 28 '17 at 04:33
  • That is what I was suspecting too. Is there a way to fix this? I am using `docker-compose up` to run my containers. I have looked around and seen the `oom-kill-disable` flag to pass to docker run, but I am not able to run using that. – tinyhook Oct 28 '17 at 05:26
  • I don't know enough about docker to say - if memory is the issue then you're going to have to find some way to increase what is available. – solarissmoke Oct 28 '17 at 05:33
  • @solarissmoke you were right in the sense that Docker was running out of memory. Once I allocated more memory, everything started working. – tinyhook Jan 15 '18 at 15:45

1 Answer


This is because Daphne loads the entire HTTP POST request body, completely and at once, before transferring control to Django/Channels.

All 400 MB of your upload are therefore loaded into RAM, and your Docker container is killed because it runs out of memory (which is what exit code 137 indicates).

This happens even before Django gets a chance to check the size of the request body. See here
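
For reference, the size check in question is presumably Django's request/upload settings. A minimal sketch below uses Django's documented default values, purely for illustration: tweaking them cannot prevent the crash, because they are only enforced once Django parses the request, i.e. after the ASGI server has already buffered the whole upload in memory.

```
# settings.py -- Django's documented defaults, shown for illustration only.
# Raising or lowering these does not stop Daphne itself from buffering the
# full upload in RAM before Django ever sees the request.
DATA_UPLOAD_MAX_MEMORY_SIZE = 2621440  # 2.5 MB: max non-file body size before RequestDataTooBig is raised
FILE_UPLOAD_MAX_MEMORY_SIZE = 2621440  # 2.5 MB: larger uploads are streamed to a temporary file on disk
```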

There is an open ticket here. If you want to work around this right now, use Uvicorn instead of Daphne: Uvicorn passes the body through to Django in chunks, and depending on the FILE_UPLOAD_MAX_MEMORY_SIZE Django setting you will end up with a temporary file on your hard disk (not in RAM). But you need to write your own AsyncHttpConsumer or AsgiHandler, because the AsgiHandler and AsgiRequest from Channels do not support a chunked body either. This will be possible after the PR.
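
To make the "chunked body" idea concrete, here is a rough, stand-alone sketch at the plain ASGI level. It is not Channels' AsgiHandler and not code from the ticket or PR above; the module name and the spool threshold are invented for illustration. It reads the `http.request` messages one chunk at a time and spools them to a temporary file, so a large upload never has to sit entirely in RAM. You could run it under Uvicorn, e.g. `uvicorn streaming_upload:application`:

```
# streaming_upload.py -- hypothetical example module, not part of Channels.
import tempfile

# Keep at most ~2.5 MB in memory before spilling to disk (mirrors Django's
# default FILE_UPLOAD_MAX_MEMORY_SIZE; pick whatever threshold you like).
SPOOL_MAX_SIZE = 2621440


async def application(scope, receive, send):
    # Only handle plain HTTP requests in this sketch.
    if scope["type"] != "http":
        return

    received = 0
    with tempfile.SpooledTemporaryFile(max_size=SPOOL_MAX_SIZE) as spool:
        # The ASGI server delivers the body as a series of "http.request"
        # messages; "more_body" tells us when the last chunk has arrived.
        while True:
            message = await receive()
            if message["type"] == "http.disconnect":
                return
            chunk = message.get("body", b"")
            spool.write(chunk)
            received += len(chunk)
            if not message.get("more_body", False):
                break

        # At this point `spool` holds the complete upload (in memory or on
        # disk depending on its size) and could be handed to parsing code.

    await send({
        "type": "http.response.start",
        "status": 200,
        "headers": [(b"content-type", b"text/plain")],
    })
    await send({
        "type": "http.response.body",
        "body": ("received %d bytes\n" % received).encode(),
    })
```

Note that, as explained above, running the same thing under Daphne would not help, because Daphne buffers the whole body before the application is ever called; the sketch only shows the chunked `http.request`/`more_body` protocol that the Uvicorn workaround relies on.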

don_vanchos