In my setup of Apache 2.2 (worker MPM) and Django 1.3 with mod_wsgi 2.8, I need to support large POST request payloads. The problem is that when there are many such simultaneous requests, Apache uses up all the memory in the system and then crashes. It appears that Apache buffers each request completely in memory before executing the WSGI handler and passing it the request. Is there any way to control request buffering in Apache? The log shows the following error whenever the crash happens:

[Wed Jun 29 18:35:27 2011] [error] cgid daemon process died, restarting

Here's my virtual host's configuration:

<VirtualHost *:8080>
    ServerName example.com
    ErrorLog /var/log/apache2/error.log

    WSGIScriptAlias / <path to django.wsgi>
    WSGIPassAuthorization on

    WSGIDaemonProcess example.com
    WSGIProcessGroup example.com

    XSendFileAllowAbove on
    XSendFile on 
</VirtualHost>
Mukul

1 Answer

Neither Apache nor mod_wsgi buffers the request. If any buffering is occurring, it is in Django or in your specific application code.

Depending on how a Python web application is written, it may naively read the whole request content into memory before processing it. To avoid this problem, it should stream the data instead.

You need to indicate which feature of Django, or other mechanism, you are using to handle the upload. Is it the standard POST form handling in Django, or something else?

Graham Dumpleton
  • Thanks, it was Django that was buffering the request. The solution I used was a FormData object with a file field as the XMLHttpRequest payload, which sends the payload as a multipart form. In the Django handler, I use request.FILES (instead of reading from request directly), which automatically streams the upload to disk if it is large. – Mukul Jul 04 '11 at 20:05