When I run a CGI script under Apache, any output it produces is sent to the client almost immediately. However, when I run the same script under nginx with fcgiwrap, nothing seems to reach the client until the script either finishes or has produced a lot of output. In particular, when using git-http-backend this leads to gateway timeouts when cloning large repositories (and to a lack of progress information when cloning smaller ones).
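For context, the git-http-backend side is wired through fcgiwrap with a location roughly like the one below; the URL prefix, repository root and socket path are illustrative rather than my exact values (the socket is the Debian default).

location ~ ^/git(/.*)$ {
    include       fastcgi_params;
    fastcgi_param SCRIPT_FILENAME     /usr/lib/git-core/git-http-backend;
    fastcgi_param GIT_HTTP_EXPORT_ALL "";
    fastcgi_param GIT_PROJECT_ROOT    /srv/git;
    fastcgi_param PATH_INFO           $1;
    # fcgiwrap listens on this socket by default on Debian
    fastcgi_pass  unix:/var/run/fcgiwrap.socket;
}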
This behaviour can be seen with the following script.
#!/bin/bash
echo "Content-type: text/html"
echo
while :
do
    echo this is a test.
    sleep 5
done
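One easy way to watch the output arrive is curl with -N, which disables curl's own output buffering (the URL is only an example of where the script might be mapped):

curl -N http://example.org/cgi-bin/test.cgi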
Under Apache the client receives a line of data every 5 seconds.
Under nginx with fcgiwrap I get no data at all, and eventually a gateway timeout.
nginx and fcgiwrap are the stock packages from Debian Jessie, versions 1.6.2-5+deb8u4 and 1.1.0-5 respectively.
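For completeness, the test script itself is served through a location along these lines; the cgi-bin path is just the conventional Debian location and the details are approximate:

location /cgi-bin/ {
    include       fastcgi_params;
    fastcgi_param SCRIPT_FILENAME /usr/lib/cgi-bin$fastcgi_script_name;
    fastcgi_pass  unix:/var/run/fcgiwrap.socket;
}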
So, my questions:
- Does anyone know which component is responsible for this behaviour? nginx itself? fcgiwrap? Both?
- Is it something that can be fixed through configuration? (The only directives I'm aware of are sketched after this list.)
- If it's an fcgiwrap problem, are there alternative CGI wrappers available that don't suffer from it?
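Regarding the second question, these are the only nginx-side directives I know of that touch response buffering; they are guesses on my part, and I don't know whether they can help if the buffering happens inside fcgiwrap rather than in nginx:

# Added inside the CGI location above; hedged, since the buffering
# may well be happening in fcgiwrap rather than in nginx itself.
fastcgi_buffering off;   # nginx >= 1.5.6: do not buffer the FastCGI response
gzip off;                # on-the-fly compression would otherwise buffer the stream too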