Using Django.
Writing a small spreadsheet file works fine, but once the export goes over roughly 700 rows I keep getting 502 Bad Gateway. The Nginx error log shows "upstream prematurely closed connection while reading response header from upstream", which points to the backend as the cause. However, the Django error log shows nothing.
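For reference, here is a minimal timing wrapper I could use to confirm how long the export itself takes (a sketch, assuming standard Python logging is configured; export_pending_cases and build_pending_cases_export are hypothetical names for the view and for the spreadsheet code shown further below):

import logging
import time

logger = logging.getLogger(__name__)

def export_pending_cases(request):
    # hypothetical wrapper: times the spreadsheet-building code shown below
    start = time.time()
    response = build_pending_cases_export(request)
    logger.info("Excel export took %.1f seconds", time.time() - start)
    return response

If the logged time is well past 30 seconds, the request is simply outliving whatever timeout kills it first.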
Current settings in Nginx:
http {
    include       /etc/nginx/mime.types;
    default_type  application/octet-stream;

    log_format main '$remote_addr - $remote_user [$time_local] "$request" '
                    '$status $body_bytes_sent "$http_referer" '
                    '"$http_user_agent" "$http_x_forwarded_for"';

    access_log /var/log/nginx/access.log main;

    sendfile on;
    #tcp_nopush on;

    #keepalive_timeout 0;
    keepalive_timeout 300s;
    proxy_read_timeout 300s;

    #gzip on;

    # Load config files from the /etc/nginx/conf.d directory
    # The default server is in conf.d/default.conf
    include /etc/nginx/sites-enabled/*;
}
Gunicorn is started by Supervisord with this configuration:
[program:gunicorn]
command=/home/enki10/.virtualenvs/license-project/bin/python /home/enki10/www/license-project/manage.py run_gunicorn -w 4 -k gevent -t 300
directory=/home/enki10/www/license-project
stdout_logfile=/var/log/gunicorn.log
stderr_logfile=/var/log/gunicorn_error.log
user=www-data
autostart=true
autorestart=true
Update: here is the code that creates the spreadsheet file:
'''
These columns contain the headers of the Excel report
'''
columns = [
    (u"Origin", 10000),
    (u"Deal", 2000),
    (u"Customer", 10000),
    (u"Plate", 2500),
    (u"Serial", 6000),
    (u"Sold", 3000),
    (u"Clerk", 3000),
    (u"Received", 3000),
]
vehiclesales = VehicleSale.objects.filter(status__lt=AGENT_SUBMITTED_STATUS).order_by('origin', 'sold')
if vehiclesales:
    if format == EXPORT_EXCEL:
        response = HttpResponse(mimetype='application/ms-excel')
        response['Content-Disposition'] = 'attachment; filename=pending_cases.xls'

        workbook = xlwt.Workbook(encoding='utf-8')
        sheet = workbook.add_sheet("Pending cases")

        row_num = 0
        font_style = xlwt.XFStyle()
        font_style.font.bold = True
        for col_num in xrange(len(columns)):
            sheet.write(row_num, col_num, columns[col_num][0], font_style)
            # set column width
            sheet.col(col_num).width = columns[col_num][1]

        font_style = xlwt.XFStyle()
        font_style.alignment.wrap = 1
        # here we go with filling actual data in sheet
        for item in vehiclesales:
            vehiclesale = vehiclesale_to_dict(item)
            row_num += 1
            row = [
                vehiclesale['origin'],
                vehiclesale['deal'],
                vehiclesale['customer'],
                vehiclesale['plate'],
                vehiclesale['vin'],
                vehiclesale['sold'],
                vehiclesale['clerk'],
                vehiclesale['received']
            ]
            for col_num in xrange(len(row)):
                if col_num == 5:
                    font_style.num_format_str = 'MM/dd/yyyy'
                else:
                    font_style.num_format_str = 'general'
                sheet.write(row_num, col_num, row[col_num], font_style)

        workbook.save(response)
        return response
else:
    no_report_template_data['report'] = 'Pending cases'
    return render_to_response(no_report_template_name, no_report_template_data)
Update finished.
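In case the per-row work is what scales badly, here is a leaner sketch of the inner loop, with the two cell styles built once via xlwt.easyxf instead of switching num_format_str per cell, and the queryset walked with iterator() so it is not cached in memory (all other names are taken from the code above):

# build the two styles once instead of mutating one style per cell
wrap_style = xlwt.easyxf('alignment: wrap on')
date_style = xlwt.easyxf('alignment: wrap on', num_format_str='MM/dd/yyyy')

row_num = 0
for item in vehiclesales.iterator():
    vehiclesale = vehiclesale_to_dict(item)
    row_num += 1
    row = [
        vehiclesale['origin'],
        vehiclesale['deal'],
        vehiclesale['customer'],
        vehiclesale['plate'],
        vehiclesale['vin'],
        vehiclesale['sold'],
        vehiclesale['clerk'],
        vehiclesale['received'],
    ]
    for col_num in xrange(len(row)):
        style = date_style if col_num == 5 else wrap_style
        sheet.write(row_num, col_num, row[col_num], style)

With only a few thousand cells, though, xlwt itself should finish in well under a second, so I doubt the spreadsheet writing is the bottleneck.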
Whether or not I include the timeout setting when calling Gunicorn makes no difference; the behaviour is the same: always a 502 error at the 30-second mark.
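For reference, Gunicorn's default worker timeout is 30 seconds, which matches the 30-second mark exactly. One thing I could try is pointing Supervisord at the gunicorn binary directly with an explicit long-form timeout, something like the line below (the WSGI module name license_project.wsgi is a guess, not the real module name):

command=/home/enki10/.virtualenvs/license-project/bin/gunicorn --workers 4 --worker-class gevent --timeout 300 license_project.wsgi:application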
I also noticed that if the number of rows is small (fewer than 400) but there are around 20 columns, it also fails. So it does not seem to fail because of the number of rows, but because of how much data is processed.
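To check whether the time is spent in the database rather than in xlwt (for example, if vehiclesale_to_dict issues extra queries per row), a small helper like this could count the queries for one export (a sketch; it only works with DEBUG = True, because Django only records connection.queries then):

from django.db import connection

def count_queries(func, *args, **kwargs):
    # requires DEBUG = True, otherwise connection.queries stays empty
    before = len(connection.queries)
    result = func(*args, **kwargs)
    print "export issued %d queries" % (len(connection.queries) - before)
    return result

If the query count grows with the size of the export, most of the time is probably going to the database rather than to xlwt.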
Please help.