I am trying to execute a long-running Python 2.7 CGI script asynchronously and return a complete HTML page to the browser immediately, so the request does not time out (and does not wait for the script to complete). I am running XAMPP on Windows, and abbreviated code is below.

My problem is that the browser still waits until the entire script is complete. Am I doing something wrong? I've read other similar questions, and they suggest that adding the stdout and stderr arguments might fix the issue, but it has not for me. I also tried setting close_fds=True and eliminating the stdout/stderr arguments, and that did not work either. script.py works fine standalone and does not produce any output.

Or is there another approach you would recommend? Thank you for any help you can provide!

#!c:\program files\anaconda2\python.exe
import cgi
import subprocess
import sys

# Launch the long-running script; the intent is for it to run in the background
subprocess.Popen([sys.executable, 'c:/path/script.py'],
                 stdout=subprocess.PIPE, stderr=subprocess.STDOUT)

print 'Content-type:text/html\r\n\r\n'
print '<html>'
print '<head></head>'
print '<body></body>'
print '</html>'
  • Possible duplicate of [using asyncio to do periodic task in django](https://stackoverflow.com/questions/43838872/using-asyncio-to-do-periodic-task-in-django) – e4c5 May 26 '17 at 15:15
  • Lol, I am not suggesting that you use asyncio. Please do take a moment to read the answer – e4c5 May 26 '17 at 15:24
  • Thank you! I was hoping for something a little more lightweight than celery since it is going to be run very infrequently and setting up all that infrastructure seems a bit much. Is there perhaps another solution you would recommend? – Stanford Wong May 26 '17 at 16:17
  • In that case just put your data into a cache or Redis or even a DB (a DB is highly unsuitable for frequent tasks) and have a cron job handle it – e4c5 May 27 '17 at 00:49
  • Celery no longer supports Windows either: http://docs.celeryproject.org/en/latest/faq.html. Is there another alternative besides Celery, or would I just use schtasks? – Stanford Wong May 28 '17 at 10:10
  • Yes, quite forgot that people sometimes run websites on Windows :) Celery 3.x still works on it though. Your best bet then would be what I said in my previous comment: scheduled task + entry in Redis/some other queue/DB etc – e4c5 May 28 '17 at 11:29
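
A minimal sketch of the queue-plus-scheduled-task approach e4c5 suggests above, assuming Redis is running locally and the redis-py package is installed; the queue name job_queue and the payload format are illustrative, not anything the commenters specified:

# enqueue.py -- imported by the CGI handler: push a job and return immediately
import json
import redis

r = redis.StrictRedis(host='localhost', port=6379, db=0)

def enqueue_job(payload):
    # LPUSH is O(1), so the CGI request finishes without waiting for the work
    r.lpush('job_queue', json.dumps(payload))

# worker.py -- run periodically by the Task Scheduler (or cron on Unix)
def drain_queue():
    while True:
        item = r.rpop('job_queue')  # returns None once the queue is empty
        if item is None:
            break
        job = json.loads(item)
        # ... do the long-running work on `job` here ...

The worker could then be registered with something like schtasks /create /sc minute /mo 5 /tn job_worker /tr "python c:\path\worker.py" to run every five minutes.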

1 Answer


On Windows, there are creation flags you can pass to Popen to detach the child process from the parent. This allows the CGI script to finish even though the child process is still running.

import subprocess
import sys

CREATE_NEW_PROCESS_GROUP = 0x00000200  # also available as subprocess.CREATE_NEW_PROCESS_GROUP
DETACHED_PROCESS = 0x00000008          # 0x8 | 0x200 == 0x208

# Detach the child so the parent CGI process can exit independently
subprocess.Popen([sys.executable, 'c:/path/script.py'],
                 close_fds=True,
                 creationflags=DETACHED_PROCESS | CREATE_NEW_PROCESS_GROUP)
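
Note that this call omits the stdout=subprocess.PIPE redirection from the question. That is deliberate: the standard handles the child would otherwise inherit are what keep the server waiting on it, and on Windows, Python 2.7 does not allow close_fds=True together with redirected standard handles in any case. A sketch of the full CGI handler with the detached call in place (paths as in the question):

#!c:\program files\anaconda2\python.exe
import subprocess
import sys

CREATE_NEW_PROCESS_GROUP = 0x00000200
DETACHED_PROCESS = 0x00000008

# Start the long-running script detached; do not capture its output
subprocess.Popen([sys.executable, 'c:/path/script.py'],
                 close_fds=True,
                 creationflags=DETACHED_PROCESS | CREATE_NEW_PROCESS_GROUP)

# Respond immediately; the browser no longer waits on the child
print 'Content-type:text/html\r\n\r\n'
print '<html><head></head><body></body></html>'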