I am trying to write a program that receives an HTTP request to trigger a process on my server and leave it running there, as a daemon, until another HTTP call tells the server to kill it. My problem is keeping the daemon running after returning a proper HTTP response.

I am using a Flask-based Python web server. I have tried Python's multiprocessing and subprocess modules without success: typically the main process finishes and kills the subprocess before it can do any good.

In my latest iteration I've tried to combine both, using a daemonised multiprocessing process to start a subprocess, but in this case the server never returns a response: although the process is daemonised, it waits for the subprocess to finish (which it never will) and keeps the main program from returning...

I am out of ideas... help please?

This is the code:

from flask import Flask, request, abort
app = Flask(__name__)

import sys, time

def f():
  import subprocess as sub
  p = sub.Popen(['/path/to/file.py'])
  print "process created " + str(p.pid) # prints to log

@app.route("/", methods = ['POST', 'GET'])
def home():
  if request.method == "GET":
    # return an HTML form with a 'Login' button
    return """
      <!DOCTYPE html>
      <html>
        <head>
        </head>
        <body>
          Testing Login
          <form action = "/" method = "post">
            <input type="submit" value="Login">
          </form>
        </body>
      </html>
      """

  elif request.method == "POST":
    import multiprocessing as mp
    try:
      m = mp.Process(name = 'sub', target = f)
      m.daemon = True
      m.start()
      time.sleep(1) # artificially wait 1 sec for m to trigger the subprocess
      return "Logged in!"
    except:
      return "error! <br> " + str(sys.exc_info())
  else:
    abort(401)
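For reference, the usual way to keep a child alive independently of the request is to detach it from the server's session when spawning it. This is a minimal sketch under assumptions, not the asker's setup: `['sleep', '30']` stands in for `/path/to/file.py`, and `preexec_fn=os.setsid` is POSIX-only:

```python
import os
import subprocess

def start_detached(cmd):
    # Spawn cmd in its own session so it is not tied to the request
    # handler's lifetime, and without inheriting the server's open
    # file descriptors (sockets, log files).
    devnull = open(os.devnull, 'w')
    p = subprocess.Popen(
        cmd,
        close_fds=True,        # don't leak the server's fds to the child
        preexec_fn=os.setsid,  # new session: child survives the parent
        stdout=devnull,
        stderr=subprocess.STDOUT,
    )
    return p

proc = start_detached(['sleep', '30'])
print('started pid %d' % proc.pid)
```

The request handler can return immediately after `start_detached()` and store `proc.pid` somewhere (a file, a database) so a later "kill" request can find the process.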
  • ["daemon process" means a rather specific thing](https://pypi.python.org/pypi/python-daemon/). Are you sure you want to start a daemon (why not a simple background process (perhaps `close_fds=True` would be enough))? Why do you use subprocess to run Python code? Have you tried `concurrent.futures`, `celery`? – jfs Jun 09 '15 at 14:42
  • @J.F.Sebastian I need a daemon so that the process lives on once the HTTP request is done; otherwise threads are terminated once the HTTP call returns. I don't see how closing file descriptors would help? I am going to look into concurrent.futures and celery now, thanks! – Alex Jun 10 '15 at 08:33
  • A non-daemon process may survive an http request too. Open file descriptors can keep network connections alive. – jfs Jun 11 '15 at 01:53
  • @J.F.Sebastian I checked `concurrent.futures` and it is only available from Python 3.2 (I am using 2.7, guess I should have said that earlier... sorry!). `celery` seems interesting, but I am not looking at setting up a messaging service and it seems too large for the purpose. I also looked at the link to the `python-daemon` module you sent, but that module seems unmaintained, lacking documentation and with some outdated dependencies. I did find [this one](https://pypi.python.org/pypi/daemons), which seems better documented and reasonably maintained. I am going to test that. – Alex Jun 11 '15 at 08:39
  • @J.F.Sebastian Could you please elaborate on how to keep a process going after the HTTP request (the main/caller process) is done? Could you please point me to a library/documentation/link? **Thanks for all your help!** – Alex Jun 11 '15 at 08:40
  • 1. `pip install futures` 2. `celery` works in many cases (if you don't want to use it then your case should be either very simple or very complex) 3. the purpose of the link to `python-daemon` is to define the word `daemon` in this context and to demonstrate how much code is required to do it properly (the latest release is from February 2015; [follow the only link to pep-3143 (I'd hoped you would do it yourself)](https://www.python.org/dev/peps/pep-3143/#correct-daemon-behavior); I can't remember the last time I needed a process that daemonizes itself (usually, a supervisor process is used instead)). – jfs Jun 11 '15 at 09:47
  • here's [code example that shows that a non-daemon process may survive an http request](https://gist.github.com/zed/d0c5ab5dc40d1bdd32f7) – jfs Jun 11 '15 at 10:28
  • @J.F.Sebastian please bear with me, I am a relative newbie to all these. I really appreciate your tips and leads. Many thanks! – Alex Jun 12 '15 at 08:30
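The gist linked above isn't reproduced here, but the point it makes can be sketched: a `multiprocessing.Process` started with `daemon = False` is not terminated when the function that started it returns; the parent merely joins it when the interpreter exits. A hypothetical `handle_request` stands in for the Flask view:

```python
import multiprocessing as mp
import time

def worker():
    # stands in for the long-running job
    time.sleep(0.2)

def handle_request():
    p = mp.Process(target=worker)
    p.daemon = False  # non-daemon: not killed when the handler returns
    p.start()
    return p          # the "handler" returns while the child still runs

child = handle_request()
print(child.is_alive())  # the child outlives the call that started it
child.join()
```

A daemonic process, by contrast, is terminated as soon as its parent exits, which is why `m.daemon = True` in the question works against the stated goal.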

1 Answer


I am answering my own question to close it, in case anyone stumbles upon this. I am not sure how applicable this is to anyone else, as all my issues may be due to my server setup (nginx -> uWSGI -> Flask -> Python 2.7).

I tried what @J.F.Sebastian mentioned in the comments: concurrent.futures (the backported version for Python 2.7) did not do the job (the process waited for the loop to finish and never returned a response to the HTTP request). celery I did not test, as it seemed too much for the purpose.

I also tried the daemons module I found, but besides requiring a lot of rework of my code, it kills the parent uWSGI process (for some reason I could not figure out).

Lastly, I tried the close_fds=True argument to subprocess.Popen, as @J.F.Sebastian suggested, and it works, but when I kill the process it becomes a zombie. I also tried appending & to the command to tell Linux to run it in the background; it didn't change much, only killing the process is a bit cleaner.
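On the zombie issue: a child killed with SIGTERM stays a zombie until its parent reaps it, so after sending the signal the server also needs to wait on the process. A minimal sketch, assuming the `Popen` object (or at least the PID) was kept around, with `['sleep', '60']` standing in for the real script:

```python
import signal
import subprocess

# stand-in for the long-running child started earlier
p = subprocess.Popen(['sleep', '60'], close_fds=True)

p.send_signal(signal.SIGTERM)  # ask the child to exit
p.wait()                       # reap it so no zombie is left behind
print(p.returncode)            # negative signal number on POSIX
```

If only the PID was stored, `os.kill(pid, signal.SIGTERM)` followed by `os.waitpid(pid, 0)` does the same job, provided the killing process is the child's parent.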

Hope this helps, if anyone is interested.
