
I need to start a Process during a request, like below:

from multiprocessing import Process

@app.route('/test')
def test_process():
    print "starting new process"
    p = Process(target=do_long_extra_job)
    p.start()

    return "this is response"

do_long_extra_job runs in another process, so the expected workflow is:

  1. start a process
  2. response
  3. long running extra job finished

but the actual flow is:

  1. start a process
  2. long running extra job finished
  3. response

How can I respond immediately after starting the new process?
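For context, `Process.start()` itself returns immediately in a plain Python script; the blocking the question describes comes from the serving environment, not from `multiprocessing`. A minimal standalone sketch (the sleep is a stand-in for the real job) that demonstrates the intended flow:

```python
import time
from multiprocessing import Process

def do_long_extra_job():
    # stand-in for the real long-running job
    time.sleep(1.0)

def start_job_and_measure():
    """Start the job in a child process and return how long start() took."""
    t0 = time.time()
    p = Process(target=do_long_extra_job)
    p.start()          # returns immediately; the child runs concurrently
    elapsed = time.time() - t0
    p.join()           # wait for the child before exiting the demo
    return elapsed

if __name__ == "__main__":
    print("start() returned after %.3f s" % start_job_and_measure())
```

Here `start()` returns well before the one-second job completes, which matches the workflow the question expects.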

changhwan
1 Answer


The `apply_async` method of the `multiprocessing.Pool` class may work for you. `Process` won't work for you in this case since it is blocking here, meaning your return statement will not execute until `p.start()` has finished executing.

Example with `apply_async` and `multiprocessing.Pool`:

import multiprocessing

# Create the pool once at module level so it is not garbage-collected
# when the request handler returns.
pool = multiprocessing.Pool(processes=1)

def do_long_extra_job():
    # some extra long process
    return

@app.route('/test')
def test_process():
    print "starting new process"
    pool.apply_async(do_long_extra_job)
    return "this is response"
  • Thanks for your answer, but I needed to add lines like this: `async_result = pool.apply_async(do_long_extra_job)`, then `async_result.get()`. I don't know why `get()` is needed, but it works. – changhwan Jun 26 '15 at 05:30
  • OK, it's not working :( Please reject my edit. I knew that `get()` is blocking and waits for the result, but it worked the first time... or looked like it was working. – changhwan Jun 26 '15 at 05:47
  • Finally, I found out that `Pool.apply_async` is not working in my production environment (using uwsgi / nginx). I need to tell my system admin to install Celery. – changhwan Jun 26 '15 at 06:45
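For readers hitting the same uwsgi limitation: one stdlib alternative worth noting (a sketch only, not what the asker ultimately chose; they settled on Celery) is to hand the job to a background thread, which behaves more predictably under common WSGI setups. The Flask route decorator is omitted so the sketch stays self-contained:

```python
import threading

def do_long_extra_job():
    # stand-in for the real long-running work
    pass

def test_process():
    # The thread starts immediately and does not block the response.
    t = threading.Thread(target=do_long_extra_job)
    t.daemon = True  # do not block interpreter shutdown on this thread
    t.start()
    return "this is response"
```

Note that under uwsgi, threads also require `enable-threads = true` in the configuration; for anything beyond fire-and-forget work, a task queue such as Celery remains the more robust choice.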