I am trying to run a Python script as a Celery task with Django. The issue is that the task reports itself as complete as soon as the script starts running. I initially used subprocess.Popen() in the tasks.py file, but realized the task would finish as soon as the Popen() call returned, since Popen() does not block. I then modified tasks.py to call a function inside my Python script, which runs the script; however, the task still behaves as though it completes immediately.

What confuses me is that Flower shows the task as complete, while the Celery log is still printing the log output produced by the script I am running. I found the following related post, and I believe I am following its suggestion to execute a Python function from tasks.py.
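For reference, the original subprocess-based version of the task looked roughly like this (simplified, not the exact code); it returns as soon as the child process is spawned, because Popen() does not wait for it:

import subprocess

from celery import shared_task

@shared_task
def exe(workDir):
    # placeholder command; the real script is run.py inside workDir
    subprocess.Popen(['python', 'run.py'], cwd=workDir)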
My current tasks.py:
import sys

from celery import shared_task
from celery.task.control import revoke  # Celery 3.x-style import

@shared_task
def exe(workDir, cancelRun):
    sys.path.append(workDir)
    import run  # run.py lives inside workDir

    if cancelRun == 'True':
        task_id = exe.request.id
        revoke(task_id, terminate=True)
    else:
        run.runModel(workDir)
        task_id = exe.request.id
        return task_id
The runModel function in run.py:
import os
from multiprocessing import Process

def runModel(scendir):
    fullpath = scendir + '/run.py'
    os.chdir(scendir)
    # myMain is defined elsewhere (not shown)
    p = Process(target=myMain, args=(scendir,))
    p.start()
    p.join()  # block until the child process finishes
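To illustrate what I mean by "complete", checking the task state directly would look something like this (a sketch; the module name and arguments are placeholders):

from celery.result import AsyncResult
from myapp.tasks import exe  # 'myapp' is a placeholder app name

# Queue the task; the arguments here are placeholders.
result = exe.delay('/path/to/workDir', 'False')

# According to Flower the task is already in the SUCCESS state at this point,
# even though the worker log is still printing output from run.py.
print(AsyncResult(result.id).state)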