5

I've created Celery tasks to run various jobs that were written in JavaScript by way of Node.js. The task is basically a subprocess.Popen call that invokes Node.js.

The nodejs job will return a non-zero status when exiting, along with error information written to stderr.

When this occurs, I want to take the stderr and return it as the "result" to Celery, along with a FAILURE status, so that my jobs monitor can reflect that the job failed.

How can I do this?

This is my task:

@app.task
def badcommand():
    try:
        output = subprocess.check_output('ls foobar', stderr=subprocess.STDOUT, shell=True)
        return output
    except subprocess.CalledProcessError as er:
        # What do I do here to return er.output, and set the status to fail?

If I don't catch the subprocess exception, the job properly fails, but the result is empty and I get a traceback instead.

If I catch the exception and return er.output, the job completes as a success.

Alan

2 Answers

8

You can use the celery.app.task.Task.update_state method to update the current task state.

@app.task(bind=True)
def badcommand(self):
    try:
        output = subprocess.check_output('ls foobar', stderr=subprocess.STDOUT, shell=True)
        return output
    except subprocess.CalledProcessError as er:
        # meta must be serializable by the result backend, so store the
        # error text rather than the exception object itself
        self.update_state(state='FAILURE', meta={'exc': str(er)})

Note that the bind argument of the app.task decorator was introduced in Celery 3.1. If you're still using an older version, I think you can call the update_state task method this way:

@app.task
def badcommand():
    ...
    except subprocess.CalledProcessError as er:
        badcommand.update_state(state='FAILURE', meta={'exc': str(er)})
Balthazar Rouberol
  • This doesn't work for me. It appears Celery updates the state to 'SUCCESS' unless you throw an exception or it fails somehow, as determined by Celery, e.g. see https://stackoverflow.com/a/33143545/1175053 – C S Sep 18 '18 at 00:34
  • You can use [Reject()](https://docs.celeryq.dev/en/master/userguide/tasks.html#reject) to return a failure state for the task. The message state for Reject is warning, so you can update the state to FAILURE to generate an error message, and then use Reject to fail the task. – Matts Mar 23 '22 at 05:13
5

You can use a base task class with handlers that specify what to do on success and failure.

from celery import Task

class YourBase(Task):
    def on_success(self, retval, task_id, args, kwargs):
        print("Success")

    def on_failure(self, exc, task_id, args, kwargs, einfo):
        print("Failure")

@app.task(base=YourBase)
def badcommand():
    output = subprocess.check_output('ls foobar', stderr=subprocess.STDOUT, shell=True)
    return output

These are the handlers that your base class can use: http://celery.readthedocs.org/en/latest/userguide/tasks.html#handlers
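For context, subprocess.check_output raises CalledProcessError whenever the command exits with a non-zero status, which is what makes on_failure fire for the failing command above. A standalone sketch of that behaviour, using a deliberately failing shell command instead of the ls example (POSIX shell assumed):

```python
import subprocess

try:
    # Write to stderr and exit non-zero, like a failing Node.js job would.
    subprocess.check_output('echo boom >&2; exit 3', stderr=subprocess.STDOUT, shell=True)
except subprocess.CalledProcessError as er:
    # returncode carries the non-zero exit status; output carries the
    # captured stderr because of stderr=subprocess.STDOUT.
    print(er.returncode)  # 3
    print(er.output)      # b'boom\n'
```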

olofom
  • How do I deal with subprocess raising the exception, and still trigger the "on_failure" event? – Alan Mar 17 '14 at 15:06
  • The on_failure function should be invoked if the task raises an exception and the on_success function if it doesn't. – olofom Mar 18 '14 at 15:50
  • Why the down vote? It should solve the problem and it conforms to the guidelines of celery. – olofom Aug 22 '14 at 22:10
  • Because `badcommand()` doesn't raise an exception and thus will be invoking the "on_success" handler. – enrm Nov 02 '20 at 09:52