
When executing a stored procedure or job with the MsSqlOperator, the task always finishes with a success status, even if the stored procedure/job itself ends in error.

How can I capture the return code (RC) of the execution so that the task status reflects the outcome of the stored procedure/job?

```python
from airflow.providers.microsoft.mssql.operators.mssql import MsSqlOperator
(...)
sql_query_exec_sp = "EXEC dbo.sp_start_job N'ARSHD_USERS'"
(...)
    run_sp1 = MsSqlOperator(
        task_id='Run_SP1',
        mssql_conn_id='mssql_test_conn_msdb',
        sql=sql_query_exec_sp,
        autocommit=True,
    )
(...)
```

I'm thinking of creating another task to check whether the result of the stored procedure execution was correct, but in the future, when I have tens of different stored procedure executions, that will be unmanageable!
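
For the sake of discussion, this is the kind of reusable helper I have in mind (a rough sketch only: the helper name, polling interval, and status query are my own illustration; the query reads the standard `msdb` job tables, where `run_status` 1 means the job succeeded):

```python
import time

from airflow.exceptions import AirflowException
from airflow.operators.python import PythonOperator
from airflow.providers.microsoft.mssql.hooks.mssql import MsSqlHook

# Latest activity for the job: stop_execution_date is NULL while it is still
# running; sysjobhistory.run_status is 0 = failed, 1 = succeeded, 3 = canceled.
JOB_STATUS_SQL = """
SELECT TOP 1 ja.stop_execution_date, jh.run_status
FROM msdb.dbo.sysjobactivity AS ja
JOIN msdb.dbo.sysjobs AS j ON j.job_id = ja.job_id
LEFT JOIN msdb.dbo.sysjobhistory AS jh ON jh.instance_id = ja.job_history_id
WHERE j.name = %s
ORDER BY ja.session_id DESC, ja.start_execution_date DESC
"""


def start_job_and_wait(job_name, poll_seconds=30):
    """Start an Agent job, block until it finishes, and raise on failure."""
    hook = MsSqlHook(mssql_conn_id='mssql_test_conn_msdb')
    # sp_start_job only queues the job, so this call returns immediately.
    hook.run(f"EXEC msdb.dbo.sp_start_job N'{job_name}'", autocommit=True)
    while True:
        time.sleep(poll_seconds)
        row = hook.get_first(JOB_STATUS_SQL, parameters=(job_name,))
        if row is None:
            raise AirflowException(f"No activity found for job {job_name}")
        stop_date, run_status = row
        if stop_date is None:
            continue  # still executing
        if run_status != 1:  # NULL or non-success also ends up here
            raise AirflowException(
                f"Job {job_name} finished with run_status={run_status}"
            )
        return


run_sp1 = PythonOperator(
    task_id='Run_SP1',
    python_callable=start_job_and_wait,
    op_kwargs={'job_name': 'ARSHD_USERS'},
)
```

That would keep the per-job boilerplate down to one task per job, instead of a run task plus a separate check task for each one.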

Thanks for all the help you can give me.

  • Stored procedures and jobs are different things; don't lump them together. If a stored procedure fails (through `RAISERROR` or `THROW` or statement failure) the `EXEC` itself will fail, which is easy to capture. `sp_start_job` on the other hand does exactly what it says on the tin: starts the job, and then goes away. There is no built-in stored procedure for starting a job and waiting for it to finish, nor is it [at all comfortable to write such a thing in T-SQL](https://stackoverflow.com/q/12249056/4137916). A separate script in a *real* programming language would be better. – Jeroen Mostert Nov 15 '22 at 11:16
  • I asked the DB team and it's a job, not a stored procedure; my fault. And if I understand what you are saying, Airflow doesn't know whether it ended OK. I saw the link you sent and I'm going to show it to the DB guys and see if we can find a solution. Thank you for your help. – David Felix Nov 16 '22 at 09:15
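
To illustrate the distinction drawn in the comments above: when the statement is a plain `EXEC` of a stored procedure that signals failure (via `THROW` or `RAISERROR`), the error propagates through the driver and the MsSqlOperator task fails on its own, with no extra check needed. A minimal sketch (the procedure name is hypothetical):

```python
from airflow.providers.microsoft.mssql.operators.mssql import MsSqlOperator

# A procedure that raises with THROW/RAISERROR fails the task by itself:
# the driver surfaces the error and Airflow marks the task as failed.
run_proc = MsSqlOperator(
    task_id='Run_Proc',
    mssql_conn_id='mssql_test_conn_msdb',
    sql="EXEC dbo.usp_load_users;",  # hypothetical proc that uses THROW
    autocommit=True,
)
```

It is only the fire-and-forget `sp_start_job` case that needs the separate polling step.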

0 Answers