
EDIT: I have raised the same question on Microsoft Learn Q&A, where I was told that, as of December 2022, there is no solution. They opened an internal ticket to address this directly. My current workaround is to write the outputs to the Lake database and then query it afterwards in the pipeline.
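The workaround above can be sketched as follows. This is a hedged sketch, not a definitive implementation: the database and table names (`output_db.job_outputs`) and the output keys are assumptions, and the `pyspark` import is deferred so the helper logic can be exercised outside a Spark session.

```python
def build_output_rows(run_id, outputs):
    """Flatten a dict of output values into (run_id, key, value) rows.

    Values are stringified so one table schema fits mixed output types.
    """
    return [(run_id, key, str(value)) for key, value in outputs.items()]


def write_outputs(spark, run_id, outputs, table="output_db.job_outputs"):
    """Append this run's outputs to a Lake database table.

    A later pipeline activity (e.g. a Lookup or Script activity) can then
    read them back with a query filtered on run_id.
    The table name is an illustrative assumption, not a Synapse convention.
    """
    rows = build_output_rows(run_id, outputs)
    df = spark.createDataFrame(rows, ["run_id", "key", "value"])
    df.write.mode("append").saveAsTable(table)
```

Passing the pipeline's run ID in as a job argument lets the downstream query pick out exactly this run's rows.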

I am working on Azure Synapse Analytics and I have a pipeline in which there is a Spark Job Activity for a Python script. I managed to get input parameters inside the Spark Job by using the sys package (sys.argv) inside the Python script.
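For context, the input side that already works looks roughly like this: the Spark job definition's command-line arguments land in `sys.argv` in order. The parameter names below (`input_path`, `run_date`) are illustrative assumptions, not names from the original pipeline.

```python
import sys


def parse_args(argv):
    """Read positional arguments passed by the pipeline's Spark Job Activity.

    argv[0] is the script path; the pipeline's command-line arguments
    follow in the order they are configured on the activity.
    """
    input_path = argv[1]
    run_date = argv[2]
    return input_path, run_date


# Inside the actual Spark job the call would be: parse_args(sys.argv)
example = parse_args(["job.py", "abfss://data@account.dfs.core.windows.net/in", "2022-12-14"])
```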

I wonder if there is a similar method to return output values from the Spark Job Activity inside the pipeline.

Thank you, Dario

I tried sys.exit(), but if the argument is anything other than the integer 0, the Spark Job Activity terminates with an error, which is not what I want.
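To illustrate why sys.exit() cannot carry a return value: a process exit code only signals success (0) or failure (non-zero), and the calling activity treats any non-zero code as a failed run. A minimal demonstration using subprocesses:

```python
import subprocess
import sys

# sys.exit(0) signals success; sys.exit(7) signals failure with code 7.
# Neither mechanism can deliver an arbitrary payload back to the caller,
# which is why a non-zero "value" makes the activity report an error.
ok = subprocess.run([sys.executable, "-c", "import sys; sys.exit(0)"])
fail = subprocess.run([sys.executable, "-c", "import sys; sys.exit(7)"])
```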

Dario Ant
  • hi @Dario Ant, are you running the Spark Job Activity from a notebook in the pipeline, or something else? And what output are you expecting? – B. B. Naga Sai Vamsi Dec 14 '22 at 11:38
  • hi @SaiVamsi, for time reasons I was told to use a Spark Job in Synapse instead of a Notebook, so I am writing a Python script, adding it to a Spark Job definition, and calling it in the pipeline. I am expecting something similar to mssparkutils.notebook.exit(), or the equivalent of Databricks' dbutils.notebook.exit(). However, in a plain Python script I cannot use the package notebookutils.mssparkutils – Dario Ant Dec 14 '22 at 11:44
  • 'Spark Job instead of the Notebook' can you show configurations of spark job with a picture for better understanding? – Rakesh Govindula Dec 14 '22 at 12:00
  • hi @RakeshGovindula, do you mean that the problem on return variables is related to the Spark configurations and not about Python commands? In that case, what kind of configuration should I need or exclude? – Dario Ant Dec 14 '22 at 12:57
  • @RakeshGovindula I only have the following Spark configurations: spark.sql.legacy.timeParserPolicy=LEGACY, "sql.adaptive=ENABLE" and "sql.adaptive.coalescePartitions=enabled". Are they conflicting with sys.exit()? – Dario Ant Dec 15 '22 at 09:26
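For comparison, the notebook-only mechanism mentioned in the comments looks roughly like this. This is a hedged sketch: `mssparkutils.notebook.exit()` only resolves inside a Synapse notebook session (not in a Spark job definition script, which is the whole problem here), it accepts a single value, and the output key names are assumptions.

```python
import json


def make_exit_payload(outputs):
    """Serialize several outputs into one string, since notebook.exit
    returns a single value to the calling pipeline activity."""
    return json.dumps(outputs)


try:
    # Available only inside a Synapse notebook, not in a plain Python script.
    from notebookutils import mssparkutils
    mssparkutils.notebook.exit(make_exit_payload({"rows_written": 123}))
except ImportError:
    # Running outside Synapse (or in a Spark job definition): no exit channel.
    pass
```

In a notebook activity, the pipeline could then read this value from the activity's output; no equivalent exists for the Spark Job Activity as of the December 2022 answer above.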

0 Answers