
I have a DAG that uses BigQuery operators in some of its tasks. One such task keeps failing, saying that my configuration parameter is actually a set, which cannot be serialised to JSON.

However, I don't understand why I'm getting the error, since I don't see anything that could be turning this dict into a set.

some_task = BigQueryInsertJobOperator(
        task_id='some_task',
        configuration={
            "query": {
                "query": "{% include 'sql/some_sql_script.sql' %}",
                "useLegacySql": False,
            },
            "destinationTable": {
            some_dest_table.ref  # I have access to this value.
            }
        },
        params={
            'some_param': some_param.ref,
            'some_other_param': some_other_param.ref,
            'yet_other_param': yet_other_param.ref
        }
    )

When inspecting the rendered template, I can see the config "bleeding" into the actual SQL script, e.g.:

"config={query select foo from bar"

The actual error from the logs is: TypeError: Object of type set is not JSON serializable


1 Answer


Looks like the issue is with the destinationTable field. Drop the curly braces there. You're wrapping the value in curly braces without providing a key, and in Python a brace literal with no keys is a set literal, not a dict, so the configuration can no longer be serialised to JSON.
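A minimal sketch of the difference (using a hypothetical string value standing in for `some_dest_table.ref`):

```python
import json

# Hypothetical stand-in for some_dest_table.ref; assumed to be a plain string.
ref = "my-project.my_dataset.my_table"

broken = {"destinationTable": {ref}}  # {ref} is a one-element set literal
fixed = {"destinationTable": ref}     # no braces: just the value itself

try:
    json.dumps(broken)
except TypeError as exc:
    print(exc)  # Object of type set is not JSON serializable

print(json.dumps(fixed))  # serialises fine
```

Note that if the API expects destinationTable to be an object, the fix is to supply the proper keys (e.g. `{"projectId": ..., "datasetId": ..., "tableId": ...}`) rather than a bare braced value.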
