
I am running an Airflow instance using Docker and can access the Airflow UI at http://localhost:8080/. I can also execute a sample DAG using PythonOperator, and through PythonOperator I am able to query a BigQuery table in my GCP environment. The service account key JSON file is referenced in my docker-compose YAML file.


This works perfectly.

Now I want to use BigQueryOperator and BigQueryCheckOperator, both of which require a connection ID. This connection ID comes from Airflow connections, which are created through the Airflow UI.

But when I try to create a new Google BigQuery connection, I get errors. Could anyone please help me fix this?

[screenshots of the connection errors]

sandeep

1 Answer


In your docker compose file, can you set the environment variable GOOGLE_APPLICATION_CREDENTIALS to /opt/airflow/configs/kairos-aggs-airflow-local-bq-connection.json? This might be enough to fix the error in your first screenshot.
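A minimal sketch of the relevant docker-compose section, assuming the key file is mounted at that path (the service name and volume source directory are placeholders):

```yaml
services:
  airflow:
    environment:
      GOOGLE_APPLICATION_CREDENTIALS: /opt/airflow/configs/kairos-aggs-airflow-local-bq-connection.json
    volumes:
      # Mount the directory containing the service account key into the container.
      - ./configs:/opt/airflow/configs
```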

Looking at the docs and comparing your second screenshot, I think you could try selecting 'Google Cloud Platform' as the connection type and adding a project ID and Scopes to the form.
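As an alternative to the UI form, Airflow can also pick up a connection from an environment variable named `AIRFLOW_CONN_<CONN_ID>`. This is a sketch assuming Airflow 1.10-style connection extras; the project ID is a placeholder:

```yaml
services:
  airflow:
    environment:
      AIRFLOW_CONN_GOOGLE_CLOUD_DEFAULT: >-
        google-cloud-platform://?extra__google_cloud_platform__project=my-project&extra__google_cloud_platform__key_path=/opt/airflow/configs/kairos-aggs-airflow-local-bq-connection.json
```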

The answers to this question may also be helpful.

Paddy Alton