I'm trying to build Airflow DAGs that read data from (or write data to) some Google spreadsheets. Among the connections in Airflow I've saved a connection of type "Google Cloud Platform" which includes project_id, scopes and, in "Keyfile JSON", a dictionary with the keys "type", "project_id", "private_key_id", "private_key", "client_email", "client_id", "auth_uri", "token_uri", "auth_provider_x509_cert_url" and "client_x509_cert_url".
I can connect to the Google spreadsheet using
import gspread
from oauth2client.service_account import ServiceAccountCredentials

scope = ["https://spreadsheets.google.com/feeds",
         "https://www.googleapis.com/auth/drive"]
cred_dict = ...  # same as what I saved in Keyfile JSON
creds = ServiceAccountCredentials.from_json_keyfile_dict(cred_dict, scope)
client = gspread.authorize(creds)
sheet = client.open(myfile).worksheet(mysheet)  # works!
But I would prefer not to write the key explicitly in the code and instead to import it from my Airflow connections. I'd like to know if there is a solution along the lines of
from airflow.hooks.some_hook import get_the_keyfile  # hypothetical

conn_id = "my_saved_gcp_connection"
cred_dict = get_the_keyfile(gcp_conn_id=conn_id)
creds = ServiceAccountCredentials.from_json_keyfile_dict(cred_dict, scope)
client = gspread.authorize(creds)
sheet = client.open(myfile).worksheet(mysheet)
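If it helps, here is the kind of helper I imagine. It assumes (I haven't verified this) that what I typed into "Keyfile JSON" ends up in the connection's extras under a key like `extra__google_cloud_platform__keyfile_dict`; both that key name and the `BaseHook` approach are guesses on my part, not something I found documented:

```python
import json


def parse_keyfile_extras(extras):
    """Pull the service-account keyfile dict out of a connection's extras.

    The extras key name below is my guess at where Airflow stores the
    "Keyfile JSON" field of a Google Cloud Platform connection (unverified).
    Returns None if the key is absent.
    """
    keyfile_json = extras.get("extra__google_cloud_platform__keyfile_dict")
    return json.loads(keyfile_json) if keyfile_json else None


def get_keyfile_dict(gcp_conn_id):
    """Look up an Airflow connection by id and return its keyfile dict."""
    # Imported here so parse_keyfile_extras stays usable without Airflow.
    from airflow.hooks.base_hook import BaseHook

    conn = BaseHook.get_connection(gcp_conn_id)
    return parse_keyfile_extras(conn.extra_dejson)
```

I also wonder whether one of the GCP hooks (e.g. `GoogleCloudBaseHook`) already exposes this internally, which would be cleaner than parsing the extras myself.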
I see there are several hooks for GCP connections (https://airflow.apache.org/howto/connection/gcp.html), but my limited knowledge leaves me unsure which one to use and which function (if any) extracts the keyfile from the saved connection.
Any suggestion would be greatly welcomed :)