
I have some DAGs that use SnowflakeOperator and SnowflakeHook. Both connect to Snowflake through a connection ID, snowflake_connection, which I have saved under Admin > Connections in Airflow.

from airflow.providers.snowflake.hooks.snowflake import SnowflakeHook

SnowflakeHook(
    snowflake_conn_id="snowflake_connection",
    database="SOME_DB",
    schema="PUBLIC",
)

While this arrangement works inside the Airflow environment, I want to run the code as a regular script on my local machine. There it is unable to access the snowflake_connection I saved inside Airflow. Is there any workaround for this?

Is there a way I can supply a connection string or username/password directly to SnowflakeHook and SnowflakeOperator to make a database connection?

Howard S

1 Answer


You can mock the connection by exposing it through an environment variable:

from unittest import mock

from airflow.models.connection import Connection

conn = Connection(
    conn_type="gcpssh",
    login="cat",
    host="conn-host",
)
conn_uri = conn.get_uri()

# Airflow resolves AIRFLOW_CONN_<CONN_ID> environment variables before
# consulting the metadata database, so patching one is enough to inject
# a connection that no Airflow instance knows about.
with mock.patch.dict("os.environ", AIRFLOW_CONN_MY_CONN=conn_uri):
    assert "cat" == Connection.get_connection_from_secrets("my_conn").login

For details, see: Mocking variables and connections
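
Outside of a test you don't even need mock: since Airflow checks AIRFLOW_CONN_<CONN_ID> before the metadata database, a plain script can export the URI itself and the hook will resolve the conn_id normally. Below is a minimal sketch for your snowflake_connection, assuming a recent apache-airflow-providers-snowflake package is installed locally; the user, password, account, and warehouse values are placeholders you would replace with your own credentials:

import json
import os

from airflow.models.connection import Connection
from airflow.providers.snowflake.hooks.snowflake import SnowflakeHook

# Rebuild the connection you normally keep under Admin > Connections.
# Every credential below is a placeholder, not a real value.
conn = Connection(
    conn_id="snowflake_connection",
    conn_type="snowflake",
    login="MY_USER",          # placeholder
    password="MY_PASSWORD",   # placeholder
    extra=json.dumps({"account": "my_account", "warehouse": "MY_WH"}),
)

# Export the URI so the hook can resolve the conn_id without an Airflow
# metadata database. The variable name is AIRFLOW_CONN_ plus the conn_id
# upper-cased.
os.environ["AIRFLOW_CONN_SNOWFLAKE_CONNECTION"] = conn.get_uri()

hook = SnowflakeHook(
    snowflake_conn_id="snowflake_connection",
    database="SOME_DB",
    schema="PUBLIC",
)
print(hook.get_first("SELECT CURRENT_VERSION()"))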

mik-laj