I'm trying to load data from a Postgres table to S3 using Airflow. The `PostgresToS3` operator works for all tables except one, which is very large. The task runs for some time (~800 s) and then stops without producing any logs. This seems to be because the connection is getting closed after some timeout period. I tried running `cursor.execute("SET statement_timeout = 0")` before `cursor.execute(self.sql)`.
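Concretely, the change I made looks roughly like this (pulled out into a standalone function for illustration; in the real code the cursor is the one the hook creates, and `self.sql` is the extraction query):

```python
def run_with_no_timeout(cursor, sql):
    """Run `sql` after disabling Postgres's server-side statement timeout.

    `cursor` stands in for the DB-API cursor that PostgresHook opens;
    the function name and signature are illustrative, not Airflow API.
    """
    # Attempted workaround: a statement_timeout of 0 means "no timeout"
    # for the rest of this session.
    cursor.execute("SET statement_timeout = 0")
    # The actual query that stalls on the large table.
    cursor.execute(sql)
    return cursor.fetchall()
```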
Is there any way I can fix this?
https://airflow.apache.org/docs/stable/_modules/airflow/hooks/postgres_hook.html