I am trying to set up Airflow on Amazon Managed Workflows for Apache Airflow (MWAA). Everything seems to be working fine except for my AWS Redshift connection.

I am using the Connections tab in the UI and editing redshift_default with my values. It works fine locally; however, when I trigger the DAG in MWAA I get the following error:

[2023-02-27, 16:15:55 UTC] {{base.py:71}} INFO - Using connection ID 'redshift_default' for task execution.
[2023-02-27, 16:18:07 UTC] {{taskinstance.py:1851}} ERROR - Task failed with exception
Traceback (most recent call last):
  File "/usr/local/airflow/.local/lib/python3.10/site-packages/redshift_connector/core.py", line 585, in __init__
    self._usock.connect((host, port))
TimeoutError: [Errno 110] Connection timed out

Any help will be greatly appreciated.

  • Have you checked your network routing and security groups? You need to have the Redshift port open in both directions for a connection to be made. – Bill Weiner Feb 27 '23 at 17:33
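To verify, you can list the inbound rules on the security group attached to the Redshift cluster. A minimal sketch using boto3 (the region and group ID below are placeholders):

import boto3

# Placeholders: use your cluster's region and the security group attached to it.
ec2 = boto3.client("ec2", region_name="us-east-1")
resp = ec2.describe_security_groups(GroupIds=["sg-0123456789abcdef0"])

for sg in resp["SecurityGroups"]:
    for rule in sg["IpPermissions"]:
        # Each inbound rule lists a port range plus the CIDRs and security
        # groups allowed in; look for the Redshift port (5439 by default).
        print(rule.get("FromPort"), rule.get("ToPort"),
              [r["CidrIp"] for r in rule.get("IpRanges", [])],
              [g["GroupId"] for g in rule.get("UserIdGroupPairs", [])])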

2 Answers


My connection host was playing up; fixing it rectified the problem.
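For reference, the host should be the cluster endpoint without a port or database suffix, e.g. examplecluster.abc123xyz789.us-east-1.redshift.amazonaws.com (the cluster identifier and region here are placeholders).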


Have you tried increasing the value of the timeout parameter via the extras field? Here is a list of all the extras that are available on a Redshift connection in Airflow: https://github.com/aws/amazon-redshift-python-driver#connection-parameters

There is a timeout parameter that can be altered in the extras.
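For example, a minimal sketch of the equivalent connection defined in code (the host, credentials, and the 60-second value are placeholders; the extras keys follow the redshift_connector parameter names in the link above):

import json
from airflow.models import Connection

conn = Connection(
    conn_id="redshift_default",
    conn_type="redshift",
    host="examplecluster.abc123xyz789.us-east-1.redshift.amazonaws.com",  # placeholder
    login="awsuser",
    password="...",
    port=5439,
    schema="dev",
    # Keys in Extra are passed through to redshift_connector.connect();
    # "timeout" is the connect timeout in seconds.
    extra=json.dumps({"timeout": 60}),
)

In the UI, the same effect comes from putting {"timeout": 60} in the Extra field of redshift_default.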
