
I am getting a time-out error on an SSH task while executing a Docker image and running commands inside it.

[2023-06-14, 16:07:03 UTC] {taskinstance.py:1776} ERROR - Task failed with exception
Traceback (most recent call last):
  File "/home/airflow/.venv/airflow/lib/python3.7/site-packages/airflow/providers/ssh/operators/ssh.py", line 173, in execute
    result = self.run_ssh_client_command(ssh_client, self.command, context=context)
  File "/home/airflow/.venv/airflow/lib/python3.7/site-packages/airflow/providers/ssh/operators/ssh.py", line 159, in run_ssh_client_command
    ssh_client, command, timeout=self.cmd_timeout, environment=self.environment, get_pty=self.get_pty
  File "/home/airflow/.venv/airflow/lib/python3.7/site-packages/airflow/providers/ssh/hooks/ssh.py", line 547, in exec_ssh_client_command
    raise AirflowException("SSH command timed out")
airflow.exceptions.AirflowException: SSH command timed out

How can this be resolved? We have already increased the OS-level timeout.

1 Answer


I assume cmd_timeout is not set on your side.

Currently, I'm using Airflow 2.5.3 with the airflow.providers.ssh provider version 3.5.0. According to the documentation here, there is a field named cmd_timeout which defaults to only 10 seconds. To solve the error, which is thrown from SSHHook:

  File "/home/airflow/.venv/airflow/lib/python3.7/site-packages/airflow/providers/ssh/hooks/ssh.py", line 547, in exec_ssh_client_command
raise AirflowException("SSH command timed out")

we need to set cmd_timeout=None (or, for example, cmd_timeout=300) on the SSHHook, e.g.

ssh_hook = SSHHook(ssh_conn_id="ssh", cmd_timeout=None)

This solution works for me; hopefully it is helpful.

*None means no timeout (the command may run indefinitely)
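
For context, here is a minimal sketch of wiring this into a DAG. In recent provider versions the SSHOperator itself also accepts a cmd_timeout argument; the connection id, image name, and command below are assumptions for illustration, not from the original post:

from datetime import datetime

from airflow import DAG
from airflow.providers.ssh.hooks.ssh import SSHHook
from airflow.providers.ssh.operators.ssh import SSHOperator

with DAG(
    dag_id="ssh_docker_example",  # hypothetical DAG id
    start_date=datetime(2023, 6, 1),
    schedule=None,
) as dag:
    # "ssh" is an assumed connection id; cmd_timeout=None disables the 10s default
    ssh_hook = SSHHook(ssh_conn_id="ssh", cmd_timeout=None)

    run_container = SSHOperator(
        task_id="run_container",
        ssh_hook=ssh_hook,
        # the actual docker command is an assumption for illustration
        command="docker run --rm my-image:latest my-long-running-command",
        cmd_timeout=None,  # can also be set per operator instead of on the hook
    )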

Rovern