
I was testing remote logging to an S3 bucket path for sending Airflow task logs. Previously it was working fine: we were able to see logs in the Airflow UI when no environment variable was set for remote_logging. But after setting logging.remote_logging = True and adding two more values, logging.remote_log_conn_id and logging.remote_base_log_folder, as environment variables inside the Airflow configuration options of AWS MWAA, we faced this error.

Error:

*** Log file does not exist: /usr/local/airflow/logs/DAG_test/test_mail/2022-11-22T06:42:54.483935+00:00/1.log
*** Fetching from: http://ip-10-192-11-229.us-west-2.compute.internal:8793/log/DAG_test/test_mail/2022-11-22T06:42:54.483935+00:00/1.log
*** Failed to fetch log file from worker. timed out
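For reference, this is roughly how the three configuration options were applied. A minimal sketch using boto3, assuming the options are set programmatically rather than through the MWAA console; the environment name, connection ID, and bucket path below are placeholders:

```python
import boto3

# Placeholder values -- substitute your own MWAA environment and log bucket.
ENV_NAME = "my-mwaa-environment"
LOG_FOLDER = "s3://my-airflow-logs-bucket/dag-logs"

mwaa = boto3.client("mwaa", region_name="us-west-2")

# The same three logging options the MWAA console exposes under
# "Airflow configuration options"; values are passed as strings.
mwaa.update_environment(
    Name=ENV_NAME,
    AirflowConfigurationOptions={
        "logging.remote_logging": "True",
        "logging.remote_log_conn_id": "aws_default",
        "logging.remote_base_log_folder": LOG_FOLDER,
    },
)
```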

We have since reverted the environment variable logging.remote_logging back to False inside the Airflow configuration options of AWS MWAA, but we still see the same issue.

Note: I have not removed or reverted the other two environment variables.

Any explanation or help is appreciated.


1 Answer


To help anyone facing the same issue:

It was a permission issue with the S3 bucket for the connection provided via logging.remote_log_conn_id. After granting that connection access to the S3 bucket specified in logging.remote_base_log_folder, the logs were stored in S3 and became visible in the Airflow UI as well.
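For anyone replicating the fix: the change amounts to letting the IAM identity behind logging.remote_log_conn_id read and write objects under the remote_base_log_folder path. A minimal sketch using boto3, assuming the connection resolves to the MWAA execution role; the role name, policy name, and bucket below are placeholders:

```python
import json

import boto3

# Placeholder names -- substitute your MWAA execution role and log bucket.
EXECUTION_ROLE = "my-mwaa-execution-role"
LOG_BUCKET = "my-airflow-logs-bucket"

iam = boto3.client("iam")

# Inline policy letting the role read, write, and list task logs
# under the bucket used as remote_base_log_folder.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:PutObject", "s3:ListBucket"],
            "Resource": [
                f"arn:aws:s3:::{LOG_BUCKET}",
                f"arn:aws:s3:::{LOG_BUCKET}/*",
            ],
        }
    ],
}

iam.put_role_policy(
    RoleName=EXECUTION_ROLE,
    PolicyName="airflow-remote-logging-s3",
    PolicyDocument=json.dumps(policy),
)
```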
