
I am not able to fetch the worker pod logs on airflow UI. The error I am getting on airflow is:

```
*** Falling back to local log
*** Trying to get logs (last 100 lines) from worker pod aggregationtestcheckingcache-33ea24f45f1344d7a628e21a53b4f6d0 ***
*** Unable to fetch logs from worker pod aggregationtestcheckingcache-33ea24f45f1344d7a628e21a53b4f6d0 ***
(404)
Reason: Not Found
HTTP response headers: HTTPHeaderDict({'Audit-Id': 'b3c492f1-56e5-4fbe-a5b9-ab6c7e01f722', 'Cache-Control': 'no-cache, private', 'Content-Type': 'application/json', 'Date': 'Tue, 22 Nov 2022 20:15:04 GMT', 'Content-Length': '290'})
HTTP response body: b'{"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"pods \\"aggregationtestcheckingcache-33ea24f45f1344d7a628e21a53b4f6d0\\" not found","reason":"NotFound","details":{"name":"aggregationtestcheckingcache-33ea24f45f1344d7a628e21a53b4f6d0","kind":"pods"},"code":404}\n'
```

As soon as a particular DAG runs, the logs are visible in the Airflow UI, but once it succeeds, the logs disappear and the error message becomes "cannot fetch worker pod logs. Error: 404, Reason: Not Found".

However, while the DAG is running, I am able to see the pods and their logs in the AWS console.

1 Answer


On the off-chance that you're still experiencing this issue: the most likely culprit here is that you haven't configured persistent logs yet.

If persistent logs aren't configured, then Airflow can only fetch logs from actively running tasks, because the log UI is effectively a frontend for `kubectl logs`. Once the task completes, the pod is deleted, hence the 404 error: the pod Airflow wants to inspect no longer exists. The Airflow documentation goes into more detail and covers the requirements for enabling persistent logs.
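One common way to get persistent logs is Airflow's remote logging. A minimal `airflow.cfg` sketch, assuming you're shipping logs to S3 (the bucket name and connection id below are placeholders, not something from your setup):

```ini
[logging]
# Ship task logs to remote storage so they survive worker pod deletion.
remote_logging = True
# Hypothetical bucket/prefix -- replace with your own.
remote_base_log_folder = s3://my-airflow-logs/logs
# Airflow connection id holding credentials for the bucket.
remote_log_conn_id = my_s3_conn
```

With this in place, the webserver reads completed-task logs from the remote store instead of trying to reach a pod that no longer exists.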

I believe there's also a configuration setting that tells Airflow not to auto-delete completed worker pods, but enabling it means you'll need to clean the pods up manually.
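If you'd rather keep the pods around, the setting I have in mind is `delete_worker_pods` for the KubernetesExecutor; note that the section name varies by Airflow version (older 2.x releases use `[kubernetes]`, newer ones `[kubernetes_executor]`), so check the config reference for your version:

```ini
[kubernetes_executor]
# Keep completed worker pods so their logs stay fetchable via the UI.
# Trade-off: finished pods accumulate until you delete them yourself.
delete_worker_pods = False
```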

Nathan Strong