My team has developed some pipelines in Airflow, and we are really impressed by how we can set multiple tasks running and have data flow from sources directly into our data lake. However, some of our tasks are complex, and their logs can take up to 40 minutes to appear in the "log by attempts" tab.
We tested our code locally, and all prints and logs show up as they happen, but in Airflow the delay is very annoying because we are effectively flying blind.
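For reference, this is roughly how we emit output inside our task callables (a simplified sketch — the function and logger names are illustrative, not our real pipeline):

```python
import logging
import sys

# Configure a simple stdout logger, as we do when testing locally.
logging.basicConfig(level=logging.INFO, stream=sys.stdout, force=True)
log = logging.getLogger("our_pipeline")

def load_to_datalake():
    # Plain print, flushed so it appears immediately when run locally.
    print("starting load", flush=True)
    # Standard-library logging, which Airflow also captures per attempt.
    log.info("loaded %d rows", 1000)
    return 1000

rows = load_to_datalake()
```

Locally both the `print` and the `log.info` call show up instantly; in Airflow the same output only surfaces in the UI much later.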
I suppose we are missing some configuration, but we haven't found it yet. Any advice?