
I am using an AWS Batch job, which runs an ECR image (a Docker image containing Python code), and the batch job logs to CloudWatch.

Inside the Docker container, the code uses the print command for logging, as shown below. The issue is that all the print statements appear in CloudWatch only after the Batch job finishes, which takes around 2-3 hours.

The prints are not logged as the code executes; instead, they all show up only when the whole process completes. Is there a way to flush the print output as soon as the line executes?

Does CloudWatch/Batch store the logs in memory and flush them only when the job is completed?

Python Code:

print("Process Started..")
#some code 
print("Process Completed.")
Sud
  • I see the same thing: the timestamps of the logs are all within 2 seconds of each other, despite the job taking ~10 minutes in my case. – Hugh Feb 04 '22 at 10:24
  • Over [here](https://docs.aws.amazon.com/AmazonCloudWatch/latest/monitoring/CloudWatch-Agent-Configuration-File-Details.html) there is mention of the `force_flush_interval` configuration property for the CloudWatch Logs Agent in `/opt/aws/amazon-cloudwatch-agent/etc/amazon-cloudwatch-agent.json` logs section. Perhaps this can be set / adjusted. – JoeW Jun 25 '22 at 19:40
  • You need to use a logger for logging the output. In Java, I use SLF4j, and it writes the output while the job is running, so I can see the logs even when a job runs for hours or days (a Python sketch of this approach follows below). – Aastha Jain Jun 23 '23 at 00:53
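
A minimal sketch of the logger-based approach from the last comment, translated to Python's standard logging module (assumption: logging.StreamHandler flushes after every record, so each line should reach the log driver as soon as it is emitted):

import logging
import sys

# Send log records to stdout; StreamHandler flushes the stream after each record.
logging.basicConfig(
    stream=sys.stdout,
    level=logging.INFO,
    format="%(asctime)s %(levelname)s %(message)s",
)
log = logging.getLogger(__name__)

log.info("Process Started..")
#some code
log.info("Process Completed.")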

0 Answers