I have a Java application (Maven build) that creates JSON files, uploads them to S3, and records a log message with the file name after each upload. The logs are stored in CloudWatch, and I am using "awslogs" as the log driver. When I run the code in ECS as a Docker container, the files upload to S3 and each file gets a log message. However, when I stop the container by changing the desired count for the task to 0, the log message for the last file uploaded to S3 is missing, while every other file in S3 has a matching log message. What could be the reason for this gap between the logs and the S3 data?
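To make the flow concrete, this is roughly what the code does (a simplified sketch; the class, bucket name, and method are placeholders, not my real values, using the AWS SDK for Java v2 and SLF4J):

```java
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import software.amazon.awssdk.core.sync.RequestBody;
import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.model.PutObjectRequest;

public class JsonUploader {
    private static final Logger log = LoggerFactory.getLogger(JsonUploader.class);
    private final S3Client s3 = S3Client.create();

    void upload(String key, String json) {
        s3.putObject(PutObjectRequest.builder()
                        .bucket("my-bucket")   // placeholder bucket name
                        .key(key)
                        .build(),
                RequestBody.fromString(json));
        // One log line per successful upload; these lines go to stdout
        // and are shipped to CloudWatch by the awslogs driver.
        log.info("Uploaded file {}", key);
    }
}
```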
My guess is that since the file was successfully uploaded to S3, the log message for it was generated but never flushed to CloudWatch before the container was killed.
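If buffering is the cause, I assume a JVM shutdown hook along these lines would force a flush before the process exits (a minimal sketch, assuming Log4j 2; my actual logging setup may differ):

```java
import org.apache.logging.log4j.LogManager;

public final class LogFlushHook {
    public static void register() {
        // ECS sends SIGTERM when the task is stopped and waits for
        // stopTimeout (30 seconds by default) before SIGKILL, so a
        // shutdown hook has a window in which to flush pending events.
        // Note: Log4j 2 normally registers its own shutdown hook; an
        // explicit one like this is only needed if that was disabled.
        Runtime.getRuntime().addShutdownHook(
                new Thread(LogManager::shutdown, "log-flush-hook"));
    }
}
```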
As soon as I terminate the container by setting the desired task count from 1 to 0, whichever file was being uploaded at that moment ends up in S3 (confirming the program processed it successfully) but has no log entry in CloudWatch. For every other file in S3, the logs are available. Why is this happening?