I have 3 files in my S3 bucket named abc.gz, efg.gz and hij.gz. After an hour, the same files arrive again with the same names. I have already downloaded the earlier copies, and since the names are identical the new downloads either overwrite them or end up as abc.gz.1, efg.gz.1 and hij.gz.1. I have been using the command "aws s3 cp s3://my-bucket-bang . --recursive", which downloads every file in the bucket. I only want the new files that get pushed to my S3 bucket each hour. I am looking for a bash or Linux solution.
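For reference, this is roughly how I run it at the moment (the cron entry and paths below are just placeholders, not my exact setup):

```bash
# Scheduled hourly from cron, e.g.:
#   0 * * * * /home/me/fetch_from_s3.sh
# The script only does a blanket copy, so it re-downloads everything
# every hour instead of just the newly pushed objects:
aws s3 cp s3://my-bucket-bang . --recursive
```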
The AWS LastModified timestamp is shown in this format: "Aug 16, 2020 12:49:45 AM GMT+0530".

How can I compare each file's timestamp against the last one I downloaded and fetch only the newer files?
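This is the rough idea I have so far (only a sketch; the bucket name and state-file path are placeholders, and note that `aws s3api` returns LastModified in ISO 8601 form rather than the console format shown above): remember the time of the last run in a file, list each object's LastModified, and copy down only the objects that are newer. Is this a sensible approach, or is there a cleaner way?

```bash
#!/usr/bin/env bash
# Sketch only -- bucket name and state-file location are placeholders.
BUCKET="my-bucket-bang"
STATE_FILE="$HOME/.s3_last_run"          # epoch seconds of the previous run

last_run=$(cat "$STATE_FILE" 2>/dev/null || echo 0)

# List every key with its LastModified timestamp (ISO 8601 here,
# e.g. 2020-08-15T19:19:45+00:00, not the console's "Aug 16, 2020 ..." form).
aws s3api list-objects-v2 --bucket "$BUCKET" \
    --query 'Contents[].[Key,LastModified]' --output text |
while read -r key modified; do
    obj_epoch=$(date -d "$modified" +%s)   # GNU date parses the ISO timestamp
    if [ "$obj_epoch" -gt "$last_run" ]; then
        aws s3 cp "s3://$BUCKET/$key" .    # fetch only objects newer than the last run
    fi
done

# Remember when this run happened so the next run skips the older objects.
date +%s > "$STATE_FILE"
```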