
I have 3 files in my s3 bucket named abc.gz, efg.gz and hij.gz. After an hour the same files arrive again with the same names (or they may get saved locally as abc.gz.1, efg.gz.1 and hij.gz.1, since I have already downloaded the earlier ones). I have used the command "aws s3 cp s3://my-bucket-bang . --recursive", which downloads all the files in that directory. I only want the new files that are pushed to my s3 bucket every hour. I am looking for a bash or Linux solution.

The AWS last-modified timestamp is in this format: "Aug 16, 2020 12:49:45 AM GMT+0530".

How can I compare the latest files against the previously downloaded ones and download only the new files?
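For reference, a minimal sketch of the timestamp-comparison idea; the marker file ./last_sync and the local directory ./downloads are made-up names for illustration, not part of the original commands:

#!/usr/bin/env bash
# Hypothetical sketch: download only objects modified since the previous run.
# BUCKET, MARKER and DEST are illustrative names, not from the original post.
set -euo pipefail

BUCKET="s3://my-bucket-bang"
MARKER="./last_sync"      # holds the epoch time of the previous run
DEST="./downloads"

mkdir -p "$DEST"
last_run=$(cat "$MARKER" 2>/dev/null || echo 0)

# "aws s3 ls" prints lines like: 2020-08-16 00:49:45      12345 abc.gz
aws s3 ls "$BUCKET/" | while read -r day time _size key; do
    [ -z "$key" ] && continue
    modified=$(date -d "$day $time" +%s)   # GNU date, fine on Linux
    if [ "$modified" -gt "$last_run" ]; then
        aws s3 cp "$BUCKET/$key" "$DEST/$key"
    fi
done

date +%s > "$MARKER"

Note that "aws s3 ls" prints timestamps in the local time zone of the machine running it. The sync-based answer below avoids this bookkeeping entirely.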

Aditya Verma
  • Why do we allow amazon-s3 to benefit from free community support here? Is Amazon funding StackOverflow in a way? It is a shame that Amazon AWS users need help with a closed-source commercial product and this help does not come from Amazon itself. Doesn't Amazon provide support itself instead of abusing community support for free? – Léa Gris Aug 15 '20 at 20:28

1 Answer

aws s3 sync s3://my-bucket-bang/ folder

No need to do fancy time comparisons; just sync the folder with your S3 bucket. Here is some related documentation: Selective file download in AWS S3 CLI
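Since the files land every hour, a cron entry that re-runs the sync is usually enough; sync only copies objects that are new or have changed (it compares size and last-modified time). A minimal sketch, where the local path /home/user/s3-downloads and the log path are assumptions:

# Crontab entry (crontab -e): re-sync at five minutes past every hour.
# Only objects missing or changed in the local folder are downloaded.
5 * * * * aws s3 sync s3://my-bucket-bang/ /home/user/s3-downloads/ >> /home/user/s3-sync.log 2>&1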

benjessop