
I'm new to scripting and I'm writing a shell script to copy an S3 object from a bucket in one AWS account to a bucket in another. A pipeline uploads zip files (.zip) to the source bucket, and the script should copy only the last-modified object (file). I know that the S3 CLI can list objects recursively and that sorting the listing gives me the last uploaded object. Is there a more elegant and efficient way of doing it?

# Normal way of copying from one bucket to another
aws s3 cp s3://source-bucket/object-v2.2.7.zip s3://destination-bucket/

# Using --recursive to list every object and pick the most recently modified key
aws s3 ls "s3://$source_bucket" --recursive | sort | tail -n 1 | awk '{print $4}'

get_object=$(aws s3 ls "s3://$source_bucket" --recursive | sort | tail -n 1 | awk '{print $4}')

aws s3 cp "s3://$source_bucket/$get_object" "s3://$destination_bucket/"
echo 'Successfully uploaded file to Destination Bucket'
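Putting those pieces together, this is the whole attempt as one minimal runnable sketch (the bucket names are placeholders, and it assumes object keys contain no whitespace):

#!/usr/bin/env bash
set -euo pipefail  # stop on the first failed command or unset variable

source_bucket="source-bucket"
destination_bucket="destination-bucket"

# Listing lines look like "date time size key"; the ISO date sorts
# lexically, so sort | tail -n 1 yields the newest entry and awk
# pulls out its key column.
get_object=$(aws s3 ls "s3://$source_bucket" --recursive | sort | tail -n 1 | awk '{print $4}')

aws s3 cp "s3://$source_bucket/$get_object" "s3://$destination_bucket/"
echo "Successfully uploaded $get_object to the destination bucket"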
StarJedi
  • be aware that listing all objects in a large bucket is a slow and expensive operation, and that S3 is not guaranteed to be read-after-write consistent. – erik258 Aug 23 '20 at 14:17
  • @DanielFarrell Good to know, do you have any suggestions on how to do this operation more elegantly? – StarJedi Aug 23 '20 at 14:24

2 Answers


This script finds the last modified object and then copies it.

SOURCE_BUCKET=bucket1
DEST_BUCKET=bucket2

# Sort the bucket contents by LastModified and take the key of the newest object
LAST=$(aws s3api list-objects --bucket "$SOURCE_BUCKET" --query 'sort_by(Contents, &LastModified)[-1].Key' --output text)

aws s3 cp "s3://$SOURCE_BUCKET/$LAST" "s3://$DEST_BUCKET"
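Since your pipeline only uploads .zip files, a variation of the same idea can narrow the listing with --prefix and restrict the sort to .zip keys in the JMESPath query. This is a sketch: the releases/ prefix is a placeholder for wherever your pipeline writes, and as erik258 noted in the comments, the call still has to list everything under that prefix, so keep it as specific as you can.

# Sketch: only consider .zip keys under a placeholder prefix
LAST_ZIP=$(aws s3api list-objects-v2 \
  --bucket "$SOURCE_BUCKET" \
  --prefix "releases/" \
  --query "sort_by(Contents[?ends_with(Key, '.zip')], &LastModified)[-1].Key" \
  --output text)

aws s3 cp "s3://$SOURCE_BUCKET/$LAST_ZIP" "s3://$DEST_BUCKET/"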
John Rotenstein

I guess you can refer here to see how to get the last-modified objects in S3.

Once you get the object keys, you can write a loop, and inside the loop call the aws s3 cp command to copy the files. Please let me know if it works for you.
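A minimal sketch of such a loop, assuming the bucket variables are set and the keys contain no whitespace:

for key in $(aws s3 ls "s3://$SOURCE_BUCKET" --recursive | awk '{print $4}'); do
    # Copy each listed key across; add filtering here if you only want some of them
    aws s3 cp "s3://$SOURCE_BUCKET/$key" "s3://$DEST_BUCKET/"
done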

CK__