29

I am trying to upload multiple files from my local machine to an AWS S3 bucket.
I am able to use aws s3 cp to copy files one by one,
but I need to upload several selective files (not all of them) to the same S3 folder.
Is it possible to do this in a single AWS CLI call, and if so, how?

E.g.:

aws s3 cp test.txt s3://mybucket/test.txt

Reference -
https://docs.aws.amazon.com/cli/latest/reference/s3/cp.html

desertnaut
Dev1ce

5 Answers

44

If you scroll down the documentation link you provided to the section entitled "Recursively copying local files to S3", you will see the following:

When passed with the parameter --recursive, the following cp command recursively copies all files under a specified directory to a specified bucket and prefix while excluding some files by using an --exclude parameter. In this example, the directory myDir has the files test1.txt and test2.jpg
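
The documented example command for that scenario looks roughly like this (paraphrasing from the linked page, so the exact file names there may differ slightly):

aws s3 cp myDir s3://mybucket/ --recursive --exclude "*.jpg"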

So, assuming you wanted to copy all .txt files in some subfolder to the same bucket in S3, you could try something like:

aws s3 cp yourSubFolder s3://mybucket/ --recursive

If there are any other files in this subfolder, you need to add the --exclude and --include parameters (otherwise all files will be uploaded):

aws s3 cp yourSubFolder s3://mybucket/ --recursive --exclude "*" --include "*.txt"
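
If you want to check which files the filters will pick up before actually uploading anything, the aws s3 commands also accept a --dryrun flag, which only prints the operations that would be performed:

aws s3 cp yourSubFolder s3://mybucket/ --recursive --exclude "*" --include "*.txt" --dryrun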
george007
Tim Biegeleisen
2

If you're doing this from bash, then you can use this pattern as well:

for f in *.png; do aws s3 cp "$f" s3://my/dest/; done

You would of course customize *.png to be your glob pattern, and the s3 destination.

If you have an arbitrary set of files, you can put their names in a text file, say filenames.txt, and then:

for f in $(cat filenames.txt); do aws s3 cp "$f" s3://my/dest/; done
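
Note that the $(cat ...) form breaks on file names containing spaces; a slightly more robust sketch (same hypothetical filenames.txt and destination as above) reads the list line by line:

while IFS= read -r f; do aws s3 cp "$f" s3://my/dest/; done < filenames.txt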
Tyler
0
aws s3 cp <your directory path> s3://<your bucket name>/ --recursive --exclude "*.jpg" --include "*.log"
George Smith
0

If you use s3cmd (rather than the aws CLI), you can specify as many source files as you want.

For example, to upload all *.iso files in the current directory:

$ s3cmd put --no-preserve --multipart-chunk-size-mb=50 *.iso "s3://my-bucket/backups/"

Also, s3cmd handles uploads of large files well, since it automatically retries individual parts that temporarily fail to upload.
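
Because put accepts multiple source arguments, you can also list the files explicitly instead of using a shell glob; a minimal sketch with placeholder file names:

$ s3cmd put backup1.iso backup2.iso notes.txt "s3://my-bucket/backups/"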

David Foster
0

An easy way when you cannot copy a file from your local machine to the cluster:

Step 1. Add a file using nano (or copy one from your local machine to the cluster), e.g.

>> nano test_folder/abc.txt

Press CTRL+X, then Y to confirm saving, then Enter, and nano will exit.

Step 2. Run pwd to get the location of the source file:

>> pwd
/home/mynamehpb

The full path of the file is then /home/mynamehpb/test_folder/abc.txt

Step 3. Use the aws s3 cp command to copy the file to the destination:

aws s3 cp /home/mynamehpb/test_folder/abc.txt s3://company-bucket/folder/abc.txt

Step 4. If server-side encryption is required, use the --sse flags (here assuming ABC123 is a KMS key ID):

aws s3 cp /home/mynamehpb/test_folder/abc.txt s3://company-bucket/folder/abc.txt --sse aws:kms --sse-kms-key-id ABC123

This should do the job when you are unable to copy files from your local machine to the cluster.
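
To verify that the upload landed where you expect, you can list the destination prefix (same hypothetical bucket and folder as above):

aws s3 ls s3://company-bucket/folder/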

Hari_pb