
I have this link:

s3://some_path/200_files/*.gz

I have the corresponding ACCESS ID and SECRET KEY. How do I copy the complete folder (200_files), or all the .gz files, to my local system? An Ubuntu CLI or Python-based solution would work. I understand this may not be an up-to-the-mark question; answers in comments would work. Thanks :)


1 Answer


To copy all objects in an S3 bucket to your local machine, simply use the aws s3 cp command with the --recursive option.

See: http://bigdatums.net/2016/09/04/copy-all-files-in-s3-bucket-to-local-with-aws-cli/
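For example, a sketch assuming "some_path" is the bucket name from the question's link and the files should land in a local 200_files directory (the --exclude/--include pair restricts the copy to .gz objects):

aws s3 cp s3://some_path/200_files/ ./200_files --recursive --exclude "*" --include "*.gz"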

To set the credentials:

mkdir ~/.aws
touch ~/.aws/credentials

~/.aws/credentials (sample content)

[default]
aws_access_key_id=AKIAIOSFODNN7EXAMPLE
aws_secret_access_key=wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY

More configuration options here: https://docs.aws.amazon.com/cli/latest/userguide/cli-config-files.html
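If a Python-based solution is preferred (as the question allows), here is a minimal boto3 sketch along the same lines. The bucket name and key prefix are assumptions taken from the question's s3:// link; boto3 reads the same ~/.aws/credentials file (or environment variables) automatically:

import os
import boto3

BUCKET = "some_path"   # assumed: bucket name from the question's s3:// link
PREFIX = "200_files/"  # assumed: key prefix from the question's s3:// link

# Uses the default credential chain (~/.aws/credentials or env vars)
s3 = boto3.client("s3")

os.makedirs("200_files", exist_ok=True)

# Paginate over all objects under the prefix and download only the .gz files
paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket=BUCKET, Prefix=PREFIX):
    for obj in page.get("Contents", []):
        key = obj["Key"]
        if key.endswith(".gz"):
            local_path = os.path.join("200_files", os.path.basename(key))
            s3.download_file(BUCKET, key, local_path)
            print("downloaded", key)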

  • Thanks. I had already gone through that link, I just thought that is meant for AWS shell, within the cloud. Should work, will let you know. – Rakmo Jun 24 '18 at 10:23
  • How do I include credentials? Do I need to set them as environment variables? – Rakmo Jun 24 '18 at 10:27
  • You need to configure your AWS client. Instructions here: https://docs.aws.amazon.com/cli/latest/userguide/cli-config-files.html Environment variables are also an option (https://docs.aws.amazon.com/cli/latest/userguide/cli-environment.html); see the sketch after these comments. – David Jun 24 '18 at 10:29
  • Thanks. For those who follow this link, you need to manually create the **~/.aws** directory and the **credentials** file within it, with the content given in the link. – Rakmo Jun 24 '18 at 10:35
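For the environment-variable option mentioned in the comments, a minimal sketch (using the same sample credentials as the file above):

export AWS_ACCESS_KEY_ID=AKIAIOSFODNN7EXAMPLE
export AWS_SECRET_ACCESS_KEY=wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY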