I'm adapting a bash script to Python. The script invokes the aws CLI to download a file from S3:
aws s3 cp s3://some_bucket/some/key /some/path
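The most direct translation would be to shell out to the same command. A minimal sketch of that (standard library only, using the same placeholder bucket, key, and path as above) would be:

import subprocess

# Run the same aws s3 cp invocation; raises CalledProcessError on a nonzero exit.
subprocess.check_call(
    ["aws", "s3", "cp", "s3://some_bucket/some/key", "/some/path"]
)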
I could, of course, just call aws from Python that way, but it feels better to use a library like boto: I can customize the operations more, and the exceptions raised will be more specific than CalledProcessError, etc. So I wrote:
import boto

# Fetch the object, decode it, and write it out with a normalized trailing newline.
s3 = boto.connect_s3()
bucket = s3.get_bucket("some_bucket")
key = bucket.get_key("some/key")
contents = key.get_contents_as_string().decode("utf-8").strip()

path = "/some/path"
with open(path, "w") as f:
    f.write(contents + "\n")
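(As an aside, I believe the closer one-call equivalent of aws s3 cp would be writing straight to disk with

key.get_contents_to_filename(path)

but I used get_contents_as_string so I could strip the contents and normalize the trailing newline.)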
Although the boto code above works on my desktop (which authenticates with access keys), it does not work on the EC2 instance I need it to run on (which uses an IAM role); instead, it hangs at the get_bucket() call. I would suspect IAM permissions, but the aws s3 cp command above works fine from the instance. I tried to look through the source of the aws CLI, but it's quite complicated and it doesn't seem to be using boto.
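In case it helps narrow things down, my next step is to turn on boto's debug logging before connecting (assuming boto.set_stream_logger is the right way to do that), so I can see whether it's the instance-metadata credential lookup or the S3 request itself that stalls:

import boto

boto.set_stream_logger("boto")          # send boto's debug output to stderr
s3 = boto.connect_s3()
bucket = s3.get_bucket("some_bucket")   # hangs here on the EC2 instance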
What would cause the two to act differently, and is there a way I can adapt my boto usage so that it works the same as the CLI?