
I have a boto3 script that successfully uploads files to an S3 bucket, using my account's AccessKeyId and SecretAccessKey. This works fine.

But I'm supposed to remove my credentials from this instance and only use the IAM role attached to the instance. I've made various attempts but haven't gotten this to work, usually with:

botocore.exceptions.ClientError: An error occurred (InvalidToken) when 
calling the PutObject operation: The provided token is malformed or 
otherwise invalid.

My code:

#!/usr/bin/env python

import datetime
import sys
import os
import ConfigParser
import boto3

s3 = boto3.resource('s3')

config = ConfigParser.ConfigParser()
configfile = config.read('edi.config')

s3bucket = config.get('default', 's3bucket')

s3bucket = s3bucket.strip()

print 's3bucket: ', s3bucket

today = datetime.date.today()

todaystr = today.strftime('%m_%d_%Y')
os.chdir(todaystr)
try:
    os.mkdir('uploaded')
except:
    pass

for f in os.listdir('.'):
    if not os.path.isfile(f):
        continue

    print 'uploading', f
    data = open(f)

    s3.Bucket('ustc-submissions-non-prod').put_object(Key='closewatch/incoming/%s' % f, Body=data)
    os.rename(f, 'uploaded/%s' % f)

I found a note elsewhere that I need to assume the IAM role within boto3, but (a) I don't have permission to do that and (b) I don't have permission to give myself permission and (c) my colleague thinks this shouldn't be necessary anyway.

Anybody got a complete example of this sort of thing?

Mark McWiggins
  • With no code it's a bit difficult to guess, but in all likelihood you need to not pass anything. See [boto3 credentials](http://boto3.readthedocs.io/en/latest/guide/configuration.html) for more information. Your use case is bullet point 8. – stdunbar Oct 04 '17 at 23:52
  • You need to edit your post to include your code (minus the credentials) so we can see where you're falling down. You *definitely* don't need access keys if the instance has the right IAM role. – Miles Erickson Oct 05 '17 at 00:00
  • Can you `aws s3 cp foo.txt s3://bucketname/` at the command line? (`pip install awscli` if the `aws` command is not found) – Miles Erickson Oct 05 '17 at 00:08
  • `aws s3 cp grabedi.sh s3://ustc-submissions-nonprod` fails with: `upload failed: ./grabedi.sh to s3://ustc-submissions-nonprod/grabedi.sh An error occurred (InvalidToken) when calling the PutObject operation: The provided token is malformed or otherwise invalid.` (This is with my credentials removed from .aws/credentials.) – Mark McWiggins Oct 05 '17 at 00:12
  • What does `aws configure list` show you? – Miles Erickson Oct 05 '17 at 00:20
  • By the way, what value does this custom script provide vs. the use of the `aws s3 sync` command? – Miles Erickson Oct 05 '17 at 00:22
  • `aws configure list` shows: `profile None None; access_key ****************FK2Q iam-role; secret_key ****************ajey iam-role; region None None` – Mark McWiggins Oct 05 '17 at 00:22
  • Can we confirm that you're on the current version of `botocore`? `pip freeze | grep botocore`, should be `botocore==1.7.23` as of 2017-10-04. – Miles Erickson Oct 05 '17 at 00:40
  • Trash this EC2 instance. Launch a new one with an appropriate IAM role (one that allows s3:Put* to the bucket in question). Don't try to put any credentials on the instance (do not run aws configure, for example). Install boto3 and your code on the instance, then run your code. As @stdunbar indicates, you don't need to provide any credentials explicitly. – jarmod Oct 05 '17 at 02:11
  • @jarmod's advice to trash the instance and create a new one is solid. – Miles Erickson Oct 05 '17 at 21:30
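
For reference, a minimal sketch of the kind of bucket-scoped policy jarmod describes; the resource ARN and the single s3:PutObject action below are assumptions based on the bucket and key prefix in the question:

    {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Action": "s3:PutObject",
                "Resource": "arn:aws:s3:::ustc-submissions-non-prod/closewatch/incoming/*"
            }
        ]
    }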

3 Answers


Suggestions:

  1. Upgrade and/or reinstall current versions of required packages and dependencies:

    pip install --upgrade --ignore-installed botocore boto3 awscli
    
  2. Verify that aws configure list shows iam-role under Type:

    access_key     ****************DFAB         iam-role
    secret_key     ****************zxQ4         iam-role
    
  3. Simplify your code. If all you want to do is upload a file, you are taking a very long and scenic route! Try this MCVE instead (a fuller sketch of the original upload loop follows this list):

    import boto3
    with open('example.txt', 'w') as f:
        f.write('This is an example.')
    s3 = boto3.client('s3')
    s3.upload_file(Filename='example.txt',
                   Bucket='ustc-submissions-non-prod',
                   Key='example.txt')
    
  4. Once your role issue is resolved: if your business need is to synchronize a local directory with an S3 bucket by uploading new files as they come in, consider using the existing aws s3 sync command instead of a custom script.
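
Building on the MCVE in point 3, here is a sketch of the original upload loop rewritten around upload_file, still with no explicit credentials. The bucket name, key prefix and dated-directory layout are taken from the question; error handling is left minimal:

    import datetime
    import os

    import boto3

    BUCKET = 'ustc-submissions-non-prod'
    PREFIX = 'closewatch/incoming/'

    # No keys are passed: the default credential chain falls back to the
    # instance profile (IAM role) attached to the EC2 instance.
    s3 = boto3.client('s3')

    todaystr = datetime.date.today().strftime('%m_%d_%Y')
    os.chdir(todaystr)

    if not os.path.isdir('uploaded'):
        os.mkdir('uploaded')

    for f in os.listdir('.'):
        if not os.path.isfile(f):
            continue
        print('uploading %s' % f)
        s3.upload_file(Filename=f, Bucket=BUCKET, Key=PREFIX + f)
        os.rename(f, os.path.join('uploaded', f))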

Miles Erickson
  • Thanks for the suggestions; I created a new instance with the same role (FullS3AccessToEC2) but when trying the script you list above I get the 'token' error again: boto3.exceptions.S3UploadFailedError: Failed to upload example.txt to ustc-submissions-non-prod/example.txt: An error occurred (InvalidToken) when calling the PutObject operation: The provided token is malformed or otherwise invalid. – Mark McWiggins Oct 05 '17 at 16:24
  • Do you see the same error when running `aws s3 ls` at a bash prompt? – Miles Erickson Oct 05 '17 at 21:29
  • BTW this is with a new instance. I wasn't able to trash the old one because it's in "production" (a test, anyway). – Mark McWiggins Oct 06 '17 at 00:04
  • this screwed up my requirements.txt file. I am getting `RequestsDependencyWarning: urllib3 (1.25.4) or chardet (3.0.4) doesn't match a supported version!` – user9869932 Sep 19 '19 at 19:47

It looks like the problem must be a bug in the latest version of boto3.

I tried it in Ruby:

require 'aws-sdk-s3'

s3 = Aws::S3::Resource.new(region:'us-west-2')
obj = s3.bucket('bucket-name').object('key')
obj.upload_file('/path/to/source/file')

This worked using only the IAM role as I needed!

Mark McWiggins
  • I think you now have enough information to open an issue here: https://github.com/boto/boto3/issues – Miles Erickson Oct 11 '17 at 17:01
  • It works with Ruby because you specified the region when you instantiated the resource. – emulanob Apr 15 '19 at 14:22
  • I was getting the Token error when I hadn't set my region with `aws configure` - even though I had an IAM role supplying all the other credentials (I just hit Enter through the other prompts), I had to specify the proper region. After doing this you should have a ~/.aws/config file with just the region value listed. – Matt Mar 16 '21 at 22:56
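
For reference, the kind of file Matt describes; the region value here is only an example:

    # ~/.aws/config
    [default]
    region = us-west-2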

You probably forgot to specify the AWS region:

import boto3

session = boto3.session.Session()
s3_resource = session.resource(
    "s3",
    region_name="us-east-1",  # example
    aws_access_key_id="your_access_key_id",
    aws_secret_access_key="your_secret_access_key",
    aws_session_token="your_session_token",
)
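
If the credentials are meant to come from the instance's IAM role, as in the question, the same idea works without passing any keys; a minimal sketch (the region value is an assumption):

    import boto3

    # With an instance profile attached, boto3 picks up temporary credentials
    # automatically; only the region may need to be set explicitly.
    session = boto3.session.Session(region_name="us-east-1")
    s3_resource = session.resource("s3")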
user1315621