
I am using this Python script to upload files that have changed or been newly created from a local folder to an S3 bucket.

The script does not work; it fails when getting the bucket. I am using boto with Python 2.7. I have googled but couldn't find an answer.

Any help much appreciated.

Many thanks.

Here is the error

Traceback (most recent call last):
  File "s3update.py", line 20, in <module>
    bucket = conn.get_bucket(BUCKET_NAME)
  File "/usr/lib/python2.7/site-packages/boto/s3/connection.py", line 506, in get_bucket
    return self.head_bucket(bucket_name, headers=headers)
  File "/usr/lib/python2.7/site-packages/boto/s3/connection.py", line 539, in head_bucket
    raise err
boto.exception.S3ResponseError: S3ResponseError: 403 Forbidden

Here is the code

#!/usr/bin/env python

# Compare a file on S3 to see if we have the latest version  
# If not, upload it and invalidate CloudFront

import fnmatch
import os
import boto
import pprint
import re
import hashlib
from boto.s3.key import Key

# Where source is checked out
SOURCE_DIR  = '/Downloads/local/folder'
BUCKET_NAME = 's3bucket'

# Connect to S3 and get bucket
conn = boto.connect_s3()
bucket = conn.get_bucket(BUCKET_NAME)

# Shortcut to MD5
def get_md5(filename):
    m = hashlib.md5()
    with open(filename, 'rb') as f:
        while True:
            data = f.read(10240)
            if not data:
                break
            m.update(data)
    return m.hexdigest()

def to_uri(filename):
    # Map a local path to its S3 key by stripping the source prefix
    return re.sub(SOURCE_DIR, '', filename)

# Assemble a list of all files from SOURCE_DIR
files = []
for root, dirnames, filenames in os.walk(SOURCE_DIR):
    for filename in filenames:
        files.append(os.path.join(root, filename))

# Compare them to S3 checksums
files_to_upload = []
for f in files:
    uri = to_uri(f)
    key = bucket.get_key(uri)
    if key is None:
        # new file, upload
        files_to_upload.append(f)
    else:
        # check MD5 against the S3 ETag
        md5 = get_md5(f)
        etag = key.etag.strip('"').strip("'")
        if etag != md5:
            print(f + ": " + md5 + " != " + etag)
            files_to_upload.append(f)

# Upload + invalidate the ones that are different
for f in files_to_upload:
    uri = to_uri(f)
    key = Key(bucket)
    key.key = uri
    key.set_contents_from_filename(f)
    # CloudFront invalidation code goes here
alan pter

2 Answers


You should make sure the bucket name is correct and that the bucket exists, as well as that your credentials have permission to access it.
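
For example, you can tell a bad bucket name apart from a permissions problem by catching the exception. A minimal sketch, assuming your credentials are in environment variables or a ~/.boto config:

import boto
from boto.exception import S3ResponseError

conn = boto.connect_s3()

try:
    # get_bucket() issues a HEAD Bucket request, which requires the
    # s3:ListBucket permission on the bucket itself
    bucket = conn.get_bucket('s3bucket')
except S3ResponseError as e:
    if e.status == 404:
        print('Bucket does not exist -- check the name')
    elif e.status == 403:
        print('Bucket exists, but these credentials cannot access it')
    else:
        raise

Alternatively, conn.get_bucket('s3bucket', validate=False) skips the HEAD check entirely, which can work around a 403 when your credentials have object-level permissions but no bucket-level ones.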

devdob
  • I have edit permission on the bucket, as I can create folders within the bucket via an FTP client. – alan pter Nov 11 '17 at 02:00
  • When I faced this issue, this solved my problem: it was a silly permissions issue that I had to specify for AWS. Check out the answer here: https://stackoverflow.com/a/10884964/3140312 – devdob Nov 11 '17 at 02:07

I had an error similar to this, and it was because, while my user's permissions allowed full access to everything inside the bucket, they did not allow actions at the bucket level, such as listing the objects in it.

For me, on the bucket permissions tab, the policy produced by the policy generator originally read:

...
        "Action": "s3:*",
        "Resource": "arn:aws:s3:::py-sync-s3/*"
...

And I had to change it to:

...
        "Action": "s3:*",
        "Resource": [
            "arn:aws:s3:::py-sync-s3",
            "arn:aws:s3:::py-sync-s3/*"
        ]
...

py-sync-s3 being the bucket name.

I know that ideally you would not want to allow all actions with s3:*; that was just for testing.
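
For reference, a tighter policy might look something like the sketch below. The exact action list is an assumption about what this script needs: s3:ListBucket at the bucket level for get_key, and s3:GetObject/s3:PutObject at the object level for the uploads:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "s3:ListBucket",
            "Resource": "arn:aws:s3:::py-sync-s3"
        },
        {
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:PutObject"],
            "Resource": "arn:aws:s3:::py-sync-s3/*"
        }
    ]
}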