How do I upload multiple files to AWS S3?
I tried two ways and both failed:
1) s3cmd shows the following error, even though the file is only 270 KB:
$s3cmd put file_2012_07_05_aa.gz s3://file.s3.oregon/
file_2012_07_05_aa.gz -> s3://file.s3.oregon/file_2012_07_05_aa.gz [1 of 1]
45056 of 272006 16% in 1s 25.62 kB/s failed
WARNING: Upload failed: /file_2012_07_05_aa.gz ([Errno 32] Broken pipe)
WARNING: Retrying on lower speed (throttle=0.00)
WARNING: Waiting 3 sec...
2) Using boto's S3 interface.
The boto library works for me only when I create the bucket in "US Standard"; if I choose another region such as Oregon, the upload fails with "Connection reset by peer":
import sys
from boto.s3.connection import S3Connection

def connect_to_s3(access_key, secret_key):
    conn = S3Connection(access_key, secret_key)
    return conn

def percent_cb(complete, total):
    # progress callback: print one dot per reported chunk
    sys.stdout.write('.')
    sys.stdout.flush()

def upload_to_s3(bucket, file_name):
    key = bucket.new_key(file_name)
    key.set_contents_from_filename(file_name, cb=percent_cb, num_cb=10)
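For reference, here is roughly the multi-file upload loop I am trying to build (a sketch only, assuming boto 2.x; `upload_many` and `find_files` are my own helper names, and I am guessing that the region-aware `boto.s3.connect_to_region` might avoid the "Connection reset by peer" problem with non-US-Standard buckets):

```python
import glob
import os

def find_files(pattern):
    """Return sorted local paths matching a glob pattern, e.g. 'file_2012_07_05_*.gz'."""
    return sorted(glob.glob(pattern))

def upload_many(access_key, secret_key, bucket_name, pattern, region="us-west-2"):
    """Upload every local file matching `pattern` to `bucket_name`.

    Uses boto 2.x's region-aware connection instead of a bare S3Connection,
    which I suspect is needed for buckets outside US Standard (assumption).
    """
    from boto.s3 import connect_to_region  # boto 2.x only
    conn = connect_to_region(region,
                             aws_access_key_id=access_key,
                             aws_secret_access_key=secret_key)
    bucket = conn.get_bucket(bucket_name)
    for path in find_files(pattern):
        key = bucket.new_key(os.path.basename(path))
        key.set_contents_from_filename(path)
```

I have not confirmed this works for the Oregon region; it is just the shape of what I want to end up with.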