
I'm trying to rename a file in my S3 bucket using Python boto3, but I couldn't clearly understand the arguments. Can someone help me here?

What I'm planning to do is copy the object to a new object, and then delete the original object.

I found similar questions here, but I need a solution using boto3.


3 Answers


I found another solution

import boto3

s3 = boto3.resource('s3')
# Note: the CopySource string includes the bucket name, i.e. 'bucket/key'
s3.Object('my_bucket', 'new_file_key').copy_from(CopySource='my_bucket/old_file_key')
s3.Object('my_bucket', 'old_file_key').delete()
  • I like this solution, thanks. A tip if anyone has the same stumble as I did. On first read, I missed that the key passed to CopySource *includes the bucket name*. The cool thing is that this means you can copy between buckets, but I was thrown because I was just trying to change a key within the same bucket, and didn't think to prepend it. When I did this, I was getting permissions errors instead of a more sensible NoSuchBucket error. This confused me yet more! Hopefully folks can skip this pitfall now. – t1m0 Jan 08 '16 at 18:42
  • I used the dictionary format for CopySource (the string format wasn't working for me): `CopySource='string' or {'Bucket': 'string', 'Key': 'string', 'VersionId': 'string'}` based off of http://boto3.readthedocs.io/en/latest/reference/services/s3.html#S3.Client.copy_object – Kurtis Jun 20 '16 at 14:40
  • Is it possible that the object got deleted before copied? How do we ensure the sequence of the last two statements? – Randy Tang Mar 13 '19 at 06:04
  • the only problem is it will trigger again events if you have them attached to the bucket. – Sławomir Lenart Apr 03 '20 at 09:42
  • isn't this inefficient if you actually just want to move the file (really change pointer location)? Since here we have to actually duplicate the bytes. Is there no way just to change the ref (ie a traditional mv operation)? – hurlbz Jan 29 '21 at 16:51
  • None of these solutions seem to preserve metadata, such as LastModified. – odigity Feb 13 '23 at 22:41
  • How can i rename multiple files inside an S3 bucket:```s3_client = boto3.client("s3") BUCKET_NAME_RAW = bucket_name_raw PREFIX = f"transactions/wcs/dim_promotion/{str_current_date}/" NEW_FILE_NAME = f"dim_promotion_{str_current_datetime}.csv" resp = s3_client.list_objects_v2(Bucket=bucket_name_raw, Prefix=PREFIX) for obj in resp['Contents']: files = obj['Key'] copy_source = {'Bucket' : BUCKET_NAME_RAW, 'Key': files} s3_client.copy_object(Bucket=BUCKET_NAME_RAW, CopySource=copy_source, Key=PREFIX + NEW_FILE_NAME)``` This code renames only first file. Please help. – Jomy Aug 22 '23 at 05:29

You cannot rename objects in S3, so as you indicated, you need to copy the object to a new name and then delete the old one:

import boto3

client = boto3.client('s3')
client.copy_object(Bucket="BucketName", CopySource="BucketName/OriginalName", Key="NewName")
client.delete_object(Bucket="BucketName", Key="OriginalName")
  • I'm getting following error: botocore.exceptions.ClientError: An error occurred (NoSuchBucket) when calling the CopyObject operation: The specified bucket does not exist I set Bucket='xyz-abc-yzd' where ''xyz-abc-yzd' is my bucket name is there any convention to be followed while setting bucket name/ key? – MikA Sep 10 '15 at 13:22
  • Same bucket I'm able to list using 'list_buckets()' – MikA Sep 10 '15 at 13:34
  • It might save some time for other users: the `CopySource` parameter should contain both `BucketName` and `KeyName`. So `OriginalName` is NOT exactly the object key. – Trein Dec 13 '15 at 15:26
  • @Trein - Thanks! I was turning the air blue over here wondering wtf. – Darragh Enright Jan 12 '16 at 11:41
  • The parameters have changed, as @Trein mentioned. `CopySource` should have the following structure: `{'Bucket': 'string', 'Key': 'string', 'VersionId': 'string'}`. Here's the documentation for reference: http://boto3.readthedocs.io/en/latest/reference/services/s3.html#S3.Client.copy_object – user666 Mar 23 '17 at 18:36
  • Performance-wise, I've found that `client.copy_object()` is way slower than `client.copy()`. It's because the latter uses multithreading to upload the object in chunks. – Radu Sep 13 '18 at 08:22

Following the examples in the updated Boto3 documentation for the copy() method (which also work with copy_object()), this appears to be the required syntax now:

import boto3

s3 = boto3.client('s3')
copy_source = {'Bucket': 'source_bucket', 'Key': 'my_folder/my_file'}
s3.copy_object(CopySource=copy_source, Bucket='dest_bucket', Key='new_folder/my_file')
s3.delete_object(Bucket='source_bucket', Key='my_folder/my_file')

Note from documentation linked above:

CopySource (dict) -- The name of the source bucket, key name of the source object, and optional version ID of the source object. The dictionary format is: {'Bucket': 'bucket', 'Key': 'key', 'VersionId': 'id'}. Note that the VersionId key is optional and may be omitted.

  • I think you mean `client` instead of `s3`, because in boto3 v1.9.83 `'s3.ServiceResource' object has no attribute 'copy_object'`. Take a look at @MikA's answer, which uses the resource API to copy. – Joe Haddad Jan 23 '19 at 20:45
  • This worked for me. `s3` should be an S3 client, not a resource. – Avinash Dalvi Oct 01 '19 at 06:32