You can also get a list of all objects in a bucket if multiple files need to be checked. For a given bucket, call list_objects_v2
and then iterate through the 'Contents' key of the response. For example:
import boto3

s3_client = boto3.client('s3')
response_contents = s3_client.list_objects_v2(
    Bucket='name_of_bucket'
).get('Contents')
You'll get a list of dictionaries like this:
[{'Key': 'path/to/object1', 'LastModified': datetime, 'ETag': '"some etag"', 'Size': 2600, 'StorageClass': 'STANDARD'},
 {'Key': 'path/to/object2', 'LastModified': datetime, 'ETag': '"some etag"', 'Size': 454, 'StorageClass': 'STANDARD'},
 ...]
Notice that each dictionary in the list contains a 'Size' key, which is the size of that particular object in bytes. The list is iterable:
for rc in response_contents:
    if rc.get('Key') == 'path/to/file':
        print(f"Size: {rc.get('Size')}")
You get the sizes of all files you might be interested in:
Size: 2600
Size: 454
Size: 2600
...
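One caveat: a single list_objects_v2 call returns at most 1,000 keys, so for larger buckets you need to follow the continuation tokens. A paginator does this for you. Here's a sketch (`object_sizes` is a hypothetical helper name; the bucket name is a placeholder):

```python
def object_sizes(bucket, client=None):
    """Return a {key: size-in-bytes} dict for every object in the bucket,
    following pagination (list_objects_v2 returns at most 1,000 keys per call)."""
    if client is None:
        import boto3  # assumes boto3 is installed and AWS credentials are configured
        client = boto3.client('s3')
    paginator = client.get_paginator('list_objects_v2')
    sizes = {}
    for page in paginator.paginate(Bucket=bucket):
        # 'Contents' is absent on empty pages, so default to an empty list.
        for obj in page.get('Contents', []):
            sizes[obj['Key']] = obj['Size']
    return sizes

# sizes = object_sizes('name_of_bucket')  # placeholder bucket name
# print(sizes.get('path/to/file'))
```

The optional `client` argument also makes the helper easy to test with a stubbed client.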