64

For example, I have this code:

import boto3

s3 = boto3.resource('s3')

bucket = s3.Bucket('my-bucket-name')

# Does it exist???
helloV
Daniel

6 Answers

89

At the time of this writing there is no high-level way to quickly check whether a bucket exists and you have access to it, but you can make a low-level call to the HeadBucket operation. This is the least expensive way to do this check:

from botocore.exceptions import ClientError

try:
    # `s3` and `bucket` are the resource and Bucket defined in the question above.
    s3.meta.client.head_bucket(Bucket=bucket.name)
except ClientError:
    # The bucket does not exist or you have no access.
    pass
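
If you just want a boolean, a minimal sketch that wraps the same check in a helper could look like this (the name bucket_exists is only for illustration):

import boto3
from botocore.exceptions import ClientError

s3 = boto3.resource('s3')

def bucket_exists(bucket_name):
    """Return True if the bucket exists and is accessible to us."""
    try:
        s3.meta.client.head_bucket(Bucket=bucket_name)
        return True
    except ClientError:
        # Either the bucket does not exist or we have no access to it.
        return False

print(bucket_exists('my-bucket-name'))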

Alternatively, you can call create_bucket. In the us-east-1 region the call will either create the bucket or just return the existing one, which is useful if you are checking existence to know whether you should create the bucket; as the comments below point out, other regions instead raise a BucketAlreadyOwnedByYou error for a bucket you already own:

bucket = s3.create_bucket(Bucket='my-bucket-name')
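
If you take this route, a minimal hedged sketch that also handles the regional behavior mentioned in the comments below might look like this (the bucket name is a placeholder):

import boto3
from botocore.exceptions import ClientError

s3 = boto3.resource('s3')

try:
    # In us-east-1 an existing bucket you own is simply returned; in other
    # regions S3 raises BucketAlreadyOwnedByYou instead (see comments below).
    # Outside us-east-1 a CreateBucketConfiguration with the target region is
    # normally required as well; us-east-1 is assumed here.
    bucket = s3.create_bucket(Bucket='my-bucket-name')
except ClientError as e:
    if e.response['Error']['Code'] == 'BucketAlreadyOwnedByYou':
        # The bucket already exists and this account owns it.
        bucket = s3.Bucket('my-bucket-name')
    else:
        raise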

As always, be sure to check out the official documentation.

Note: Before the 0.0.7 release, meta was a Python dictionary.

Daniel
  • is this also the best method for checking existence of objects, to call `head_object()` and handle the error? as opposed to `key in bucket.objects.all()`? (especially if you do not intend to actually `get()` the object?) – Christopher Pearson Jul 24 '15 at 20:39
  • 2
    @ChristopherPearson it is usually better to use `head_object()` because it makes only a single small request, whereas `key in bucket.objects.all()` fetches all of the object information (possibly multiple requests, one per page of results) and then looks for your key in those results (see the sketch after this comment thread). – Daniel Jul 28 '15 at 09:32
  • 11
    Just a small clarification: `create_bucket()` returns a `BucketAlreadyOwnedByYou` error in all AWS regions except the US East (N. Virginia) region, while in `us-east-1` you will get 200 OK. Using `head_bucket()` is actually the [correct way to go](http://boto3.readthedocs.io/en/latest/guide/migrations3.html#accessing-a-bucket). – lec00q Apr 07 '17 at 11:08
  • 1
    Unfortunately, calling create_bucket on an existing bucket, created in a different region will raise the following exception: `ClientError: An error occurred (BucketAlreadyOwnedByYou) when calling the CreateBucket operation: Your previous request to create the named bucket succeeded and you already own it.` – siesta May 19 '17 at 04:31
  • Direct link to the documentation: http://boto3.readthedocs.io/en/latest/guide/migrations3.html#accessing-a-bucket – Cjkjvfnby Sep 05 '17 at 07:30
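
Regarding the comment above about checking whether an object (rather than the bucket) exists, a minimal sketch of the `head_object()` approach could look like this (bucket and key names are placeholders):

import boto3
from botocore.exceptions import ClientError

s3 = boto3.client('s3')

def object_exists(bucket_name, key):
    """Return True if the key exists in the bucket, based on a HeadObject call."""
    try:
        s3.head_object(Bucket=bucket_name, Key=key)
        return True
    except ClientError as e:
        if e.response['Error']['Code'] == '404':
            return False
        # A 403 or other error does not tell us the object is missing.
        raise

print(object_exists('my-bucket-name', 'some/key.txt'))
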
36

I've had success with this:

import boto3

s3 = boto3.resource('s3')
bucket = s3.Bucket('my-bucket-name')

if bucket.creation_date:
    print("The bucket exists")
else:
    print("The bucket does not exist")
href_
  • 8
    This is the best solution IMO because: 1) it doesn't require ListBuckets which can be expensive; 2) it doesn't require going down to the low-level client API – Oliver Feb 07 '19 at 16:56
  • @ciastek you missed the point. @Oliver meant that you don't have to use the low-level client yourself to perform this action. Sure, internally it might make the call, but that's invisible to the programmer. Please go back and re-read the question and answer. – Urda Jun 19 '19 at 18:14
  • 3
    Does anyone know the minimal permission required to do `s3.Bucket().creation_date`? – Shawn Aug 23 '21 at 08:52
  • 2
    Getting a `creation_date` of `None` for a bucket that exists and that I can upload to. – Fabien Snauwaert Jul 18 '22 at 09:15
  • 2
    P.S.: a good half hour later, `creation_date` is no longer `None`. It was a new bucket, so I take it there's a lag somewhere after bucket creation. – Fabien Snauwaert Jul 18 '22 at 09:49
35

As mentioned by @Daniel, the best way, as suggested by the Boto3 docs, is to use head_bucket():

head_bucket() - This operation is useful to determine if a bucket exists and you have permission to access it.

If you have a small number of buckets, you can use the following:

>>> import boto3
>>> s3 = boto3.resource('s3')
>>> s3.Bucket('Hello') in s3.buckets.all()
False
>>> s3.Bucket('some-docs') in s3.buckets.all()
True
>>> 
helloV
  • 7
    Yes, this will work assuming you are the bucket owner, however it will call the ListBuckets operation, which is slightly more expensive than a HeadBucket operation. For low call volumes it will cost the same, but if you are checking many buckets it can add up over time! Additionally, the collection creates resource instances after parsing the response while the `head_bucket` call just returns the low-level response without extra processing. – Daniel Nov 13 '14 at 06:18
27

I tried Daniel's example and it was really helpful. I followed up with the boto3 documentation, and here is my cleaned-up test code. I have added a check for the '403' error that private buckets return, printing a 'Forbidden!' message in that case.

import boto3
import botocore.exceptions

s3 = boto3.resource('s3')
bucket_name = 'some-private-bucket'
#bucket_name = 'bucket-to-check'

bucket = s3.Bucket(bucket_name)

def check_bucket(bucket):
    try:
        s3.meta.client.head_bucket(Bucket=bucket.name)
        print("Bucket Exists!")
        return True
    except botocore.exceptions.ClientError as e:
        # If a client error is thrown, check the HTTP status code:
        # 403 means the bucket exists but access is forbidden,
        # 404 means the bucket does not exist.
        error_code = int(e.response['Error']['Code'])
        if error_code == 403:
            print("Private Bucket. Forbidden Access!")
            return True
        elif error_code == 404:
            print("Bucket Does Not Exist!")
            return False
        raise

check_bucket(bucket)

Hope this helps someone new to boto3 like me.

mondnom
-3

Use the boto 2 lookup() function, which returns None if the bucket does not exist:

import boto

s3 = boto.connect_s3()  # boto 2 connection; uses credentials from config/environment

if s3.lookup(bucketName) is None:
    bucket = s3.create_bucket(bucketName)  # bucket doesn't exist, so create it
else:
    bucket = s3.get_bucket(bucketName)     # bucket exists
Roman Marusyk
-4

You can use conn.get_bucket() from the legacy boto 2 library:

from boto.s3.connection import S3Connection
from boto.exception import S3ResponseError    

conn = S3Connection(aws_access_key, aws_secret_key)

try:
    bucket = conn.get_bucket(unique_bucket_name, validate=True)
except S3ResponseError:
    bucket = conn.create_bucket(unique_bucket_name)

quoting the documentation at http://boto.readthedocs.org/en/latest/s3_tut.html

As of Boto v2.25.0, this now performs a HEAD request (less expensive but worse error messages).

vim