I was working with the boto3 module in Python and had built a bot that finds publicly accessible buckets, but it only does this for a single user with his own credentials. I am thinking of extending it so the bot fetches all publicly accessible buckets across every AWS user's account. I would like to know whether this is possible; if yes, how, and if not, why?
- Do you mean buckets that belong to your organisation's AWS account, or ANY AWS user's public bucket? For the latter, it is not possible with boto3. – mootmoot Jan 31 '18 at 15:26
- I was thinking of the latter. If not boto3, is there another option? @mootmoot – Akash Feb 01 '18 at 04:10
2 Answers
Look into the method get_bucket_acl().
If a bucket is public, you should see an ACL grant for the Grantee URI http://acs.amazonaws.com/groups/global/AllUsers.
Example:
>>> client.get_bucket_acl(Bucket='PublicBucket')
...
{'Grants': [..., {'Grantee': {'Type': 'Group',
                              'URI': 'http://acs.amazonaws.com/groups/global/AllUsers'},
                  'Permission': 'READ'}], ...}
As you can see, the AllUsers global group is allowed READ access on this bucket.
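A minimal sketch of that ACL check, assuming the standard boto3 S3 client; the helper name is_acl_public is my own invention:

import boto3

client = boto3.client('s3')

# URI of the global AllUsers group, as seen in the ACL output above.
ALL_USERS_URI = 'http://acs.amazonaws.com/groups/global/AllUsers'

def is_acl_public(bucket_name):
    """Return True if any ACL grant targets the global AllUsers group."""
    acl = client.get_bucket_acl(Bucket=bucket_name)
    return any(
        grant['Grantee'].get('Type') == 'Group'
        and grant['Grantee'].get('URI') == ALL_USERS_URI
        for grant in acl['Grants']
    )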
You might also want to check get_bucket_policy() and make sure that if there is a policy on the bucket, it does not allow public access.
Example:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AddPerm",
      "Effect": "Allow",
      "Principal": "*",
      "Action": ["s3:GetObject"],
      "Resource": ["arn:aws:s3:::examplebucket/*"]
    }
  ]
}
"Principal": "*"
indicates everyone will have access to action S3:GetObject
on examplebucket/*
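Checking the policy in code takes one extra step, because get_bucket_policy() raises an error when no policy is attached. A rough sketch (has_public_policy is a hypothetical helper, and the Principal matching shown is deliberately simplistic):

import json
import boto3
from botocore.exceptions import ClientError

client = boto3.client('s3')

def has_public_policy(bucket_name):
    """Return True if the bucket policy allows access to Principal '*'."""
    try:
        raw = client.get_bucket_policy(Bucket=bucket_name)['Policy']
    except ClientError as err:
        # Buckets with no policy raise NoSuchBucketPolicy rather than
        # returning an empty document.
        if err.response['Error']['Code'] == 'NoSuchBucketPolicy':
            return False
        raise
    policy = json.loads(raw)
    return any(
        stmt.get('Effect') == 'Allow'
        and stmt.get('Principal') in ('*', {'AWS': '*'})
        for stmt in policy.get('Statement', [])
    )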
This article might help as well.

- Of course this would work if I knew the names of the public buckets. My problem here is that I want to fetch the names of buckets unknown to me which are public. – Akash Jan 31 '18 at 10:30
- @Akash - you can use `list_buckets()` to get all bucket names, and then iterate one by one and check what I mentioned, as in the sketch below – Eytan Avisror Jan 31 '18 at 10:38
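To illustrate that comment, a minimal sketch that walks every bucket visible to the caller's credentials and applies the two checks from the answer above (is_acl_public and has_public_policy are the hypothetical helpers sketched earlier):

import boto3

client = boto3.client('s3')

# list_buckets() only returns buckets owned by the calling account,
# which is exactly why this cannot enumerate other users' buckets.
for bucket in client.list_buckets()['Buckets']:
    name = bucket['Name']
    if is_acl_public(name) or has_public_policy(name):
        print(name, 'appears to be publicly accessible')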
This is not possible.
There is no way to discover the names of all of the millions of buckets that exist. S3 was announced several years ago to hold at least 2,000,000,000,000 objects, a number that is probably substantially lower than the real figure now. Even if each bucket held 1,000,000 of those objects, that would still mean 2,000,000 buckets to hold them.
You lack both the time and the permission to scan them all, and intuition suggests that AWS Security would start asking questions if you tried.
