I am planning for a use case in which my S3 bucket is used by 10 different users. Each of these users has a separate folder within this bucket where they'll store their files. Now I want to know how much storage (and thereby how much cost) each of these users consumes. I know there are cost allocation tags, but those apply at the bucket level. Is there any way I can get this cost, maybe using a Lambda script or some other approach? Any help would really be appreciated.
- S3 doesn't actually *have* folders. The console and some clients will treat `/` in object keys *like* a folder, but they're not actually folders behind the scenes. – ceejayoz Oct 01 '19 at 19:30
- @ceejayoz, yes, I understand that. I have this use case for some customers, and I'm worried about how to tackle it. I can't assign each of them a separate bucket because of the hard limit on the number of buckets within an account. – serverstackqns Oct 02 '19 at 05:05
- You said your use case is 10 users. AWS permits up to 1,000 buckets in an account. https://docs.aws.amazon.com/AmazonS3/latest/dev/BucketRestrictions.html – ceejayoz Oct 02 '19 at 13:32
- @ceejayoz, OK, I was asking from a real-world perspective. If those users are my customers, there might be cases where I have more than 1,000 customers. So that's not a workable solution. – serverstackqns Oct 02 '19 at 16:30
- You could use S3's ListObjects API with a prefix; you could store details about the file in a database somewhere when you process the upload; or you could just bill them enough to cover an average amount of usage and go after the occasional customer who abuses it by uploading terabytes. Various options are available. – ceejayoz Oct 02 '19 at 16:31
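For the ListObjects approach, here is a minimal sketch of what a Lambda (or any boto3 script) could do, assuming each user's "folder" is a top-level key prefix. The bucket and user names are hypothetical:

```python
import boto3

def prefix_size_bytes(bucket, prefix):
    """Sum the sizes of all current objects under a given key prefix."""
    s3 = boto3.client("s3")
    paginator = s3.get_paginator("list_objects_v2")
    total = 0
    for page in paginator.paginate(Bucket=bucket, Prefix=prefix):
        for obj in page.get("Contents", []):
            total += obj["Size"]
    return total

# Hypothetical bucket and per-user prefixes.
bucket = "my-shared-bucket"
for user in ["user1", "user2"]:
    size = prefix_size_bytes(bucket, f"{user}/")
    print(f"{user}: {size / 1024**3:.2f} GiB")
```

Multiplying each per-prefix total by your storage class's per-GB-month rate gives a rough cost estimate. Note this counts only current objects; noncurrent versions and incomplete multipart uploads would need separate handling.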
- Thanks @ceejayoz. What are the other options, if you have anything in mind? – serverstackqns Oct 02 '19 at 18:47
- Maybe not exactly a solution, but you can enable object-level CloudTrail logging on the S3 bucket and then build some reporting based on the logs. – Petr Chloupek Oct 19 '19 at 10:27
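To sketch what that reporting could look like: CloudTrail delivers S3 data events as gzipped JSON files to a logging bucket, and each `PutObject` record carries the object key (and, in `additionalEventData`, a `bytesTransferredIn` count). The log bucket name and account ID below are hypothetical:

```python
import gzip
import json
from collections import defaultdict

import boto3

s3 = boto3.client("s3")

# Hypothetical names: the bucket/prefix where CloudTrail delivers its logs.
LOG_BUCKET = "my-cloudtrail-logs"
LOG_PREFIX = "AWSLogs/123456789012/CloudTrail/"

def bytes_uploaded_per_user():
    """Tally bytes uploaded per top-level 'folder' from CloudTrail S3 data events."""
    totals = defaultdict(int)
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=LOG_BUCKET, Prefix=LOG_PREFIX):
        for obj in page.get("Contents", []):
            body = s3.get_object(Bucket=LOG_BUCKET, Key=obj["Key"])["Body"].read()
            records = json.loads(gzip.decompress(body)).get("Records", [])
            for rec in records:
                if rec.get("eventName") != "PutObject":
                    continue
                key = rec.get("requestParameters", {}).get("key", "")
                user = key.split("/", 1)[0]  # top-level prefix = user "folder"
                extra = rec.get("additionalEventData", {})
                totals[user] += int(extra.get("bytesTransferredIn", 0))
    return totals

if __name__ == "__main__":
    for user, total in sorted(bytes_uploaded_per_user().items()):
        print(f"{user}: {total} bytes uploaded")
```

This tallies uploads only; deletes and overwrites aren't subtracted, so treat it as an approximation unless you also process `DeleteObject` events.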