
I have an S3 bucket that holds historical data for customers (in customer-specific directories). Some of this data lives in a folder under each customer-specific directory and should be deleted after 30 days. I'm trying to use S3 lifecycle policies to automatically expire and delete these files, but my policy doesn't seem to find them. The general structure of the data I'm trying to find and delete is: /{customer-specific-ID}/logs. I don't want to go through and do this manually for each directory, both because there are a lot of them and because new customer directories are created automatically when we add new customers, so manual configuration quickly breaks down.

I know it's possible to do this with files in a top-level directory (http://docs.aws.amazon.com/AmazonS3/latest/UG/LifecycleConfiguration.html), but I need to do it for sub-folders. Just setting a rule on the logs prefix at the bucket level (roughly the rule sketched below) doesn't work, and so far nothing I've been able to find while Googling shows how to target a folder nested inside every other folder in the bucket.
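For reference, this is roughly the bucket-level rule I've been trying, written as a minimal boto3 sketch (the bucket name and rule ID are placeholders). As far as I can tell, a prefix like this only ever matches a top-level logs/ folder, not {customer-specific-ID}/logs:

```python
import boto3

s3 = boto3.client("s3")

# Expire objects under the "logs/" prefix after 30 days.
# This only matches a logs/ folder at the root of the bucket,
# not logs/ folders nested under each customer directory.
s3.put_bucket_lifecycle_configuration(
    Bucket="my-customer-data-bucket",  # placeholder bucket name
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "expire-logs-after-30-days",  # placeholder rule ID
                "Filter": {"Prefix": "logs/"},
                "Status": "Enabled",
                "Expiration": {"Days": 30},
            }
        ]
    },
)
```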

Is there a way to set a bucket-level lifecycle policy to automatically delete files older than 30 days for all files in {arbitrary-folder}/logs?

Eric Hydrick
  • Please edit your question to tell us what you've tried. ie your rules, with paths exactly as you tried them. I just set up a test but it'll take a day or so before anything happens. The docs don't have a leading slash before the folder name. – Tim Apr 18 '17 at 19:36
  •
    Possible duplicate of [Most efficient way to batch delete S3 Files](https://serverfault.com/questions/679989/most-efficient-way-to-batch-delete-s3-files) – dannyman Aug 09 '17 at 18:59

0 Answers