I have an S3 bucket that holds historical data for customers, organized into customer-specific directories. Some of this data lives in a subfolder under each customer directory and should be deleted after 30 days. I'm trying to use S3 lifecycle policies to automatically expire and delete these files, but my policy doesn't seem to find them. The general structure of the data I'm trying to find and delete is: /{customer-specific-ID}/logs. I don't want to go through and configure this manually for each directory, both because there are a lot of them and because new customer directories are created automatically when we add new customers, so manual configuration would quickly fall out of date.
I know it's possible to do this with files under a top-level prefix (http://docs.aws.amazon.com/AmazonS3/latest/UG/LifecycleConfiguration.html), but I need to do it for subfolders. Just setting a rule on the logs prefix at the bucket level doesn't work, and so far nothing I've found while Googling shows how to target a folder nested inside every other folder in the bucket.
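For reference, here's a minimal sketch of the kind of rule I've been trying (the bucket name and rule ID are placeholders), applied with `aws s3api put-bucket-lifecycle-configuration --bucket my-bucket --lifecycle-configuration file://lifecycle.json`:

```json
{
  "Rules": [
    {
      "ID": "expire-customer-logs",
      "Status": "Enabled",
      "Filter": { "Prefix": "logs/" },
      "Expiration": { "Days": 30 }
    }
  ]
}
```

Since the prefix filter matches from the beginning of the object key, this only catches a top-level logs/ folder, not {customer-specific-ID}/logs/.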
Is there a way to set a bucket-level lifecycle policy that automatically deletes objects older than 30 days under {arbitrary-folder}/logs, for every top-level folder in the bucket?