
What is the best way of sharing a bucket between a multi-region App Engine app's projects?

I am enhancing an App Engine project to be multi-regional, which means having several projects, each set to one region. The existing app has per-user buckets for uploading files, which the back-end creates as needed with the API call storage.createBucket(bucketName), then uses resumable uploads / signed URLs etc. for the uploads from the user. The permissions are all left as defaults, so the "App Engine Flexible service account" has full access, and I've also granted read access to another service account I made for a Cloud Function which processes the uploaded files.
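
For reference, the existing flow is roughly this (a simplified sketch assuming the Node.js @google-cloud/storage client, since that's where storage.createBucket comes from; the bucket naming scheme and expiry are just illustrative):

    const { Storage } = require('@google-cloud/storage');
    const storage = new Storage();

    // Create the per-user bucket with default permissions, then hand the client
    // a V4 signed URL so it can upload the file directly to Cloud Storage.
    async function prepareUserUpload(userId, fileName) {
      const bucketName = `user-uploads-${userId}`;      // illustrative naming scheme
      const [bucket] = await storage.createBucket(bucketName);
      const [signedUrl] = await bucket.file(fileName).getSignedUrl({
        version: 'v4',
        action: 'write',
        expires: Date.now() + 15 * 60 * 1000,           // e.g. 15 minutes
      });
      return signedUrl;
    }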

Now I need multiple projects accessing the same user buckets (depending on how I balance the regions), so I need to grant the two service accounts of each additional region-project permission to create and access these buckets.

I know I could grant access bucket by bucket (like this answer: https://stackoverflow.com/a/20731155/209288), but with N users and R regions that would mean N x R permission grants, plus back-filling the new permissions each time a new region is added (which won't be often, but will happen). I want new user buckets to be usable from all regions immediately, and new region-apps to be able to access all previously existing user buckets.
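
To make the scale concrete: each time a new region-project came online, I would have to run a backfill like the rough sketch below over every existing user bucket (and the mirror-image loop over every region for each newly created bucket). The service-account names here are placeholders.

    const { Storage } = require('@google-cloud/storage');
    const storage = new Storage();

    // Hypothetical backfill when a new region-project is added: grant its two
    // service accounts object access on every existing user bucket
    // (N buckets x 2 members, repeated for each of the R regions).
    async function backfillNewRegion(regionProjectId, userBucketNames) {
      const members = [
        `serviceAccount:${regionProjectId}@appspot.gserviceaccount.com`,        // App Engine default SA
        `serviceAccount:uploads-fn@${regionProjectId}.iam.gserviceaccount.com`, // placeholder Cloud Function SA
      ];
      for (const name of userBucketNames) {
        const bucket = storage.bucket(name);
        const [policy] = await bucket.iam.getPolicy({ requestedPolicyVersion: 3 });
        policy.bindings.push({ role: 'roles/storage.objectAdmin', members });
        await bucket.iam.setPolicy(policy);
      }
    }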

So I was hoping there's some kind of group functionality I can use: grant the group access to each bucket on creation, then add each region-app's service accounts to that group, so that the whole access matrix is covered instantly by a single grant per bucket.

I found there is something for human users: https://cloud.google.com/storage/docs/collaboration#group but I'm not sure if this is appropriate for service accounts.
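
Concretely, what I'd like is something like this at bucket-creation time (a hypothetical sketch: the group address is made up, and it only helps if the region-apps' service accounts can actually be members of such a group):

    const { Storage } = require('@google-cloud/storage');
    const storage = new Storage();

    // Hypothetical: grant a single group on each new bucket; the region-apps'
    // service accounts would then be managed as members of that group.
    async function createUserBucket(bucketName) {
      const [bucket] = await storage.createBucket(bucketName);
      const [policy] = await bucket.iam.getPolicy({ requestedPolicyVersion: 3 });
      policy.bindings.push({
        role: 'roles/storage.objectAdmin',
        members: ['group:region-app-services@example.com'],   // made-up group address
      });
      await bucket.iam.setPolicy(policy);
      return bucket;
    }

Then adding a new region would just mean adding its two service accounts to that group, and existing buckets would pick it up automatically.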

I looked at IAM access conditions, but they seem to affect only the resources being accessed, not the "calling" users.

Another worry is that the existing buckets are all in one region/project (the region I was using when the app was single-region), and future buckets will be created in whichever region/project the user happens to be served from. If I shut down a region/project, I don't want to lose its buckets.

I wondered whether I should have a "host" project for these shared buckets, with perhaps a project-level setting in that project to unify access from the other regional projects. (This is similar to Guillome's idea from this related question of mine.)

1 Answer


I have tried one solution which seems fairly simple: using project-level permissions instead of bucket-level ones.

i.e. add the "App Engine default service account" from each regional project to the IAM policy of the project that holds the Cloud Storage buckets (either the original region's project or, if I go with the host-project idea, that dedicated project).

In my region-project build script I add

    gcloud projects add-iam-policy-binding "$GCS_IMPORT_PROJECT_ID" --member="serviceAccount:$PROJECT_ID@appspot.gserviceaccount.com" --role="roles/storage.admin"

Where GCS_IMPORT_PROJECT_ID is the original or host project, and PROJECT_ID is the new region project being created.

The drawback is that it allows the other project to access all buckets in that project, not just the user-upload buckets: infrastructure buckets like the Cloud Build artefacts and my app's default and config buckets are included too. This isn't a security issue, as it's all one app, but it could result in a mix-up down the line.
