
I am trying to create a Storage Transfer job via Terraform as follows:

resource "google_storage_transfer_job" "goout_storage_backup" {
  description = "my-transfer-job"
  project = "my-project"
  transfer_spec {
    object_conditions {
      max_time_elapsed_since_last_modification = "86400s"
    }
    transfer_options {
      delete_objects_from_source_after_transfer  = false
      delete_objects_unique_in_sink              = false
      overwrite_objects_already_existing_in_sink = true
    }
    gcs_data_source {
      bucket_name = "source"
    }
    gcs_data_sink {
      bucket_name = "target"
    }
  }
  schedule {
    schedule_start_date {
      year  = 1970
      month = 1
      day   = 1
    }
    start_time_of_day {
      hours   = 4
      minutes = 27
      seconds = 0
      nanos   = 42
    }
  }
}

My existing jobs are easily manageable with this configuration; however, Terraform refuses to create any new transfer jobs:

Error: googleapi: Error 400: Failed to obtain the location of the Google Cloud Storage (GCS) bucket source due to insufficient permissions. Please verify that the necessary permissions have been granted., failedPrecondition

However, my user is a project Owner, and I have also tried assigning the Storage Transfer Admin and Storage Admin roles, with no luck.

Vojtěch
  • Are you running this out of Cloud Build or out of a CLI? Are source and sink buckets in the same project? If they are in different projects are you permissioned on both projects properly? – rk92 Jun 23 '21 at 15:45
  • This happens both in a gcloud instance and on my localhost. The buckets are in the same project. – Vojtěch Jun 23 '21 at 19:23
  • Are you able to run a `gsutil cp` command between those buckets to transfer something like a test file? A command example after creating a `file1.txt` would be `gsutil cp gs:///file1.txt gs:///`. I also saw that the resource block can take a `project` input; if you haven't already declared the project where the buckets are, can you try to set `project = `? – rk92 Jun 23 '21 at 22:26
  • Yes of course. I can also create and destroy buckets with Terraform. – Vojtěch Jun 24 '21 at 05:14
  • Just wanted to confirm, so you do have a project defined in that resource block? I didn't see it in your code above. – rk92 Jun 24 '21 at 13:37
  • I added the project field to the code with the same result. – Vojtěch Jun 24 '21 at 14:02
  • Could you look into this link to see if it helps? https://stackoverflow.com/a/53022262/13291468 – rk92 Jun 24 '21 at 15:12
  • Interesting. There seems to be NO transfer service account, however the transfers run correctly every day. I went through the transfer job settings and there is no configuration of a related service account. I tried to google this and it seems it should not even work without its own service account. How is this possible? – Vojtěch Jun 25 '21 at 07:26
  • Okay, the service account is not listed in the service account list, however it is assigned to the buckets. Anyway, this helped: it seems that Terraform will not create the permissions for the service account, whereas creating the job manually in the console will. – Vojtěch Jun 25 '21 at 07:33
  • Okay, good to hear this was at least partially resolved. So did you fix this by manually adding the service accounts through the console, or did you just manually create a new project? For the project you referenced in your original post, was that provisioned with Terraform? Can you show that code and inputs? – rk92 Jun 25 '21 at 14:03
  • I just manually added the transfer service account to the source and target buckets, which fixed this. – Vojtěch Jun 26 '21 at 05:59

1 Answer


When creating a transfer job via Terraform, the source and target buckets must already have the Storage Transfer service account assigned to them. Granting it access to both buckets fixed the issue.
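
For completeness, the same grants can also be expressed in Terraform itself so that they exist before the job is created. The following is a minimal sketch, assuming the project and bucket names from the question and using the broad roles/storage.admin role for brevity (the resource names source_access and sink_access are placeholders); it relies on the google_storage_transfer_project_service_account data source, which exposes the per-project Storage Transfer service agent:

# Look up the project's Storage Transfer service agent
# (project-PROJECT_NUMBER@storage-transfer-service.iam.gserviceaccount.com).
data "google_storage_transfer_project_service_account" "default" {
  project = "my-project"
}

# Grant the service agent access to the source bucket...
resource "google_storage_bucket_iam_member" "source_access" {
  bucket = "source"
  role   = "roles/storage.admin"
  member = "serviceAccount:${data.google_storage_transfer_project_service_account.default.email}"
}

# ...and to the sink bucket.
resource "google_storage_bucket_iam_member" "sink_access" {
  bucket = "target"
  role   = "roles/storage.admin"
  member = "serviceAccount:${data.google_storage_transfer_project_service_account.default.email}"
}

resource "google_storage_transfer_job" "goout_storage_backup" {
  # ... job definition from the question ...

  # Ensure the IAM bindings are applied before the job tries to read the buckets.
  depends_on = [
    google_storage_bucket_iam_member.source_access,
    google_storage_bucket_iam_member.sink_access,
  ]
}

The explicit depends_on is what enforces the ordering: without it, Terraform may create the transfer job before the bucket bindings exist and hit the same 400 error.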

Vojtěch