
I'd like to import SQL data into my Cloud SQL database (named "db2"), so I uploaded a SQL file to my bucket. I then went to my Cloud SQL instance and clicked "Connect with Cloud Shell". When trying to import via Cloud Shell I get:

info@cloudshell:~ (servers)$ gcloud sql import sql db2 gs://mybucket/db.sql
Data from [gs://mybucket/db.sql] will be imported to [db2].
Do you want to continue (Y/n)?  Y
ERROR: (gcloud.sql.import.sql) HTTPError 403: The service account does not have the required permissions for the bucket.

I have followed the solution given to a similar issue here: https://stackoverflow.com/a/53608811/2795648, but it does not work for me.

The service account of my Cloud SQL instance has ALL the permissions I can grant on the bucket AND on the file (not that the latter should be necessary).

Some extra info: here are the roles granted to the service account:

- members:
  - serviceAccount:p3998516xxxxx-p9q5dn@gcp-sa-cloud-sql.iam.gserviceaccount.com
  role: roles/storage.admin
- members:
  - serviceAccount:p3998516xxxxx-p9q5dn@gcp-sa-cloud-sql.iam.gserviceaccount.com
  role: roles/storage.objectAdmin
- members:
  - serviceAccount:p3998516xxxxx-p9q5dn@gcp-sa-cloud-sql.iam.gserviceaccount.com
  role: roles/storage.objectCreator
- members:
  - serviceAccount:p3998516xxxxx-p9q5dn@gcp-sa-cloud-sql.iam.gserviceaccount.com
  role: roles/storage.objectViewer
- members:
  - serviceAccount:p3998516xxxxx-p9q5dn@gcp-sa-cloud-sql.iam.gserviceaccount.com
  role: roles/storagetransfer.admin
- members:
  - serviceAccount:p3998516xxxxx-p9q5dn@gcp-sa-cloud-sql.iam.gserviceaccount.com
  role: roles/storagetransfer.user
- members:
  - serviceAccount:p3998516xxxxx-p9q5dn@gcp-sa-cloud-sql.iam.gserviceaccount.com
  role: roles/storagetransfer.viewer
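For reference, a binding list like the one above can be pulled for just this service account with a filter (a sketch; [PROJECT_ID] is a placeholder):

```shell
# List only the project-level roles bound to the Cloud SQL service account
gcloud projects get-iam-policy [PROJECT_ID] \
    --flatten="bindings[].members" \
    --filter="bindings.members:gcp-sa-cloud-sql.iam.gserviceaccount.com" \
    --format="table(bindings.role)"
```

Note that this shows project-level bindings only; bucket-level bindings are a separate policy.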
user2795648

1 Answer


You can find the official documentation about importing a SQL dump file; it covers this scenario.

Please pay attention to:

3. Describe the instance you are importing to:

gcloud sql instances list    
gcloud sql instances describe [INSTANCE_NAME]

4. Copy the serviceAccountEmailAddress field.

gcloud sql instances describe [INSTANCE_NAME] | grep serviceAccountEmailAddress

5. Use gsutil iam to grant the legacyBucketWriter and objectViewer Cloud IAM roles to that service account on the bucket:

gsutil iam ch serviceAccount:[SERVICE_ACCOUNT_EMAIL_FROM_STEP_4]:legacyBucketWriter,objectViewer gs://[BUCKET_NAME]
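To confirm the binding landed on the bucket itself (bucket-level IAM, not project-level), you can dump the bucket's policy (a sketch; [BUCKET_NAME] is a placeholder):

```shell
# Show the bucket's own IAM policy; the Cloud SQL service account should
# appear with roles/storage.legacyBucketWriter and roles/storage.objectViewer
gsutil iam get gs://[BUCKET_NAME]
```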

Then try the import command:

gcloud sql import sql [INSTANCE_NAME] gs://[BUCKET_NAME]/[IMPORT_FILE_NAME] \
                        --database=[DATABASE_NAME]
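Putting steps 3-5 together with the names from the question (a sketch; note the first argument to the import command is the instance name, and [DATABASE_NAME] stays a placeholder):

```shell
# Look up the instance's service-account email (instance "db2" from the question)
SA=$(gcloud sql instances describe db2 \
    --format='value(serviceAccountEmailAddress)')

# Grant the bucket-level roles to that exact account
gsutil iam ch "serviceAccount:${SA}:legacyBucketWriter,objectViewer" gs://mybucket

# Retry the import
gcloud sql import sql db2 gs://mybucket/db.sql --database=[DATABASE_NAME]
```

The key point is that the import uses the instance's own service account, so the grant must go to the exact email returned by the describe command, and it must be on the bucket itself.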
marian.vladoi
  • gcloud projects get-iam-policy project-id gives me: - members: - serviceAccount:p399851xxxxxx-p9qxxx@gcp-sa-cloud-sql.iam.gserviceaccount.com role: roles/storage.admin - members: - serviceAccount:p399851xxxxxx-p9qxxx@gcp-sa-cloud-sql.iam.gserviceaccount.com role: roles/storage.objectCreator - members: - serviceAccount:p399851xxxxxx-p9qxxx@gcp-sa-cloud-sql.iam.gserviceaccount.com role: roles/storage.objectViewer Is this not enough, then? Still getting the 403 error. – user2795648 Mar 05 '20 at 15:12
  • Your "gsutil iam ch" command gives the error: AccessDeniedException: 403 info@mydomain.be does not have storage.buckets.getIamPolicy access to mybucket. – user2795648 Mar 06 '20 at 13:50
  • Hi, can you check if granting the role of [Storage Admin](https://cloud.google.com/storage/docs/access-control/iam-roles#standard-roles) to info@mydomain.be solves this error? – juferafo Mar 11 '20 at 15:53
  • @marian.vladoi, why is the legacyBucketWriter necessary? Why does it need write access? – Peter Nov 30 '20 at 10:39