
My team has a requirement that we be able to retrieve a backup of our database (hosted on Google Cloud SQL) and restore that database to a locally hosted instance of MySQL.

I know that Google Cloud SQL has the ability to schedule backups, but these don't appear to be available to download anywhere.

I also know that we are able to "export" our database to Google Cloud Storage, but we'd like to be able to schedule the "export".

The end goal here is to execute the following steps in some sort of admin script:

  1. Automatically back up our database that is hosted on Google Cloud SQL.
  2. Download the backup to a local (not cloud) server.
  3. Restore the backup to a local instance of MySQL.

Any ideas?

Tombatron

5 Answers


The gcloud SDK now provides import/export commands:

gcloud sql export sql <DATABASE_INSTANCE> \
    gs://<CLOUD_STORAGE_BUCKET>/cloudsql/export.sql.gz \
    --database <DATABASE_NAME>

This export can be downloaded using gsutil and restored to a local MySQL instance with the mysql client (note that mysqlimport only loads tab-delimited data files, so it won't read a SQL dump).
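Putting the pieces together, a minimal sketch of the full round trip (the instance, bucket, and database names are placeholders, and the local restore assumes the target database already exists and that you have a root login on the local MySQL server):

    #!/usr/bin/env bash
    # Sketch: export a Cloud SQL database, download the dump, restore it locally.
    # INSTANCE, BUCKET, and DBNAME are placeholders -- substitute your own values.
    set -euo pipefail

    INSTANCE=my-cloudsql-instance
    BUCKET=my-backup-bucket
    DBNAME=my_database

    # 1. Export the database to Cloud Storage (gcloud waits for the
    #    export operation to finish unless --async is passed).
    gcloud sql export sql "$INSTANCE" \
        "gs://$BUCKET/cloudsql/export.sql.gz" \
        --database "$DBNAME"

    # 2. Download the dump to the local server.
    gsutil cp "gs://$BUCKET/cloudsql/export.sql.gz" /tmp/export.sql.gz

    # 3. Restore into the local MySQL instance with the mysql client
    #    (-p prompts for the local MySQL password; DBNAME must already exist).
    gunzip -c /tmp/export.sql.gz | mysql -u root -p "$DBNAME"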

jmwicks
  • You also need to grant the service account for the SQL instance (`gcloud sql instances describe <INSTANCE> | grep -i account`) the `Storage Object Creator` role so it can create the dump file in the bucket. – quickshiftin May 11 '18 at 13:24
  • 3
    This command is deprecated and will be removed in version 205.0.0. Use [gcloud sql export sql](https://cloud.google.com/sdk/gcloud/reference/sql/export/sql) instead – riotera Mar 26 '19 at 20:34

That's the problem I encountered too, and my solution was:

  1. Go to IAM Service Accounts Management
  2. Create a new service account (I called it sql-backuper) and download its access key in JSON format
  3. Grant it the Viewer and Storage Object Creator roles on the main IAM page (currently GCloud doesn't have a separate read-only role for SQL)
  4. Set it up on the machine that will do the backups: gcloud auth activate-service-account sql-backuper@project-name-123.iam.gserviceaccount.com --key-file /home/backuper/gcloud-service-account.json (gcloud auth documentation)
  5. Create a new bucket in the GCloud Storage Browser
  6. Now on your backup machine you can run: gcloud sql instances export [sql-instance-name] gs://[bucket-name]/[file-name].gz --database [your-db-name] (gcloud sql documentation) and gsutil cp gs://[bucket-name]/[file-name].gz [local-file-name].gz (gsutil cp documentation); the sketch after this list strings these steps together
  7. You've got a local DB copy which you can now use as you want
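A sketch of steps 4-6 as a single cron-friendly script, reusing the placeholder names from the steps above (the export uses the newer `gcloud sql export sql` form, since `gcloud sql instances export` was later deprecated, as noted in the comments on the previous answer):

    #!/usr/bin/env bash
    # Sketch of steps 4-6: authenticate as the backup service account,
    # export the database to Cloud Storage, and download the dump.
    # All names and paths below are placeholders.
    set -euo pipefail

    gcloud auth activate-service-account \
        sql-backuper@project-name-123.iam.gserviceaccount.com \
        --key-file /home/backuper/gcloud-service-account.json

    STAMP=$(date +%F)

    # Newer replacement for `gcloud sql instances export`.
    gcloud sql export sql sql-instance-name \
        "gs://bucket-name/backup-$STAMP.sql.gz" \
        --database your-db-name

    gsutil cp "gs://bucket-name/backup-$STAMP.sql.gz" "/backups/backup-$STAMP.sql.gz"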
Aldekein

Note that you can now trigger an Export operation using the Cloud SQL REST API.

So your admin script can trigger the export and then download the backup from Cloud Storage (you'll need to wait until the export operation finishes, though).
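A rough sketch of what that call could look like with curl, using the v1beta4 export method (PROJECT, INSTANCE, BUCKET, and DBNAME are placeholders; the response contains an operation that you can poll via the API's operations.get method until it reports DONE, and only then download the file):

    # Sketch: trigger an export through the Cloud SQL Admin API (v1beta4).
    # PROJECT, INSTANCE, BUCKET, and DBNAME are placeholders.
    ACCESS_TOKEN=$(gcloud auth print-access-token)

    curl -X POST \
        -H "Authorization: Bearer $ACCESS_TOKEN" \
        -H "Content-Type: application/json" \
        -d '{
              "exportContext": {
                "fileType": "SQL",
                "uri": "gs://BUCKET/export.sql.gz",
                "databases": ["DBNAME"]
              }
            }' \
        "https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT/instances/INSTANCE/export"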


Sorry, but Cloud SQL does not have this functionality currently. We'd like to make this easier in the future. In the meantime, you could use Selenium (or some other UI scripting framework) in combination with a cron job.
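For the cron half of that suggestion, a sketch of a crontab entry that runs a hypothetical backup script nightly (the script path, schedule, and log file are all placeholders):

    # Hypothetical: run the backup script every night at 02:00.
    0 2 * * * /home/backuper/cloudsql-backup.sh >> /var/log/cloudsql-backup.log 2>&1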

Ken Ashcraft
  • Went ahead and marked this as the answer because it is. But this seems like a pretty big hole in functionality. – Tombatron Jan 03 '13 at 03:12

If you want to download a backup (manual or automated), you can launch another Cloud SQL instance and then:

  1. Click on the backup options (restore) on the backed-up instance
  2. Choose to restore onto the previously launched instance
  3. Export the data from your newly restored Cloud SQL instance
  4. Grab the .sql or .csv file from Cloud Storage (the same flow can be scripted, as sketched below)
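A sketch of that flow with gcloud instead of the console, assuming the second instance already exists (SOURCE, TARGET, BUCKET, DBNAME, and BACKUP_ID are placeholders; `gcloud sql backups list` shows the real backup IDs):

    # List available backups on the source instance to find a backup ID.
    gcloud sql backups list --instance SOURCE

    # Restore the chosen backup onto the second, already-created instance.
    gcloud sql backups restore BACKUP_ID \
        --restore-instance TARGET \
        --backup-instance SOURCE

    # Export from the restored copy, then download the dump.
    gcloud sql export sql TARGET gs://BUCKET/restored.sql.gz --database DBNAME
    gsutil cp gs://BUCKET/restored.sql.gz .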
Yanc0