
So, the Cloud Datastore Admin tools are being deprecated in favor of the Managed Export/Import Service. I have about 5 GB of data in my remote Datastore, and I'd like to import it locally for development in a way that is relatively quick for an initial dev setup process.

I've run a backup using the Managed Export/Import Service and downloaded it locally, but I haven't been able to successfully import the data. I've tried everything I can find. I'd love to understand the best way to import this data into the new Cloud Datastore Emulator that runs locally. I'm amazed that the documentation on this is so poor.
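For reference, newer builds of the Datastore emulator expose an import endpoint that accepts the `.overall_export_metadata` file produced by a managed export. Here is a minimal sketch of building that request; the host, port, project id, and file path are assumptions, and the metadata file must be readable from the machine the emulator runs on:

```python
import json

def build_emulator_import_request(host, project_id, metadata_path):
    # The emulator's import endpoint lives at /v1/projects/PROJECT_ID:import;
    # the JSON body points at the export's .overall_export_metadata file.
    url = "{}/v1/projects/{}:import".format(host, project_id)
    body = json.dumps({"input_url": metadata_path})
    return url, body

url, body = build_emulator_import_request(
    "http://localhost:8081",  # default emulator port (assumption)
    "app-name",               # project id from the question
    "/local/backups/backup.overall_export_metadata",  # assumed local path
)
```

You would then POST it, e.g. `requests.post(url, data=body, headers={"Content-Type": "application/json"})`, while the emulator is running.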

I'm currently using the remote_api to connect to the remote data, which is fine for some things, but impossible for others. That also uses tons of other remote features like the remote task queue, which I DEFINITELY don't want since I want my local task queue to run while testing.

I've also successfully used the old appcfg.py method, but it's a slow and inefficient way for new developers to spin up the dev environment.

appcfg.py download_data --application=s~app-name --url=http://app-name.appspot.com/_ah/remote_api/ --filename=backup.csv
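For context, the reverse direction of that command loads the downloaded CSV into a locally running dev_appserver via its remote_api endpoint. A sketch, printed rather than executed so it can be inspected first (port 8080 and the filename are assumptions):

```shell
# Assumes dev_appserver is already running locally with remote_api enabled.
UPLOAD_CMD="appcfg.py upload_data --url=http://localhost:8080/_ah/remote_api/ --filename=backup.csv"
echo "$UPLOAD_CMD"
```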

Any suggestions?

normmcgarry

1 Answer


Here is a script I wrote last year for my own use, so it's a bit primitive:

https://github.com/GAEfan/app_engine_backup_loader

It may need to be updated. Read through the README and follow the instructions. Let me know if you run into any issues.

GAEfan