I am trying to migrate a Python 2.7 App Engine project from NDB to Cloud NDB as part of migrating to Python 3.
After following the Cloud NDB migration instructions, running dev_appserver as before now hits the live Cloud Datastore instead of my local datastore. I have seen Google's instructions for making sure local data is used, but I don't understand how to apply them in practice.
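For reference, my understanding of the mechanism is that the Cloud NDB client only talks to the emulator when DATASTORE_EMULATOR_HOST is set in the process environment, and otherwise targets production. A toy sketch of that decision (datastore_target is my own name, not a library function):

```python
import os

def datastore_target(environ):
    """Return which backend a Cloud NDB client would talk to,
    based on whether DATASTORE_EMULATOR_HOST is set."""
    host = environ.get("DATASTORE_EMULATOR_HOST")
    if host:
        return "emulator at " + host
    # No emulator host configured: the client goes to production.
    return "production Cloud Datastore"

print(datastore_target(dict(os.environ)))
```

If that understanding is right, my problem reduces to getting dev_appserver to export that variable into my app's environment correctly.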
Assuming I have to use the Datastore emulator to prevent this, I run dev_appserver with the flag --support_datastore_emulator=true. This successfully converts my local datastore data to the SQLite format, but queries still go to the cloud.
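This is the exact invocation I am using (the app.yaml path reflects my project layout; yours may differ):

```sh
dev_appserver.py --support_datastore_emulator=true app.yaml
```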
I then set the required environment variables in app.yaml: DATASTORE_DATASET, DATASTORE_PROJECT_ID, DATASTORE_EMULATOR_HOST, DATASTORE_EMULATOR_HOST_PATH, and DATASTORE_HOST (the values match the output of gcloud beta emulators datastore env-init). On startup, dev_appserver complains that DATASTORE_APP_ID is not set, so I set that as well.
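In case the values matter, this is the shape of my env_variables block (project ID, host, and port are placeholders; the real values come from env-init, and the dev~ prefix on DATASTORE_APP_ID is my guess based on how dev_appserver names local apps):

```yaml
env_variables:
  DATASTORE_DATASET: my-project-id
  DATASTORE_PROJECT_ID: my-project-id
  DATASTORE_EMULATOR_HOST: localhost:8081
  DATASTORE_EMULATOR_HOST_PATH: localhost:8081/datastore
  DATASTORE_HOST: http://localhost:8081
  DATASTORE_APP_ID: dev~my-project-id
```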
Everything now launches with a confirmation message that the emulator is being used, but any datastore access fails with "BadArgumentError: Could not import googledatastore. This library must be installed with version >= 6.0.0 to use the Cloud Datastore API." Installing that package leads to a never-ending series of additional installation requirements and module conflicts; it's a mess, and none of this is mentioned in the documentation.
How can I get dev_appserver (with or without the Datastore emulator) to access local data rather than the cloud? I have now spent days trying to make this work.