I have a trainer application for Cloud ML Engine in which I need to use Cloud KMS to decrypt files stored in Google Cloud Storage.
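For reference, the decrypt call itself isn't the problem once credentials exist: the Cloud KMS REST API expects the ciphertext base64-encoded in the request body and returns the plaintext base64-encoded. A minimal sketch of just that encoding step (helper names are my own):

```python
import base64
import json


def build_decrypt_body(ciphertext: bytes) -> str:
    """Build the JSON body for a KMS cryptoKeys:decrypt request."""
    # Cloud KMS expects the ciphertext base64-encoded in the "ciphertext" field.
    return json.dumps({"ciphertext": base64.b64encode(ciphertext).decode("utf-8")})


def extract_plaintext(response_body: str) -> bytes:
    """Pull the decrypted bytes out of a KMS decrypt response."""
    # The API returns the plaintext base64-encoded in the "plaintext" field.
    return base64.b64decode(json.loads(response_body)["plaintext"])
```

Sending that body to `https://cloudkms.googleapis.com/v1/{keyPath}:decrypt` with a bearer token is the part that fails for me, since I can't get credentials in the first place.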
I'm able to download the files from GCS using tensorflow.python.lib.io.file_io without providing any credentials, since the service account used to launch the training job has access to the GCS bucket holding the encrypted file.
However, I'm unable to get the application default credentials:
credentials = GoogleCredentials.get_application_default()
The above call returns an essentially empty credentials object, with most fields set to null:
{
  "scopes": [],
  "id_token": null,
  "kwargs": {},
  "token_response": null,
  "client_id": null,
  "scope": "",
  "token_expiry": null,
  "_class": "AppAssertionCredentials",
  "refresh_token": null,
  "_module": "oauth2client.contrib.gce",
  "_service_account_email": null,
  "access_token": null,
  "invalid": false,
  "token_info_uri": null,
  "assertion_type": null,
  "token_uri": "https://www.googleapis.com/oauth2/v4/token",
  "client_secret": null,
  "revoke_uri": "https://accounts.google.com/o/oauth2/revoke",
  "user_agent": null
}
I was expecting that the instance Cloud ML Engine provisions for the training job would have the service account credentials automatically available, but that doesn't seem to be the case.
Any tips on how to get the access credentials? (Apart from bundling the credentials in the trainer package :) )
Any help would be appreciated.