
I have a trainer application for Cloud ML Engine in which I have to use KMS to decrypt my files in Google Cloud Storage. I'm able to download the files from GCS using tensorflow.python.lib.io.file_io without providing any credentials, because the service account used to launch the training job has access to the GCS bucket from which the encrypted file is downloaded. However, I'm unable to get the application default credentials.

    credentials = GoogleCredentials.get_application_default()

The above call returns a credentials object with most of its fields set to null:

    {
        "scopes": [],
        "id_token": null,
        "kwargs": {},
        "token_response": null,
        "client_id": null,
        "scope": "",
        "token_expiry": null,
        "_class": "AppAssertionCredentials",
        "refresh_token": null,
        "_module": "oauth2client.contrib.gce",
        "_service_account_email": null,
        "access_token": null,
        "invalid": false,
        "token_info_uri": null,
        "assertion_type": null,
        "token_uri": "https://www.googleapis.com/oauth2/v4/token",
        "client_secret": null,
        "revoke_uri": "https://accounts.google.com/o/oauth2/revoke",
        "user_agent": null
    }
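By contrast, the GCS download itself works with no explicit credentials at all. Roughly, it looks like this (the gs:// path would be whatever your bucket layout is; the one in the comment is a placeholder):

```python
# Sketch of the download that works on a Cloud ML Engine worker without
# explicit credentials; file_io picks up the instance's service account.
from tensorflow.python.lib.io import file_io

def read_encrypted_blob(path):
    # FileIO handles local paths and gs:// URLs transparently.
    with file_io.FileIO(path, mode='rb') as f:
        return f.read()

# e.g. encrypted = read_encrypted_blob('gs://my-bucket/secrets/model.enc')
```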

I was expecting that the instance provisioned for the training job by Cloud ML Engine would have my service account's credentials automatically available, but that doesn't seem to be the case.

Any tips on how to get the access credentials? (Apart from including the credentials in the trainer package :) )

Any help would be appreciated.

Fayaz Ahmed

1 Answer


I think the returned credentials object only looks empty, but if you call something like


    print(credentials.get_access_token())

it will actually generate a valid access token. You can also use it to authorize HTTP requests or API client libraries, as described in the Application Default Credentials docs. See also the source code for the list of available methods.

Alexey Surkov