
I have trained a model and I want to deploy it to Google Cloud Platform automatically after training finishes. I can upload it with ordinary client code, but I also found this alternative, using the TensorFlow API, which I find more elegant:

import os

import joblib
from tensorflow import gfile

def dump_object(object_to_dump, output_path):
    # Create the parent directory (local or gs://) if the target is missing,
    # then serialize the object through gfile so GCS paths work transparently.
    if not gfile.Exists(output_path):
        gfile.MakeDirs(os.path.dirname(output_path))
    with gfile.Open(output_path, 'w') as wf:
        joblib.dump(object_to_dump, wf)
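For reference, this is the local-filesystem equivalent I would expect the helper to behave like when no GCS path is involved (a sketch with a hypothetical `dump_object_local`; I use `pickle` from the standard library here in place of joblib, purely to keep the sketch dependency-free):

```python
import os
import pickle
import tempfile

def dump_object_local(object_to_dump, output_path):
    # Mirror of the gfile helper for local debugging: ensure the parent
    # directory exists, then serialize straight to the filesystem.
    os.makedirs(os.path.dirname(output_path), exist_ok=True)
    with open(output_path, "wb") as wf:
        pickle.dump(object_to_dump, wf)

# Round-trip check against a temporary directory.
with tempfile.TemporaryDirectory() as tmp:
    path = os.path.join(tmp, "models", "model_trainer_test.joblib")
    dump_object_local({"weights": [1, 2, 3]}, path)
    with open(path, "rb") as rf:
        restored = pickle.load(rf)
    print(restored)  # {'weights': [1, 2, 3]}
```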

If I run this on Google Cloud, it works well. Happy days. But sometimes I want to run it locally, say for debugging. Unfortunately, when I do, I get a permission denied error:

tensorflow.python.framework.errors_impl.PermissionDeniedError: Error executing an HTTP request: HTTP response code 401 with body '{
  "error": {
    "code": 401,
    "message": "Anonymous caller does not have storage.objects.get access to xxxxxxxx/model_trainer_test.joblib.",
    "errors": [
      {
        "message": "Anonymous caller does not have storage.objects.get access to xxxxxxxx/model_trainer_test.joblib.",
        "domain": "global",
        "reason": "required",
        "locationType": "header",
        "locatio'
     when reading metadata of gs://xxxxxxx/model_trainer_test.joblib

How would I set the permissions so that the above code works locally as well? NOTE: gsutil works, and I have correctly set `GOOGLE_APPLICATION_CREDENTIALS`.
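For completeness, this is roughly how I sanity-check the variable locally before suspecting bucket permissions (a sketch; `check_gcp_credentials` is a made-up helper, and the fields checked are the standard service-account key fields). The 401 "Anonymous caller" error usually means the TensorFlow GCS filesystem never picked up credentials at all, so confirming the environment variable points at a readable key file rules out one common cause:

```python
import json
import os

def check_gcp_credentials():
    # Verify that GOOGLE_APPLICATION_CREDENTIALS points at a readable
    # service-account key file with the expected fields; the TensorFlow
    # GCS filesystem uses this variable to authenticate its HTTP requests.
    path = os.environ.get("GOOGLE_APPLICATION_CREDENTIALS")
    if not path:
        return "GOOGLE_APPLICATION_CREDENTIALS is not set"
    if not os.path.isfile(path):
        return "credentials file not found: " + path
    with open(path) as f:
        key = json.load(f)
    missing = {"type", "client_email", "private_key"} - key.keys()
    if missing:
        return "key file missing fields: " + str(sorted(missing))
    return "OK: " + key["client_email"]

print(check_gcp_credentials())
```

If this prints anything other than `OK: ...`, the gfile call will go out unauthenticated, which matches the "Anonymous caller" message above.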

DarioB
  • Possible duplicate of [How to load a model saved in joblib file from Google Cloud Storage bucket](https://stackoverflow.com/questions/51921142/how-to-load-a-model-saved-in-joblib-file-from-google-cloud-storage-bucket) – Dustin Ingram Aug 13 '19 at 23:06

0 Answers