2

My Keras model is saved in Google Cloud Storage with model.save(model_name).

I cannot load the model on pydatalab. When I save the model on my local machine, I can just open it with load_model(filepath). I also did import keras.backend as K, based on NameError when opening Keras model that uses Tensorflow Backend.

I have tried the following:

  1. model = load_model(tf.gfile.Open(model_file))

     Error: TypeError: expected str, bytes or os.PathLike object, not GFile

  2. load_model('gs://mybucket/model.h5')

     Error: IOError: Unable to open file (unable to open file: name = 'gs://mybucket/model.h5', errno = 2, error message = 'No such file or directory', flags = 0, o_flags = 0)

  3. with file_io.FileIO(model_file, 'r') as f:
         modl = load_model(f)

     Error: TypeError: expected str, bytes or os.PathLike object, not FileIO

Salma R
  • 194
  • 1
  • 12

4 Answers

8

Load the file from GCS:

from tensorflow.python.lib.io import file_io
model_file = file_io.FileIO('gs://mybucket/model.h5', mode='rb')

Save a temporary copy of the model locally:

temp_model_location = './temp_model.h5'
temp_model_file = open(temp_model_location, 'wb')
temp_model_file.write(model_file.read())
temp_model_file.close()
model_file.close()

Load the model saved locally:

model = load_model(temp_model_location)
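The three steps above can be folded into one helper. This is a sketch, not part of the original answer: `load_fn` stands in for `keras.models.load_model`, and `stream` would be the `file_io.FileIO` handle opened above.

```python
import shutil
import tempfile

def load_model_from_stream(stream, load_fn):
    """Copy a (possibly remote) file-like object to a local temp file,
    then hand the local path to load_fn (e.g. keras.models.load_model)."""
    with tempfile.NamedTemporaryFile(suffix='.h5', delete=False) as tmp:
        shutil.copyfileobj(stream, tmp)  # stream the bytes instead of read() all at once
        temp_path = tmp.name
    return load_fn(temp_path)

# model = load_model_from_stream(file_io.FileIO('gs://mybucket/model.h5', mode='rb'),
#                                load_model)
```

`shutil.copyfileobj` copies in chunks, so large models do not have to fit in memory twice.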
  • That's actually a better solution than downloading the file to a local path I guess! – Guilherme Caminha May 03 '18 at 01:09
  • Possible to load entire folder directly, instead of single files, from the bucket? My model takes in a folder (with several files inside) and makes prediction. Everything should be done in memory so can't download to local machine. – Danny Mar 10 '23 at 05:53
4

I don't think Keras supports the TensorFlow file system, which in turn knows how to read from GCS.

You could try downloading from GCS to a local path, and then reading from that to load the model.
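One way to sketch that download step without any TensorFlow dependency is to shell out to gsutil (this assumes the Cloud SDK is installed and authenticated on the machine; the helper names are illustrative, not from the answer):

```python
import os
import subprocess

def local_path_for(gcs_uri, local_dir='/tmp'):
    """Local destination for a GCS object, keeping its base filename."""
    return os.path.join(local_dir, os.path.basename(gcs_uri))

def fetch_gcs_file(gcs_uri, local_dir='/tmp'):
    """Copy one object from GCS to local disk and return the local path."""
    local_path = local_path_for(gcs_uri, local_dir)
    subprocess.check_call(['gsutil', 'cp', gcs_uri, local_path])
    return local_path

# model = load_model(fetch_gcs_file('gs://mybucket/model.h5'))
```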

Nikhil Kothari
  • 5,215
  • 2
  • 22
  • 28
  • Thanks for your answer. That is what I'm currently doing. My workflow is testing code in datalab and training models in google ml engine. It will slow down my process to download models and test data locally. I'm looking for a better way! – Salma R Feb 01 '18 at 22:30
  • Sorry, by "local" I meant relative to where you are loading your model. Not suggesting to download to your physical local desktop/laptop. So instead of passing in a gs:// path to load_model, you would download (possibly using gsutil) the binary to a path on the "local" disk, and then using a normal file path in the call to load_model. Hope that helps. – Nikhil Kothari Feb 02 '18 at 04:29
  • I will try that! Thanks! – Salma R Feb 09 '18 at 00:14
0

The following function works for retraining an already trained Keras model (with new data) on the Google Cloud ML Engine platform (thanks to Tíarnán McGrath).

def load_models(model_file):
    model = conv2d_model()  # the architecture of my model, not compiled yet
    file_stream = file_io.FileIO(model_file, mode='rb')  # 'rb': the HDF5 file is binary
    temp_model_location = './temp_model.h5'
    temp_model_file = open(temp_model_location, 'wb')
    temp_model_file.write(file_stream.read())
    temp_model_file.close()
    file_stream.close()
    model.load_weights(temp_model_location)

    return model

For some reason, load_model from keras.models no longer works for me, so I have to rebuild the model each time.

Stephen Rauch
  • 47,830
  • 31
  • 106
  • 135
Salma R
  • 194
  • 1
  • 12
0

OS-level commands can also be used, in case someone is using Colab.

To mount your Google Drive, use:

from google.colab import drive
drive.mount('/content/drive', force_remount=True)

Code to authenticate and point gcloud at your GCS project:

from google.colab import auth
auth.authenticate_user()
project_id = 'thirumalai_bucket'  #your project id here
!gcloud config set project {project_id}
!gsutil ls

Then copy the model out of the bucket with !gsutil -m cp; in your case:

!gsutil -m cp gs://mybucket/model.h5  /content/drive/My\ Drive/models/ 

Now the file model.h5 is available in Drive under /content/drive/My Drive/models/. Change into that directory with the %cd magic (!cd runs in a subshell, so it does not persist), then load:

%cd /content/drive/My\ Drive/models/

load_model('model.h5')
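If changing directory is skipped, load_model can be given the full Drive path instead; a small helper (paths illustrative, matching the mount above) avoids the cd-does-not-persist pitfall entirely:

```python
import os

DRIVE_MODELS_DIR = '/content/drive/My Drive/models'  # where gsutil copied the file above

def drive_model_path(filename, base_dir=DRIVE_MODELS_DIR):
    """Absolute Drive path for a model file, usable directly by load_model()."""
    return os.path.join(base_dir, filename)

# model = load_model(drive_model_path('model.h5'))
```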

Hope this helps!