
Running

    model = load_model('path_to_model', custom_objects={'my_cutome_activation': my_cutome_activation})

causes the following crash:

    layer = deserialize_layer(layer_data, custom_objects=custom_objects)
  File "C:\Users\user\.conda\envs\dl-gpu\lib\site-packages\tensorflow\python\keras\layers\serialization.py", line 177, in deserialize
    printable_module_name='layer')
  File "C:\Users\user\.conda\envs\dl-gpu\lib\site-packages\tensorflow\python\keras\utils\generic_utils.py", line 358, in deserialize_keras_object
    list(custom_objects.items())))
  File "C:\Users\user\.conda\envs\dl-gpu\lib\site-packages\tensorflow\python\keras\layers\core.py", line 1020, in from_config
    config, custom_objects, 'function', 'module', 'function_type')
  File "C:\Users\user\.conda\envs\dl-gpu\lib\site-packages\tensorflow\python\keras\layers\core.py", line 1072, in _parse_function_from_config
    config[func_attr_name], globs=globs)
  File "C:\Users\user\.conda\envs\dl-gpu\lib\site-packages\tensorflow\python\keras\utils\generic_utils.py", line 457, in func_load
    code = marshal.loads(raw_code)
ValueError: bad marshal data (unknown type code)

I tried upgrading and downgrading TensorFlow, to no avail.
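For context (my own addition, not from the traceback itself): `func_load` in `generic_utils.py` uses Python's standard-library `marshal` module to deserialize custom functions stored in the HDF5 file, and marshal bytecode is tied to the interpreter that wrote it. Bytes written by one Python version or environment can fail to load in another with exactly this error. A minimal stdlib-only sketch:

```python
import marshal

def my_activation(x):
    # stand-in for the custom activation saved with the model
    return x

# Round-trips fine when dumped and loaded by the same interpreter.
raw = marshal.dumps(my_activation.__code__)
assert marshal.loads(raw).co_name == "my_activation"

# Bytes from a mismatched environment (simulated here with junk bytes)
# fail the same way as the Keras traceback above.
try:
    marshal.loads(b"\x00not-marshal-data")
except ValueError as exc:
    print(exc)  # bad marshal data (unknown type code)
```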

Saving the model in the SavedModel format instead resulted in a different crash:

 File "C:\Users\user\.conda\envs\dl-gpu\lib\site-packages\tensorflow\python\keras\saving\saved_model\load.py", line 385, in _load_layer
    metadata = json_utils.decode(metadata)
  File "C:\Users\user\.conda\envs\dl-gpu\lib\site-packages\tensorflow\python\keras\saving\saved_model\json_utils.py", line 69, in decode
    return json.loads(json_string, object_hook=_decode_helper)
  File "C:\Users\user\.conda\envs\dl-gpu\lib\json\__init__.py", line 361, in loads
    return cls(**kw).decode(s)
  File "C:\Users\user\.conda\envs\dl-gpu\lib\json\decoder.py", line 337, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "C:\Users\user\.conda\envs\dl-gpu\lib\json\decoder.py", line 355, in raw_decode
    raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)
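For context (again my own note): this `JSONDecodeError` is what `json.loads` raises when handed an empty or otherwise non-JSON string, i.e. the layer metadata read back from the SavedModel was not valid JSON at all. A quick stdlib illustration:

```python
import json

try:
    json.loads("")  # an empty/unreadable metadata payload
except json.JSONDecodeError as exc:
    print(exc)  # Expecting value: line 1 column 1 (char 0)
```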

Everything works when I build the model in code instead of loading it from the HDF5 file.

As shown in the code above, the model uses a custom object (a custom activation).

YScharf

1 Answer


After scanning through the GitHub issue posted by @ferdy in the comments (thanks, @ferdy), the cause seems to be the following: running TensorFlow under `streamlit run` invokes a different serialization library than running TensorFlow in a regular Python interpreter. The solution is to write a small script that saves the model from inside a Streamlit app.

    import streamlit

    model = ...  # build the model as usual

    model.save('streamlit_saved_model')

Run this Streamlit app once to save the model; this invokes the matching serialization library:

    streamlit run model_saver.py

Then load this model in the main Streamlit app for inference:

    model_streamlit = load_model('streamlit_saved_model', custom_objects={'my_cutome_activation': my_cutome_activation})

    model_streamlit.predict(...)

    streamlit run model_predictor.py
