Hello, I have a multilingual transformer model from TensorFlow Hub that I want to convert into an ONNX model: (MODEL)

I have tried tf2onnx.convert several times but wasn't successful.

Model Signature Def:

signature_def['__saved_model_init_op']:
  The given SavedModel SignatureDef contains the following input(s):
  The given SavedModel SignatureDef contains the following output(s):
    outputs['__saved_model_init_op'] tensor_info:
        dtype: DT_INVALID
        shape: unknown_rank
        name: NoOp
  Method name is: 

signature_def['serving_default']:
  The given SavedModel SignatureDef contains the following input(s):
    inputs['inputs'] tensor_info:
        dtype: DT_STRING
        shape: (-1)
        name: serving_default_inputs:0
  The given SavedModel SignatureDef contains the following output(s):
    outputs['outputs'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1, 512)
        name: StatefulPartitionedCall_1:0
  Method name is: tensorflow/serving/predict

Concrete Functions:
  Function Name: '__call__'
    Option #1
      Callable with:
        Argument #1
          inputs: TensorSpec(shape=<unknown>, dtype=tf.string, name='inputs')

  Function Name: 'predict'
    Option #1
      Callable with:
        Argument #1
          inputs
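
For reference, this is roughly how I call the SavedModel in TensorFlow, based on the serving_default signature above (the /content/TfModel path is the same one I pass to tf2onnx below):

    import tensorflow as tf

    # Load the SavedModel exported from TF Hub
    model = tf.saved_model.load("/content/TfModel")

    # serving_default takes a batch of strings and returns float embeddings of shape (-1, 512)
    infer = model.signatures["serving_default"]
    embeddings = infer(inputs=tf.constant(["Hello world", "Bonjour le monde"]))["outputs"]
    print(embeddings.shape)  # expecting (2, 512)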

What I tried:

!python -m tf2onnx.convert --input /content/TfModel/saved_model.pb --output /content/model.onnx --inputs serving_default_inputs:0 --outputs StatefulPartitionedCall_1:0

Error:

2023-03-28 06:04:31.012968: W tensorflow/compiler/tf2tensorrt/utils/py_utils.cc:38] TF-TRT Warning: Could not find TensorRT
/usr/lib/python3.9/runpy.py:127: RuntimeWarning: 'tf2onnx.convert' found in sys.modules after import of package 'tf2onnx', but prior to execution of 'tf2onnx.convert'; this may result in unpredictable behaviour
  warn(RuntimeWarning(msg))
2023-03-28 06:04:33.554830: E tensorflow/compiler/xla/stream_executor/cuda/cuda_driver.cc:266] failed call to cuInit: CUDA_ERROR_NO_DEVICE: no CUDA-capable device is detected
WARNING:tensorflow:From /usr/local/lib/python3.9/dist-packages/tf2onnx/tf_loader.py:302: convert_variables_to_constants (from tensorflow.python.framework.convert_to_constants) is deprecated and will be removed in a future version.
Instructions for updating:
This API was designed for TensorFlow v1. See https://www.tensorflow.org/guide/migrate for instructions on how to migrate your code to TensorFlow v2.
2023-03-28 06:04:36,607 - WARNING - From /usr/local/lib/python3.9/dist-packages/tf2onnx/tf_loader.py:302: convert_variables_to_constants (from tensorflow.python.framework.convert_to_constants) is deprecated and will be removed in a future version.
Instructions for updating:
This API was designed for TensorFlow v1. See https://www.tensorflow.org/guide/migrate for instructions on how to migrate your code to TensorFlow v2.
WARNING:tensorflow:From /usr/local/lib/python3.9/dist-packages/tensorflow/python/framework/convert_to_constants.py:952: extract_sub_graph (from tensorflow.python.framework.graph_util_impl) is deprecated and will be removed in a future version.
Instructions for updating:
This API was designed for TensorFlow v1. See https://www.tensorflow.org/guide/migrate for instructions on how to migrate your code to TensorFlow v2.
2023-03-28 06:04:36,607 - WARNING - From /usr/local/lib/python3.9/dist-packages/tensorflow/python/framework/convert_to_constants.py:952: extract_sub_graph (from tensorflow.python.framework.graph_util_impl) is deprecated and will be removed in a future version.
Instructions for updating:
This API was designed for TensorFlow v1. See https://www.tensorflow.org/guide/migrate for instructions on how to migrate your code to TensorFlow v2.
Traceback (most recent call last):
  File "/usr/local/lib/python3.9/dist-packages/tensorflow/python/client/session.py", line 1378, in _do_call
    return fn(*args)
  File "/usr/local/lib/python3.9/dist-packages/tensorflow/python/client/session.py", line 1361, in _run_fn
    return self._call_tf_sessionrun(options, feed_dict, fetch_list,
  File "/usr/local/lib/python3.9/dist-packages/tensorflow/python/client/session.py", line 1454, in _call_tf_sessionrun
    return tf_session.TF_SessionRun_wrapper(self._session, options, feed_dict,
tensorflow.python.framework.errors_impl.FailedPreconditionError: Could not find variable Embeddings/sharded_20. This could mean that the variable has been deleted. In TF1, it can also mean the variable is uninitialized. Debug info: container=localhost, status error message=Container localhost does not exist. (Could not find resource: localhost/Embeddings/sharded_20)
     [[{{node Embeddings/sharded_20/Read/ReadVariableOp}}]]

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3.9/runpy.py", line 197, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/usr/lib/python3.9/runpy.py", line 87, in _run_code
    exec(code, run_globals)
  File "/usr/local/lib/python3.9/dist-packages/tf2onnx/convert.py", line 706, in <module>
    main()
  File "/usr/local/lib/python3.9/dist-packages/tf2onnx/convert.py", line 232, in main
    graph_def, inputs, outputs = tf_loader.from_graphdef(args.graphdef, args.inputs, args.outputs)
  File "/usr/local/lib/python3.9/dist-packages/tf2onnx/tf_loader.py", line 358, in from_graphdef
    frozen_graph = freeze_session(sess, input_names=input_names, output_names=output_names)
  File "/usr/local/lib/python3.9/dist-packages/tf2onnx/tf_loader.py", line 302, in freeze_session
    graph_def = convert_variables_to_constants(sess, graph_def, output_node_names)
  File "/usr/local/lib/python3.9/dist-packages/tensorflow/python/util/deprecation.py", line 371, in new_func
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.9/dist-packages/tensorflow/python/framework/convert_to_constants.py", line 1336, in convert_variables_to_constants
    ret = convert_variables_to_constants_from_session_graph(
  File "/usr/local/lib/python3.9/dist-packages/tensorflow/python/framework/convert_to_constants.py", line 1292, in convert_variables_to_constants_from_session_graph
    converter_data=_SessionConverterData(
  File "/usr/local/lib/python3.9/dist-packages/tensorflow/python/framework/convert_to_constants.py", line 971, in __init__
    converted_tensors = session.run(tensor_names_to_convert)
  File "/usr/local/lib/python3.9/dist-packages/tensorflow/python/client/session.py", line 968, in run
    result = self._run(None, fetches, feed_dict, options_ptr,
  File "/usr/local/lib/python3.9/dist-packages/tensorflow/python/client/session.py", line 1191, in _run
    results = self._do_run(handle, final_targets, final_fetches,
  File "/usr/local/lib/python3.9/dist-packages/tensorflow/python/client/session.py", line 1371, in _do_run
    return self._do_call(_run_fn, feeds, fetches, targets, options,
  File "/usr/local/lib/python3.9/dist-packages/tensorflow/python/client/session.py", line 1397, in _do_call
    raise type(e)(node_def, op, message)  # pylint: disable=no-value-for-parameter
tensorflow.python.framework.errors_impl.FailedPreconditionError: Graph execution error:

Detected at node 'Embeddings/sharded_20/Read/ReadVariableOp' defined at (most recent call last):
Node: 'Embeddings/sharded_20/Read/ReadVariableOp'
Could not find variable Embeddings/sharded_20. This could mean that the variable has been deleted. In TF1, it can also mean the variable is uninitialized. Debug info: container=localhost, status error message=Container localhost does not exist. (Could not find resource: localhost/Embeddings/sharded_20)
     [[{{node Embeddings/sharded_20/Read/ReadVariableOp}}]]

Original stack trace for 'Embeddings/sharded_20/Read/ReadVariableOp':
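
My guess is that --input expects an already-frozen GraphDef, whereas saved_model.pb still references variables stored in the variables/ directory, which would explain the "Could not find variable" error. If that's right, I would presumably have to freeze the graph first, maybe something like this (untested sketch):

    import tensorflow as tf
    from tensorflow.python.framework.convert_to_constants import convert_variables_to_constants_v2

    # Load the SavedModel and grab the serving signature as a concrete function
    model = tf.saved_model.load("/content/TfModel")
    concrete_fn = model.signatures["serving_default"]

    # Inline the variables as constants and write out a frozen GraphDef
    frozen_fn = convert_variables_to_constants_v2(concrete_fn)
    tf.io.write_graph(frozen_fn.graph.as_graph_def(), "/content", "frozen_model.pb", as_text=False)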

Another try:

!python -m tf2onnx.convert --saved-model /content/TfModel --output /content/model.onnx --opset 13

Error:

2023-03-28 07:06:17.080759: W tensorflow/compiler/tf2tensorrt/utils/py_utils.cc:38] TF-TRT Warning: Could not find TensorRT
/usr/lib/python3.9/runpy.py:127: RuntimeWarning: 'tf2onnx.convert' found in sys.modules after import of package 'tf2onnx', but prior to execution of 'tf2onnx.convert'; this may result in unpredictable behaviour
  warn(RuntimeWarning(msg))
2023-03-28 07:06:20.722170: E tensorflow/compiler/xla/stream_executor/cuda/cuda_driver.cc:266] failed call to cuInit: CUDA_ERROR_NO_DEVICE: no CUDA-capable device is detected
2023-03-28 07:06:20,723 - WARNING - '--tag' not specified for saved_model. Using --tag serve
2023-03-28 07:06:37,463 - INFO - Signatures found in model: [serving_default].
2023-03-28 07:06:37,464 - WARNING - '--signature_def' not specified, using first signature: serving_default
2023-03-28 07:06:37,485 - INFO - Output names: ['outputs']
^C
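
Since the serving signature takes raw strings, I suspect the graph contains SentencePiece / tensorflow_text ops that have no standard ONNX equivalent, and that custom ops from onnxruntime-extensions might be needed. If so, I would guess at something along these lines (untested; I'm not sure this is the right combination of flags):

    !pip install tensorflow-text onnxruntime-extensions
    !python -m tf2onnx.convert --saved-model /content/TfModel --output /content/model.onnx --opset 13 --extra_opset ai.onnx.contrib:1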

I want to convert this TensorFlow Hub model into an ONNX model that I can run inference on and, ideally, also convert into a PyTorch model.
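
Once the conversion works, this is roughly how I intend to run inference with onnxruntime (the input/output names are assumptions based on the signature above; if the graph does end up needing custom ops, I understand I would also have to register onnxruntime-extensions with the session):

    import numpy as np
    import onnxruntime as ort

    sess = ort.InferenceSession("/content/model.onnx")

    # Assuming tf2onnx keeps the 'inputs' name from the serving signature;
    # string tensors are fed as a numpy object array
    outputs = sess.run(None, {"inputs": np.array(["Hello world"], dtype=object)})
    print(outputs[0].shape)  # expecting (1, 512)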
