
I have exported an LSTM model from PyTorch to ONNX. The model takes sequences of length 200, has a hidden state size of 256, and 2 layers. The forward function takes an input of shape (batches, sequence_length) along with a tuple of (hidden state, cell state); both states have the same dimensions. I am getting an error while running inference with ONNX Runtime:

ioio1 = np.random.rand(1, 200)
ioio2 = np.zeros((2, 1, 256), dtype=np.float)
pred = runtime_session.run([output_name],
                           {runtime_session.get_inputs()[0].name: ioio1,
                            runtime_session.get_inputs()[1].name: ioio2,
                            runtime_session.get_inputs()[2].name: ioio2})
InvalidArgument                           Traceback (most recent call last)
<ipython-input-204-3928823f661e> in <module>()
      1 pred = runtime_session.run([output_name],{runtime_session.get_inputs()[0].name:ioio1,
      2                                           runtime_session.get_inputs()[1].name :ioio2,
----> 3                                           runtime_session.get_inputs()[2].name : ioio2})

/usr/local/lib/python3.6/dist-packages/onnxruntime/capi/session.py in run(self, output_names, input_feed, run_options)
    109             output_names = [output.name for output in self._outputs_meta]
    110         try:
--> 111             return self._sess.run(output_names, input_feed, run_options)
    112         except C.EPFail as err:
    113             if self._enable_fallback:

InvalidArgument: [ONNXRuntimeError] : 2 : INVALID_ARGUMENT : Unexpected input data type. Actual: (N11onnxruntime17PrimitiveDataTypeIdEE) , expected: (N11onnxruntime17PrimitiveDataTypeIlEE)

Anonymous

1 Answer


This issue is similar: https://github.com/microsoft/onnxruntime/issues/4423

Resolution: np.random.rand(1, 200) returns float64 (double), which isn't the dtype your model expects. In the error message, PrimitiveDataTypeIdEE is the actual type (d = double) and PrimitiveDataTypeIlEE is the expected type (l = long, i.e. int64), so cast the first input to int64.
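A minimal sketch of a corrected feed, assuming the model's first input is an int64 token-index tensor (as the expected type in the error suggests) and the recurrent states are float32; the variable names mirror the question, and the InferenceSession setup is not shown:

```python
import numpy as np

# np.random.rand always returns float64, which the session rejects here.
# Cast to the dtypes the exported graph declares:
ioio1 = (np.random.rand(1, 200) * 100).astype(np.int64)  # token indices as int64

# ONNX float tensors are float32; np.float is a deprecated alias for
# float64 anyway, so be explicit:
ioio2 = np.zeros((2, 1, 256), dtype=np.float32)

print(ioio1.dtype, ioio2.dtype)  # int64 float32
```

You can confirm the expected dtype and shape of each input before building the feed dict by inspecting runtime_session.get_inputs(), which returns metadata objects with name, type, and shape attributes.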

ashwini