
To do inference with a MobileNet 0.25 model (for a face detection task), I preferred to convert the model to ONNX. Then I used a simple script to run inference, like this one:

import onnx
import onnxruntime
import cv2
import numpy as np
import time

from onnxruntime import InferenceSession



def input_output_layer(model_path):
    model = onnx.load(model_path)
    output = [node.name for node in model.graph.output]

    input_all = [node.name for node in model.graph.input]
    input_initializer = [node.name for node in model.graph.initializer]
    net_feed_input = list(set(input_all) - set(input_initializer))

    print('Inputs: ', net_feed_input)
    print('Outputs: ', output)
    return net_feed_input, output

model_path = "mnet.25.onnx"
model = onnx.load(model_path)
onnx.checker.check_model(model)  # raises an exception if the model is invalid

sess = InferenceSession(model_path)

for t in sess.get_inputs():
    print("input:", t.name, t.type, t.shape)
for t in sess.get_outputs():
    print("output:", t.name, t.type, t.shape)



img_path = "Face.jpg"

image = cv2.imread(img_path, cv2.IMREAD_COLOR)
img_data = cv2.resize(image, (640, 640)).astype(np.float32)
img_data = np.expand_dims(img_data, 0)
print(f"onnx input shape: {np.shape(img_data)}")
img_data = np.transpose(img_data, [0, 3, 1, 2])  # NHWC -> NCHW

print(f"onnx input shape after transpose: {np.shape(img_data)}, {type(img_data)}")

session_option = onnxruntime.SessionOptions()
session_option.log_severity_level = 4

model = onnxruntime.InferenceSession(model_path, sess_options=session_option, providers=['CPUExecutionProvider'])

ort_inputs_name, ort_outputs_names = input_output_layer(model_path)

print(ort_inputs_name, ort_outputs_names)

start = time.time()
ort_outs = model.run(ort_outputs_names[0], {ort_inputs_name[0]: img_data.astype('float32')})
outputs = np.array(ort_outs[0]).astype("float32")
print(outputs)

But after running that, I get this error message:

Traceback (most recent call last):
  File "onnx_test_1.py", line 56, in <module>
    ort_outs = model.run(output_x, {ort_inputs_name[0]: img_data.astype('float32')}, None)
  File "/home/mohammad/Documents/insightface/insightface.onnx.env/lib/python3.8/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 200, in run
    return self._sess.run(output_names, input_feed, run_options)
TypeError: run(): incompatible function arguments. The following argument types are supported:
    1. (self: onnxruntime.capi.onnxruntime_pybind11_state.InferenceSession, arg0: List[str], arg1: Dict[str, object], arg2: onnxruntime.capi.onnxruntime_pybind11_state.RunOptions) -> List[object]

Invoked with: <onnxruntime.capi.onnxruntime_pybind11_state.InferenceSession object at 0x7f5e26b0d670>, 'output', {'data': array([[[[184., 184., 184., ..., 167., 167., 167.],
         [184., 184., 184., ..., 167., 167., 167.],
         [183., 183., 184., ..., 169., 169., 169.],
         ...,
         [243., 243., 243., ..., 254., 254., 254.],
         [243., 243., 243., ..., 255., 255., 255.],
         [243., 243., 243., ..., 255., 255., 255.]],

        [[196., 196., 196., ..., 178., 178., 178.],
         [196., 196., 196., ..., 178., 178., 178.],
         [196., 196., 196., ..., 179., 179., 179.],
         ...,
         [251., 251., 251., ..., 254., 255., 255.],
         [251., 251., 251., ..., 255., 255., 255.],
         [251., 251., 251., ..., 255., 255., 255.]],

        [[176., 176., 176., ..., 174., 175., 175.],
         [176., 176., 176., ..., 174., 175., 175.],
         [176., 176., 176., ..., 174., 175., 175.],
         ...,
         [249., 249., 249., ..., 254., 255., 255.],
         [250., 250., 250., ..., 255., 255., 255.],
         [250., 250., 250., ..., 255., 255., 255.]]]], dtype=float32)}, None

It seems that I missed an argument for `model.run`. But how do I set it?

I did inference on Image Quality Assessment models with this code and got proper output, so the main part of the code should be correct.

Thanks in advance.

BarzanHayati
  • `InferenceSession.run()` takes `List[str]` as its first argument, so the following will work: `ort_outs = model.run([ort_ouputs_names[0]], {ort_inputs_name[0]: img_data.astype('float32')})` – dkim Jan 07 '23 at 14:44
  • That does not work: `return self._sess.run(output_names, input_feed, run_options)` raises `onnxruntime.capi.onnxruntime_pybind11_state.RuntimeException: [ONNXRuntimeError] : 6 : RUNTIME_EXCEPTION : Non-zero status code returned while running Resize node. Name:'rf_c3_upsampling' Status Message: /onnxruntime_src/onnxruntime/core/providers/cpu/tensor/upsample.cc:1342 onnxruntime::common::Status onnxruntime::Upsample<T>::Compute(onnxruntime::OpKernelContext*) const [with T = float] sizes != nullptr && sizes->Shape().Size() != 0 was false. Either scales or sizes MUST be provided as input.` – BarzanHayati Jan 08 '23 at 14:11
  • Are you positive that all Resize nodes in your ONNX model meet the condition in the new error message, `Either scales or sizes MUST be provided as input`? – dkim Jan 08 '23 at 15:32
  • @dkim The third argument, `arg2`, should be fed as input for `model.run()` – BarzanHayati Jan 11 '23 at 19:24
  • The third argument is optional: https://onnxruntime.ai/docs/api/python/api_summary.html#onnxruntime.InferenceSession.run – dkim Jan 12 '23 at 00:43

0 Answers