
OpenVINO throws "RuntimeError: Cannot get dims for non static shape" when an image is passed as a NumPy array to the model compiled with openvino Core().compile_model(). The same image works fine when it is first loaded with cv2.imread(). How do we pass a NumPy array directly to a model from openvino Core().compile_model()?

Working code:

image = cv2.cvtColor(test_img, code=cv2.COLOR_BGR2RGB)
print("image",type(image))
print("model", compiled_model)
image = np.array(image)
# resize to MobileNet image shape 
input_image = cv2.resize(src=image, dsize=(224, 224))

# reshape to network input shape (add batch dimension; the transpose with axes (0, 1, 2) is an identity here)
input_data = np.expand_dims(np.transpose(input_image, (0, 1, 2)), 0).astype(np.float32)
print(input_data.shape)
print("type:",type(output_layer))
# Do inference
result = compiled_model([input_data])[output_layer]

Buggy code:

for img in img_batch:
    #image = cv2.cvtColor(img, code=cv2.COLOR_BGR2RGB)
    resized_image = cv2.resize(src=img, dsize=(
        self.node_input.input_resolution.width, self.node_input.input_resolution.height))
    input_img = np.expand_dims(np.transpose(resized_image, (0, 1, 2)), 0).astype(np.float32)
    res = Core().compile_model([input_img])[output_layer]

Any leads on what I am missing here are highly appreciated. Thanks in advance!
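For reference, the preprocessing in the working snippet boils down to this self-contained sketch (a random array stands in for the decoded, resized RGB image; note that np.transpose with axes (0, 1, 2) leaves the array unchanged):

```python
import numpy as np

# stand-in for a decoded, resized 224x224 RGB image (HWC layout)
input_image = np.random.randint(0, 256, size=(224, 224, 3), dtype=np.uint8)

# transpose with axes (0, 1, 2) is an identity; expand_dims adds the batch axis
input_data = np.expand_dims(np.transpose(input_image, (0, 1, 2)), 0).astype(np.float32)

print(input_data.shape)  # (1, 224, 224, 3) — NHWC
```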


1 Answer


Use a Tensor object to hold a copy of the data from the given array:

import numpy as np
from openvino.runtime import Tensor

data_float32 = np.ones(shape=(1, 3, 224, 224), dtype=np.float32)
tensor = Tensor(data_float32)

Pass NumPy arrays (or Tensor objects) gathered in either a Python dict or a list:

infer_request = compiled_model.create_infer_request()

# Passing inputs data in form of a dictionary
infer_request.infer(inputs={0: tensor})

# Passing inputs data in form of a list
infer_request.infer(inputs=[tensor])

Obtain the results from inference:

results = infer_request.get_output_tensor().data
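Note the layout difference: the question's preprocessing yields an NHWC array of shape (1, 224, 224, 3), while the Tensor example above uses NCHW (1, 3, 224, 224). If the compiled model expects NCHW, the batch can be converted with a single transpose; a minimal NumPy-only sketch (the shapes here are assumptions — check your model's actual input layout):

```python
import numpy as np

# NHWC batch, as produced by the question's preprocessing
nhwc = np.zeros((1, 224, 224, 3), dtype=np.float32)

# move the channel axis ahead of height/width: NHWC -> NCHW
nchw = np.transpose(nhwc, (0, 3, 1, 2))

print(nchw.shape)  # (1, 3, 224, 224)
```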