Questions tagged [onnxruntime]

ONNX Runtime is a cross-platform inference and training machine-learning accelerator.

See the onnxruntime GitHub project.

292 questions
0
votes
1 answer

Read custom metadata from an ONNX model in C#

When creating an InferenceSession in my C# application I want to access the custom metadata from the .onnx model. I populate the model with metadata in Python: model = onnxmltools.load_model("../models/model.onnx") meta =…
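A minimal Python sketch of the metadata round trip this question is after, assuming a local model.onnx and made-up key/value pairs; on the C# side the same dictionary should be reachable from the session's model metadata (CustomMetadataMap in recent Microsoft.ML.OnnxRuntime releases).

```python
# Write custom key/value metadata into the model, then read it back with onnxruntime.
# "model.onnx" and the key/value pair are placeholders.
import onnx
import onnxruntime as ort

model = onnx.load("model.onnx")
entry = model.metadata_props.add()           # StringStringEntryProto
entry.key, entry.value = "threshold", "0.75"
onnx.save(model, "model_with_meta.onnx")

sess = ort.InferenceSession("model_with_meta.onnx")
print(sess.get_modelmeta().custom_metadata_map)   # {'threshold': '0.75'}
```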
0
votes
1 answer

How can I fix an onnxruntime session->Run problem?

I am trying to write a wrapper for onnxruntime. The model takes one tensor as input and produces one tensor as output. During session->Run, a segmentation error occurs inside the onnxruntime library. Both the downloaded library and the one built from source…
Listray
  • 3
  • 1
  • 5
0
votes
1 answer

Error while running inference on an LSTM model with onnx-runtime: Invalid Argument Error

I have exported an LSTM model from PyTorch to ONNX. The model takes sequences of length 200. It has hidden state size 256 and number of layers = 2. The forward function takes an input of size (batches, sequence length) along with a tuple…
Anonymous
  • 31
  • 1
  • 3
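A hedged Python sketch of the first debugging step for this kind of InvalidArgument error: print what the exported graph actually expects, then build numpy inputs that match it. The input names below ("input", "h0", "c0"), the int64 dtype and the batch size are assumptions; the 200-step sequence length and the (2, batch, 256) hidden shape come from the question.

```python
import numpy as np
import onnxruntime as ort

sess = ort.InferenceSession("lstm.onnx")
for i in sess.get_inputs():
    print(i.name, i.shape, i.type)   # compare with what you are actually feeding

batch, seq_len, layers, hidden = 1, 200, 2, 256
feed = {
    "input": np.zeros((batch, seq_len), dtype=np.int64),          # token ids
    "h0":    np.zeros((layers, batch, hidden), dtype=np.float32), # initial hidden state
    "c0":    np.zeros((layers, batch, hidden), dtype=np.float32), # initial cell state
}
outputs = sess.run(None, feed)   # None -> return all outputs
```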
0
votes
1 answer

Issues with onnxruntime on Ubuntu 16.04

I'm trying to run inference on an ONNX model on Ubuntu 16.04 using onnxruntime. But the import statement gives me this error: >>> import onnxruntime /opt/conda/lib/python3.6/site-packages/onnxruntime/capi/_pybind_state.py:13: UserWarning: Cannot…
newuser
  • 105
  • 8
0
votes
1 answer

Run inference using an ONNX model in Python: input incompatibility problem?

I am a beginner in programming. I am trying to run the "tinyyolov2-8.onnx" model but am struggling with the input formatting; can anyone suggest how to format the input for this model? The code is given below: import numpy as np from PIL import…
amit waghmare
  • 11
  • 1
  • 2
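A hedged sketch of the usual preprocessing for the model-zoo Tiny YOLOv2 export: a float32 NCHW tensor of shape (1, 3, 416, 416). The image path is a placeholder, and whether the pixel range should stay 0-255 or be scaled is model-specific (check the model card); the input name is read from the session instead of being hard-coded.

```python
import numpy as np
from PIL import Image
import onnxruntime as ort

sess = ort.InferenceSession("tinyyolov2-8.onnx")
input_name = sess.get_inputs()[0].name

img = Image.open("test.jpg").convert("RGB").resize((416, 416))
x = np.asarray(img, dtype=np.float32)          # (416, 416, 3), HWC
x = np.transpose(x, (2, 0, 1))[np.newaxis]     # -> (1, 3, 416, 416), NCHW

preds = sess.run(None, {input_name: x})
print(preds[0].shape)                          # grid of box/class predictions
```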
0
votes
1 answer

Feeding input to a resnet18 onnx model (resnet18 deployment for object detection in a video file)

I have built and saved a trained resnet18 model using the code on GitHub in this link; the code can be run by specifying the training directory and the type of network model. The model resnet18.onnx is chosen and trained to classify 4 types of cells. I…
yahya.k
  • 21
  • 4
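A hedged sketch of running the exported resnet18.onnx frame by frame over a video with OpenCV. The 224x224 size, the ImageNet-style normalisation and the file names are assumptions and should be matched to the training pipeline from the linked repository.

```python
import cv2
import numpy as np
import onnxruntime as ort

sess = ort.InferenceSession("resnet18.onnx")
input_name = sess.get_inputs()[0].name

mean = np.array([0.485, 0.456, 0.406], dtype=np.float32)
std = np.array([0.229, 0.224, 0.225], dtype=np.float32)

cap = cv2.VideoCapture("cells.mp4")
while True:
    ok, frame = cap.read()
    if not ok:
        break
    rgb = cv2.cvtColor(cv2.resize(frame, (224, 224)), cv2.COLOR_BGR2RGB)
    x = (rgb.astype(np.float32) / 255.0 - mean) / std
    x = np.transpose(x, (2, 0, 1))[np.newaxis]       # (1, 3, 224, 224)
    logits = sess.run(None, {input_name: x})[0]
    print(int(np.argmax(logits)))                    # index of the predicted cell type
cap.release()
```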
0
votes
1 answer

"Can't use matmul on the given tensors" error when converting pytorch to onnx JS

I made a simple PyTorch MLP (GAN generator) and converted it to ONNX using the tutorial (https://www.youtube.com/watch?v=Vs730jsRgO8); my code is a bit different but I can't catch the error. class Generator(nn.Module): def __init__(self, g_input_dim,…
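A common cause of this onnx.js message is a rank mismatch: the graph was exported from a 1-D dummy input while the browser feeds a 2-D batch (or vice versa). Below is a hedged sketch of an export that keeps everything 2-D; the Sequential stack is a stand-in for the Generator class in the question and g_input_dim = 100 is a placeholder.

```python
import torch
import torch.nn as nn

g_input_dim, g_output_dim = 100, 784               # placeholder sizes
model = nn.Sequential(                             # stand-in for the Generator MLP
    nn.Linear(g_input_dim, 256), nn.LeakyReLU(0.2),
    nn.Linear(256, g_output_dim), nn.Tanh(),
)
model.eval()

dummy = torch.randn(1, g_input_dim)                # 2-D dummy input: (batch, features)
torch.onnx.export(model, dummy, "generator.onnx",
                  input_names=["noise"], output_names=["image"],
                  opset_version=11)
```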
0
votes
1 answer

Assembly problems with ML ONNX Runtime

I always get the following exception: System.IO.FileLoadException: "The file or assembly "Microsoft.ML.OnnxRuntime, Version=0.3.0.0, Culture=neutral, PublicKeyToken=f27f157f0a5b7bb6" or one of its dependencies was not found. The…
MLGuy
  • 1
  • 1
0
votes
1 answer

How to solve a runtime error while predicting using an ONNX model?

I have a deep learning model trained in MATLAB using the trainNetwork command. I want to use that model in Python for prediction, so I exported the network to ONNX format in MATLAB using the "exportONNXNetwork" command. I imported the onnx model in Python using…
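A hedged Python sketch of the usual first check after exportONNXNetwork: read the input name, shape and element type from the session before building the feed, since dtype or layout mismatches are the most common cause of runtime errors here. The model path and the random dummy input are placeholders.

```python
import numpy as np
import onnxruntime as ort

sess = ort.InferenceSession("matlab_model.onnx")
inp = sess.get_inputs()[0]
print(inp.name, inp.shape, inp.type)   # e.g. ('input', [1, 224, 224, 3], 'tensor(float)')

# Replace symbolic/None batch dims with 1 and build a float32 dummy input.
shape = [d if isinstance(d, int) else 1 for d in inp.shape]
x = np.random.rand(*shape).astype(np.float32)

pred = sess.run(None, {inp.name: x})[0]
print(pred.shape)
```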
0
votes
2 answers

Using an ONNX model in a C# Windows Forms application

I have trained a Mask RCNN deep learning model in Keras and have derived an ONNX model (weight matrix) which is able to run and test images in Python successfully. Is there any possibility of using the same ONNX model in a Windows Forms application, C#…
0
votes
1 answer

Load ONNX Model failed: ShapeInferenceError

OrtCreateSession fails when trying to load an onnx model with the message: failed:[ShapeInferenceError] Attribute pads has incorrect size. What does it mean? Where do I look for the problem? Thanks for any ideas.
Tullhead
  • 565
  • 2
  • 7
  • 17
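A hedged sketch for localising a "[ShapeInferenceError] Attribute pads has incorrect size" failure offline: run the ONNX checker and shape inference on the file, then list every node carrying a pads attribute so its length can be compared against twice the number of spatial axes. "model.onnx" is a placeholder path.

```python
import onnx
from onnx import checker, shape_inference

model = onnx.load("model.onnx")
checker.check_model(model)                       # structural validation
inferred = shape_inference.infer_shapes(model)   # reproduces the shape-inference pass offline

for node in model.graph.node:
    for attr in node.attribute:
        if attr.name == "pads":
            print(node.op_type, node.name, list(attr.ints))
```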
-1
votes
1 answer

Scikit SVM execution with ONNX in Android

I have trained an SVM in Python with scikit-learn and used probability=True: model = svm.SVC(gamma=params["gamma"], C=params["C"], probability=True) model.fit(X_train, y_train) initial_type = [('float_input', FloatTensorType([None,…
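A hedged, self-contained sketch of the conversion the excerpt starts: train an SVC with probability=True, convert it with skl2onnx, and save the file that can then be loaded on Android. The feature count (4), the synthetic training data and the output file name are placeholders.

```python
import numpy as np
from sklearn import svm
from skl2onnx import convert_sklearn
from skl2onnx.common.data_types import FloatTensorType

X_train = np.random.rand(20, 4).astype(np.float32)   # placeholder data
y_train = np.random.randint(0, 2, 20)

model = svm.SVC(gamma="scale", C=1.0, probability=True)
model.fit(X_train, y_train)

initial_type = [("float_input", FloatTensorType([None, 4]))]
onx = convert_sklearn(model, initial_types=initial_type)
with open("svm.onnx", "wb") as f:
    f.write(onx.SerializeToString())
```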
-1
votes
1 answer

How can I extract and print the metadata of an ONNX model in C++?

This is the C++ code where I try to extract and print the metadata of an ONNX model. Loading the onnx model works perfectly, but I can't extract the model's metadata. I am using ORT version 1.14.0 ` // Load the ONNX model Ort::SessionOptions…
bb45678
  • 1
  • 1
-1
votes
1 answer

Using ONNX Runtime to run inference on a Keras-converted ONNX model

I am attempting to run inference on my .onnx model converted from a Keras multi-label text classification model using https://keras.io/examples/nlp/multi_label_classification/. This is a text classification model that takes in text and provides a…
lm231
  • 29
  • 6
-1
votes
1 answer

ONNX inference throws an error with the numpy float32 datatype inside the streamlit framework

In one of my data science web app projects, I designed an app to predict the type of plant disease. It contains ONNX models. The prediction runs without an error standalone, but inside the streamlit code it raises an error: UFuncTypeError: ufunc…
tad
  • 117
  • 7
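One hedged guess at the cause, shown as a sketch: streamlit's upload path often hands the model a float64 array or a PIL image, so casting to float32 (and adding the batch axis) immediately before sess.run removes the dtype mismatch. The model path and helper name are placeholders, not the app's actual code.

```python
import numpy as np
import onnxruntime as ort

sess = ort.InferenceSession("plant_disease.onnx")
input_name = sess.get_inputs()[0].name

def predict(image_array):
    x = np.asarray(image_array, dtype=np.float32)   # force float32 regardless of what streamlit provides
    if x.ndim == 3:
        x = x[np.newaxis]                           # add the missing batch axis
    return sess.run(None, {input_name: x})[0]
```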