Questions tagged [onnx]

ONNX is an open format to represent deep learning models and enable interoperability between different frameworks.

ONNX

The Open Neural Network Exchange (ONNX) is an open-source artificial intelligence ecosystem. With ONNX, AI developers can more easily move models between state-of-the-art tools and choose the combination that is best for them. ONNX is widely supported and can be found in many frameworks, tools, and hardware. It is developed and supported by a community of partners.

Official resources

  • Website: https://onnx.ai/
  • GitHub: https://github.com/onnx/onnx

809 questions
2 votes • 0 answers

How to quantize an ONNX model converted from an XGBoost classifier model?

I converted an XGBoost classifier model to an ONNX model with onnxmltools and quantized the ONNX model using quantize_dynamic(). But I didn't get a quantized ONNX model with a smaller file size or faster inference time. I used Anaconda3,…
SC Chen • 23 • 5
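A minimal sketch of the flow this question describes, with placeholder file names and toy data; note that dynamic quantization mainly rewrites MatMul/Gemm weight initializers, and a tree-ensemble graph produced from XGBoost contains few of those, which can explain the unchanged size and latency:

import numpy as np
from xgboost import XGBClassifier
import onnxmltools
from onnxmltools.convert.common.data_types import FloatTensorType
from onnxruntime.quantization import quantize_dynamic, QuantType

# Toy classifier standing in for the real model.
X = np.random.rand(100, 8).astype(np.float32)
y = np.random.randint(0, 2, size=100)
clf = XGBClassifier(n_estimators=50).fit(X, y)

# Convert to ONNX; the input name 'input' and the feature count are assumptions.
onnx_model = onnxmltools.convert_xgboost(
    clf, initial_types=[('input', FloatTensorType([None, 8]))])
onnxmltools.utils.save_model(onnx_model, 'xgb.onnx')

# Dynamic quantization; the output file may be barely smaller for tree models.
quantize_dynamic('xgb.onnx', 'xgb.quant.onnx', weight_type=QuantType.QUInt8)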
2 votes • 1 answer

onnxruntime: Given model could not be parsed while creating inference session. Error message: Protobuf parsing failed

I followed the example code provided with the library below, but it didn't work. [Library] https://github.com/notAI-tech/NudeNet/ Code: from nudenet import NudeClassifier import onnxruntime classifier =…
Khawar Islam • 2,556 • 2 • 34 • 56
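A small sketch of how this error is usually diagnosed, assuming a placeholder path: "Protobuf parsing failed" typically means the .onnx file on disk is not a valid ONNX protobuf (for example, an incomplete download), so checking the file before building the session narrows it down:

import os
import onnx
import onnxruntime

model_path = 'classifier.onnx'            # placeholder path
print(os.path.getsize(model_path))        # a few KB usually means a bad download

model = onnx.load(model_path)             # raises if the protobuf cannot be parsed
onnx.checker.check_model(model)           # validates the graph structure

session = onnxruntime.InferenceSession(model_path, providers=['CPUExecutionProvider'])
print([i.name for i in session.get_inputs()])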
2 votes • 0 answers

Converting .pth to ONNX - convolution_mode error

I am trying to convert a .pth file to a .onnx file: class Net(nn.Module): def __init__(self): super(Net, self).__init__() cnn = nn.Sequential() cnn.add_module('c1', nn.Conv2d(3, 32, 3, 1, 1)) cnn.add_module('r1',…
Vishak Raj • 141 • 1 • 1 • 8
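A minimal export sketch for a small CNN like the one quoted, assuming the .pth file holds a state_dict, a 224x224 RGB input, and opset 13; all of these are assumptions that need to match the real model:

import torch
import torch.nn as nn

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.cnn = nn.Sequential()
        self.cnn.add_module('c1', nn.Conv2d(3, 32, 3, 1, 1))
        self.cnn.add_module('r1', nn.ReLU())

    def forward(self, x):
        return self.cnn(x)

model = Net()
model.load_state_dict(torch.load('net.pth', map_location='cpu'))  # placeholder path
model.eval()                                   # export in inference mode

dummy = torch.randn(1, 3, 224, 224)            # assumed input shape
torch.onnx.export(model, dummy, 'net.onnx',
                  input_names=['input'], output_names=['output'],
                  opset_version=13)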
2 votes • 1 answer

Couldn't convert PyTorch model to ONNX

I used this repo: https://github.com/Turoad/lanedet to convert a PyTorch model that uses mobilenetv2 as a backbone to ONNX, but I didn't succeed. I got a runtime error that says: RuntimeError: Exporting the operator eye to ONNX opset version 12…
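One common workaround, sketched under the assumption that the unsupported torch.eye call can be replaced in the model code: build the identity matrix from operators the exporter does support (arange, unsqueeze, eq), or try a newer opset_version:

import torch

def exportable_eye(n, dtype=torch.float32):
    # arange/unsqueeze/eq trace to ONNX cleanly, unlike aten::eye in older opsets
    idx = torch.arange(n)
    return (idx.unsqueeze(0) == idx.unsqueeze(1)).to(dtype)

class M(torch.nn.Module):          # toy stand-in for the layer that used torch.eye
    def forward(self, x):
        return x @ exportable_eye(x.shape[-1])

torch.onnx.export(M(), torch.randn(2, 4), 'eye_free.onnx', opset_version=13)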
2 votes • 0 answers

How to convert SimpleTransformers' ONNX model to keras

I am currently trying to convert the ONNX model using onnx2keras. However, I am facing the error below: AttributeError Traceback (most recent call last) in () 6 7…
FND FYP • 51 • 4
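A sketch of the onnx2keras entry point, assuming a placeholder file name; transformer-style graphs often contain operators onnx2keras cannot map, which is a frequent source of exactly this kind of AttributeError:

import onnx
from onnx2keras import onnx_to_keras

onnx_model = onnx.load('model.onnx')                     # placeholder path
input_names = [i.name for i in onnx_model.graph.input]   # names stored in the graph
k_model = onnx_to_keras(onnx_model, input_names)
k_model.summary()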
2 votes • 1 answer

How to merge pre- and post-processing of an ML model into ONNX format

Pre-processing should be done inside the model itself; for inference, the user should only give the image path. Colour conversion and image resizing will be performed inside the ONNX model. Please provide suggestions. # Preprocessing of ONNX…
Imran_Say • 132 • 10
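One way to get there, sketched with a toy network: wrap the trained model in a module that does the scaling and resizing first, then export the wrapper, so the ONNX graph accepts a raw uint8 image tensor. Reading the image from a path still has to happen outside the graph; the 224x224 target size is an assumption:

import torch
import torch.nn.functional as F

class WithPreprocessing(torch.nn.Module):
    def __init__(self, net, size=224):
        super().__init__()
        self.net, self.size = net, size

    def forward(self, raw_image):                        # (H, W, 3) uint8 tensor
        x = raw_image.float() / 255.0                    # scale to [0, 1]; a channel swap could be added here
        x = x.permute(2, 0, 1).unsqueeze(0)              # HWC -> NCHW
        x = F.interpolate(x, size=(self.size, self.size),
                          mode='bilinear', align_corners=False)
        return self.net(x)

net = torch.nn.Sequential(torch.nn.Conv2d(3, 8, 3), torch.nn.AdaptiveAvgPool2d(1))
wrapper = WithPreprocessing(net).eval()
dummy = torch.randint(0, 255, (480, 640, 3), dtype=torch.uint8)
torch.onnx.export(wrapper, dummy, 'model_with_preproc.onnx',
                  input_names=['raw_image'], output_names=['features'],
                  opset_version=13)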
2 votes • 1 answer

Unable to load .pb while converting PyTorch model to tf.keras

Context I'm using tf.keras for a personal project and I need to retrieve a pretrained Alexnet model. Unfortunately, this model is not directly accessible using tf.keras only, so I downloaded the pretrained model using PyTorch, converted it into an…
Arthur • 138 • 7
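A sketch of the usual ONNX-to-TensorFlow leg with onnx-tf, assuming a placeholder path; note that export_graph writes a SavedModel directory, which is loaded with tf.saved_model.load rather than tf.keras.models.load_model, so the result is not a Keras model object:

import onnx
import tensorflow as tf
from onnx_tf.backend import prepare

onnx_model = onnx.load('alexnet.onnx')        # placeholder path
tf_rep = prepare(onnx_model)
tf_rep.export_graph('alexnet_tf')             # writes a SavedModel directory

loaded = tf.saved_model.load('alexnet_tf')
infer = loaded.signatures['serving_default']  # default signature name, an assumption
print(infer.structured_input_signature)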
2 votes • 0 answers

Making predictions from the encoder and decoder of a T5 model without using the generate method

I was working on optimizing the T5 model. I separated the model into an encoder and a decoder and converted them to ONNX using the NVIDIA TensorRT repo https://github.com/NVIDIA/TensorRT/tree/main/demo/HuggingFace, but I am unable to make an…
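A greedy-decoding sketch with two onnxruntime sessions; every tensor name, file path, and the max length here are assumptions and must be replaced with whatever the exported encoder/decoder graphs actually use:

import numpy as np
import onnxruntime as ort
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained('t5-small')                     # placeholder checkpoint
enc = ort.InferenceSession('t5_encoder.onnx', providers=['CPUExecutionProvider'])
dec = ort.InferenceSession('t5_decoder.onnx', providers=['CPUExecutionProvider'])

ids = tok('translate English to German: Hello', return_tensors='np')
enc_out = enc.run(None, {'input_ids': ids['input_ids'].astype(np.int64)})[0]

decoder_ids = np.array([[tok.pad_token_id]], dtype=np.int64)        # T5 decoding starts from <pad>
for _ in range(32):                                                 # assumed max length
    logits = dec.run(None, {'input_ids': decoder_ids,
                            'encoder_hidden_states': enc_out})[0]
    next_id = int(logits[0, -1].argmax())
    decoder_ids = np.concatenate(
        [decoder_ids, np.array([[next_id]], dtype=np.int64)], axis=1)
    if next_id == tok.eos_token_id:
        break

print(tok.decode(decoder_ids[0], skip_special_tokens=True))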
2 votes • 0 answers

How can I combine a Huggingface tokenizer and a BERT-based model in onnx?

Problem description: I have a model based on BERT, with a classifier layer on top. I want to export it to ONNX, but to avoid issues on the side of the 'user' of the ONNX model, I want to export the entire pipeline, including tokenization, as an ONNX…
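A hedged baseline sketch: torch.onnx.export only captures tensor operations, so the common split keeps the HuggingFace tokenizer in host code and exports just the BERT classifier; folding string tokenization into the same graph generally requires custom operators (for example, the ones shipped with onnxruntime-extensions). The checkpoint name and axis names below are placeholders:

import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

name = 'bert-base-uncased'                          # placeholder checkpoint
tok = AutoTokenizer.from_pretrained(name)
model = AutoModelForSequenceClassification.from_pretrained(name, torchscript=True).eval()

enc = tok('example text', return_tensors='pt')
torch.onnx.export(model, (enc['input_ids'], enc['attention_mask']),
                  'bert_classifier.onnx',
                  input_names=['input_ids', 'attention_mask'],
                  output_names=['logits'],
                  dynamic_axes={'input_ids': {0: 'batch', 1: 'seq'},
                                'attention_mask': {0: 'batch', 1: 'seq'}},
                  opset_version=14)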
2 votes • 1 answer

RuntimeError: Only tuples, lists and Variables are supported as JIT inputs/outputs. Dictionaries and strings are also accepted

I was trying to convert my PyTorch model to ONNX, but I am facing RuntimeError: Only tuples, lists and Variables are supported as JIT inputs/outputs. Dictionaries and strings are also accepted, but their usage is not recommended. Here, received an…
user16668992
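A sketch of the usual workaround for this error: the exporter traces the model, so a thin wrapper that unpacks the dictionary output into a fixed, ordered tuple keeps the graph exportable. The toy model and output names below are stand-ins:

import torch

class DictModel(torch.nn.Module):                # stand-in for the model that returns a dict
    def forward(self, x):
        return {'logits': x * 2, 'aux': x.sum()}

class TupleWrapper(torch.nn.Module):
    def __init__(self, inner):
        super().__init__()
        self.inner = inner

    def forward(self, x):
        out = self.inner(x)
        return out['logits'], out['aux']         # fixed, ordered tuple of tensors

torch.onnx.export(TupleWrapper(DictModel()).eval(), torch.randn(1, 4),
                  'model.onnx', input_names=['x'],
                  output_names=['logits', 'aux'], opset_version=13)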
2 votes • 0 answers

How to convert FastSpeech2 to ONNX with dynamic input and output?

How can I get dynamic input from a torch model into an ONNX model? I give the input with dynamic_axes, but the output at inference is not dynamic. My code: input_names = ['speakers', 'texts', 'src_lens', 'max_src_len'] output_names = ['output',…
Frank.Fan • 929 • 3 • 14 • 23
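A sketch of the detail that usually matters here: dynamic_axes must contain entries for the output names as well, otherwise the exported graph keeps fixed output shapes. The toy model below just stands in for a network whose output length depends on the input length:

import torch

class Upsampler(torch.nn.Module):
    def forward(self, texts):
        # output length depends on the input length, so it must be marked dynamic
        return torch.cat([texts, texts], dim=1).float()

model = Upsampler().eval()
texts = torch.randint(0, 10, (1, 7))

torch.onnx.export(
    model, (texts,), 'dyn.onnx',
    input_names=['texts'], output_names=['output'],
    dynamic_axes={'texts': {0: 'batch', 1: 'src_len'},
                  'output': {0: 'batch', 1: 'mel_len'}},   # outputs listed too
    opset_version=13)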
2 votes • 1 answer

Export tensorflow model to ONNX and specify variable names

I have a tensorflow model written through model subclassing and I want to export it to ONNX format. This is simple enough with the script attached. However, the name of the input variable to the ONNX model is args_0. How can I rename it? import…
Gianluca Micchi • 1,584 • 15 • 32
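A sketch with tf2onnx, assuming a toy subclassed model: the ONNX input names are taken from the TensorSpec names passed in input_signature, so naming the spec avoids the generic args_0:

import tensorflow as tf
import tf2onnx

class MyModel(tf.keras.Model):                   # toy subclassed model
    def __init__(self):
        super().__init__()
        self.dense = tf.keras.layers.Dense(4)

    def call(self, x):
        return self.dense(x)

model = MyModel()
model(tf.zeros([1, 8]))                          # build the variables once

spec = (tf.TensorSpec([None, 8], tf.float32, name='my_input'),)
tf2onnx.convert.from_keras(model, input_signature=spec,
                           opset=13, output_path='model.onnx')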
2 votes • 1 answer

How to load an ONNX file and use it to make a ML prediction in PyTorch?

Below is the source code I use to load a .pth file and do a multi-class image classification prediction. model = Classifier() # The Model Class. model.load_state_dict(torch.load('.pth')) model = model.to(device) model.eval() #…
Hari Krishnan U • 166 • 5 • 16
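A short sketch of the usual answer, with placeholder paths and an assumed input shape: an .onnx file is not loaded back into the PyTorch Classifier class; inference typically runs through onnxruntime instead:

import numpy as np
import onnxruntime as ort

session = ort.InferenceSession('classifier.onnx',
                               providers=['CPUExecutionProvider'])
input_name = session.get_inputs()[0].name

image = np.random.rand(1, 3, 224, 224).astype(np.float32)   # assumed input shape
logits = session.run(None, {input_name: image})[0]
print('predicted class:', int(np.argmax(logits, axis=1)[0]))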
2 votes • 0 answers

Saving a darts time series model in ONNX format

Below is a sample darts time series model. How can I save this model in ONNX format? Is there any way to do it? Thank you. import pandas as pd import numpy as np import darts from darts import TimeSeries from darts.models import NBEATSModel…
melik • 1,268 • 3 • 21 • 42
2 votes • 0 answers

NoSuchFile: [ONNXRuntimeError] : 3 : NO_SUCHFILE : Load model from onnx/bert-base-cased/model.onnx failed:Load model onnx/bert-base-cased/model.onnx

Goal: to successfully save and load a HuggingFace NLP model. Kernel: conda_pytorch_p36. I performed Restart & Run All, and refreshed the file view in the working directory. I'm following along with this code tutorial, the first Python code…
DanielBell99 • 896 • 5 • 25 • 57
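A small sketch of the usual diagnosis, with the path from the error message used as a placeholder: NO_SUCHFILE means the path handed to InferenceSession does not exist relative to the current working directory, so printing the resolved path first normally pinpoints the problem:

from pathlib import Path
import onnxruntime as ort

model_path = Path('onnx/bert-base-cased/model.onnx')
print('cwd:', Path.cwd())
print('looking for:', model_path.resolve(), '-> exists:', model_path.is_file())

if model_path.is_file():
    session = ort.InferenceSession(str(model_path),
                                   providers=['CPUExecutionProvider'])
    print([i.name for i in session.get_inputs()])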