Questions tagged [onnxruntime]
ONNX Runtime is a cross-platform inference and training machine-learning accelerator.
292 questions
0 votes · 1 answer
How to include an external file on Android using C# and Godot?
I am trying to export a Godot game to Android using C#. I am using external libraries, such as onnxruntime, and everything seems to work except that I cannot include custom files in the exported package.
I have already tried to include…

J. Fenigan · 119
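A common cause of missing custom files in a Godot Android export is that non-resource files are excluded unless their extensions are whitelisted in the export preset. A sketch of the relevant part of export_presets.cfg (the preset index, name, and glob patterns here are hypothetical, not taken from the question):

```
[preset.0]
name="Android"
platform="Android"
; Non-resource files matching these globs are packed into the export.
include_filter="*.onnx, data/*.bin"
```

The same filter can be set in the editor under the export preset's Resources tab ("Filters to export non-resource files/folders").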
0 votes · 0 answers
How to use ONNX Runtime in a custom Jupyter ipywidget
I am trying to use ONNX Runtime in a custom ipywidget, but I have a problem with the *.wasm files needed by the WASM backend.
I started from the template made by cookiecutter ts and then added the ONNX example code provided by ONNX.
So here is my widget.ts code…
0 votes · 1 answer
Why does my ONNX Runtime inference crash on GPU without any log?
I am trying to run an ONNX model in C#, created with PyTorch in Python, for image segmentation. Everything works fine when I run it on the CPU, but when I try to use the GPU my application crashes when trying to run the inference.
(Everything works fine when…

Léo · 53
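When a GPU session dies with no log, the usual first steps are to confirm the CUDA provider is actually available and to keep a CPU fallback so a broken GPU setup degrades instead of crashing. A minimal sketch in Python (the question is C#, but the provider-selection logic is the same; `pick_providers` is a hypothetical helper, not part of the onnxruntime API):

```python
# Hypothetical helper: prefer the CUDA provider when present, but always
# keep CPUExecutionProvider last so inference can fall back instead of
# failing with no usable error.
def pick_providers(available):
    preferred = ["CUDAExecutionProvider", "CPUExecutionProvider"]
    chosen = [p for p in preferred if p in available]
    return chosen or ["CPUExecutionProvider"]

# With onnxruntime installed, this would feed the real provider list:
#   import onnxruntime as ort
#   providers = pick_providers(ort.get_available_providers())
#   sess = ort.InferenceSession("model.onnx", providers=providers)
```

Turning session logging up to verbose often surfaces the underlying native error that the crash otherwise swallows.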
0 votes · 1 answer
Unable to import onnx in Java
I haven't worked with Java much, and I need to load a model trained in Python and check whether I can run inference in Java. I am trying to load an onnx file in Java. To do this I am importing onnx in Java, but it's throwing an error that the package…

Kaushal · 1
0 votes · 0 answers
Improving Inference for BigBirdForSequenceClassification
Any pointers on how to improve inference for BigBird fine-tuned on multiclass classification? Inference is done on a 16 GB GPU (NVIDIA).
I have already tried DeepSpeed and ONNX. ONNX Runtime is not supported for BigBird, and DeepSpeed ZeRO Stage 3 doesn't…

Salman Baqri · 99
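When runtime-level accelerators don't support an architecture, plain batching (along with fp16 inference) is often the remaining lever. A minimal stdlib sketch of the batching half, with `predict` standing in for the actual model call (hypothetical name):

```python
def batched(items, batch_size):
    """Yield fixed-size chunks so the model sees batches instead of one
    example at a time; the last chunk may be smaller."""
    for i in range(0, len(items), batch_size):
        yield items[i:i + batch_size]

# texts = [...]                      # inputs to classify
# for batch in batched(texts, 16):   # batch size tuned to fit the 16 GB GPU
#     logits = predict(batch)        # hypothetical model call
```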
0 votes · 0 answers
ONNX Runtime conversion error from PyTorch model to ONNX
I have the following code to convert a PyTorch model to ONNX.
# Function to convert to ONNX
import argparse
import io
import numpy as np
from torch import nn
import torch.utils.model_zoo as model_zoo
import torch.onnx
from torchvision import…

batuman · 7,066
0 votes · 0 answers
ONNX Runtime and CWaitCursor on Windows
I'm running a lengthy algorithm on Windows 10, written in MS Visual C++. A portion of the algorithm runs inference on an ONNX model using ORT. I want to spin the wait cursor so the user knows the algorithm is running and not done yet. It spins…

Tullhead · 565
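A wait cursor that stops spinning usually means the blocking inference call is running on the UI thread. The standard fix is to run it on a worker thread and notify the UI on completion. A language-neutral sketch in Python (the question is MFC C++; the names here are illustrative, not from the question's code):

```python
import threading

def run_in_background(task, on_done):
    """Run `task` on a worker thread and invoke `on_done(result)` when it
    finishes; the calling (UI) thread stays free to animate the cursor."""
    def worker():
        on_done(task())
    t = threading.Thread(target=worker)
    t.start()
    return t
```

In MFC terms, the equivalent is `AfxBeginThread` (or `std::thread`) for the inference plus a posted message back to the window, while the message loop keeps the cursor animated.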
0 votes · 1 answer
ONNX Runtime memory arena, reuse, and pattern
As described in the Python API docs, there are some params in the onnxruntime session options corresponding to memory configuration, such as:
enable_cpu_mem_arena
enable_mem_reuse
enable_mem_pattern
There are some descriptions for them, but I cannot…

Mohsen Mahmoodzadeh · 111
0 votes · 1 answer
ONNX Runtime without ML.NET
I am trying to run an ONNX inference session in C# on a system that can only run .NET Framework 4.8. Unfortunately, Framework 4.8 cannot run ML.NET, and upgrading is not an option. Are there any tricks or workarounds so that I can get ONNX Runtime to…

user101464 · 35
0 votes · 2 answers
Using a package from packages.config in C++ Visual Studio 2017
I am developing code in C++ using Visual Studio 2017 as my IDE on my Windows 10 workstation. I need the onnxruntime library, so I installed it via the NuGet package manager. The installation went OK, and in my solution I have a folder Resource…

Jonny_92 · 42
0 votes · 0 answers
Why does this conversion of an ONNX model to TensorFlow using onnx-tf fail?
I am working on converting an existing PyTorch model into a TF Lite model to see if it can speed up inference and make it easier to deploy to mobile.
Planned steps:
convert PyTorch to ONNX using torch.onnx.export().
convert the ONNX model to…

user1884325 · 2,530
0 votes · 1 answer
ONNX Runtime: Can same ONNX model be inferenced in both a 32-bit App and a 64-bit App?
An ONNX model was supplied to me. I wrote code to run inference on it via a C++ 64-bit (x64) test program, and it works great. But I also need this same model to run in a C++ 32-bit (x86) program, and I cannot get it to run there. I can load the model, I can…

Tullhead · 565
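A frequent reason the same model loads but fails to run under x86 is address space: a 32-bit Windows process gets roughly 2 GB of usable address space by default, and weights plus activations plus the runtime's arena can exceed that even when loading succeeds. A back-of-envelope estimator (the parameter count below is illustrative, not from the question):

```python
def fp32_bytes(param_count):
    """Memory needed for fp32 weights alone: 4 bytes per parameter.
    Activations and the runtime's own allocations come on top of this."""
    return param_count * 4

# A 500M-parameter model already needs ~2 GB for weights alone,
# which exhausts a default 32-bit Windows process address space.
```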
0 votes · 0 answers
'NoneType' object has no attribute 'shape' when using convert_sparkml function and FloatTensorType from onnxmltools library
I searched a lot for my problem but to no avail.
This is the runtime version I have on my Databricks cluster: 10.4 LTS ML (includes Apache Spark 3.2.1, Scala 2.12)
I have this code below:
with mlflow.start_run():
rf =…

eLMagnifico · 37
0 votes · 0 answers
Onnxruntime linking on Android x86
I'm building a cross-platform shared library with onnxruntime as a static-library dependency. I have built onnxruntime from the official repo. After that I created an empty project and linked onnxruntime to it via CMake:
add_library(mylib SHARED…

Egor · 107
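One common shape for this setup is an IMPORTED static target with one prebuilt archive per Android ABI, selected via `${ANDROID_ABI}`. A hedged CMake sketch (the libs/ layout and file names are hypothetical, not from the question):

```
# Hypothetical layout: one prebuilt libonnxruntime.a per Android ABI
# (arm64-v8a, armeabi-v7a, x86, x86_64) under libs/.
add_library(onnxruntime STATIC IMPORTED)
set_target_properties(onnxruntime PROPERTIES
    IMPORTED_LOCATION ${CMAKE_SOURCE_DIR}/libs/${ANDROID_ABI}/libonnxruntime.a)

add_library(mylib SHARED mylib.cpp)
target_include_directories(mylib PRIVATE ${CMAKE_SOURCE_DIR}/include)
target_link_libraries(mylib PRIVATE onnxruntime)
```

Linker errors that appear only on x86 often mean the archive for that ABI was built for a different ABI or is missing from the layout.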
0 votes · 0 answers
How to get class and bounding box coordinates from YOLOv5 onnx predictions?
I'm new to ONNX. How can I convert YOLOv5 to ONNX and get bounding boxes?
class YOLOv5ONNX:
    def __init__(self, model_file: str, device: str = "cpu"):
        providers = ["CPUExecutionProvider"]
        self.onnx_session =…

Bombex · 1
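The usual YOLOv5 ONNX export emits candidate rows of (cx, cy, w, h, objectness, per-class scores); classes and boxes come from thresholding and converting center boxes to corners. A stdlib sketch assuming that layout (NMS and rescaling to the original image are omitted):

```python
def xywh_to_xyxy(box):
    """(cx, cy, w, h) center box -> (x1, y1, x2, y2) corner box."""
    cx, cy, w, h = box
    return (cx - w / 2, cy - h / 2, cx + w / 2, cy + h / 2)

def decode(rows, conf_thres=0.25):
    """rows: (cx, cy, w, h, objectness, *class_scores) per candidate.
    Returns (bbox, class_id, confidence) for rows above the threshold."""
    detections = []
    for row in rows:
        obj, scores = row[4], row[5:]
        cls = max(range(len(scores)), key=lambda i: scores[i])
        conf = obj * scores[cls]  # objectness times best class score
        if conf >= conf_thres:
            detections.append((xywh_to_xyxy(row[:4]), cls, conf))
    return detections
```

A real pipeline follows this with non-maximum suppression and scales the boxes back to the original image size.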