I am trying to recreate the work done in this video, CppDay20 Interoperable AI: ONNX & ONNXRuntime in C++ (M. Arena, M. Verasani). The GitHub repository for the demo code is here. So far I have trained a regression model using TensorFlow and have converted it to ONNX for inference in C++. But the created ONNX Runtime session is unable to read the input shape of my model; the input shape comes back with a value of -1.
Ort::Env env;
Ort::Session session{env,model_path, Ort::SessionOptions{} };
Ort::AllocatorWithDefaultOptions allocator;
auto* inputName = session.GetInputName(0, allocator);
std::cout << "Input name: " << inputName << "\n";
auto* outputName = session.GetOutputName(0, allocator);
std::cout << "Output name: " << outputName << "\n";
auto inputShape = session.GetInputTypeInfo(0).GetTensorTypeAndShapeInfo().GetShape();
//model has 5 inputs
std::vector<float> inputValues = {1, 2, 3, 4, 5 };
// where to allocate the tensors
auto memoryInfo = Ort::MemoryInfo::CreateCpu(OrtDeviceAllocator, OrtMemTypeCPU);
// create the input tensor (this is not a deep copy!)
auto inputOnnxTensor = Ort::Value::CreateTensor<float>(memoryInfo,
    inputValues.data(), inputValues.size(),
    inputShape.data(), inputShape.size());
// the API needs the array of inputs you set and the array of outputs you get
std::array<const char*, 1> inputNames = { inputName };
std::array<const char*, 1> outputNames = { outputName };
// finally run the inference!
auto outputValues = session.Run(
    Ort::RunOptions{ nullptr },             // e.g. set a verbosity level only for this run
    inputNames.data(), &inputOnnxTensor, 1, // input to set
    outputNames.data(), 1);
Output:
Number of model inputs: 1
Number of model outputs: 1
Input name: input_1
Output name: Identity
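For reference, this is roughly how I inspect the shape returned by GetShape() (it is a std::vector<int64_t>); one of the entries comes back as -1:

// Diagnostic print of the input shape read above
std::cout << "Input shape: [ ";
for (auto dim : inputShape)
    std::cout << dim << " ";
std::cout << "]\n";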
Creating the input tensor then fails with:
tried creating tensor with negative value in shape
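One workaround I have considered (I am not sure whether this is the intended way to handle a dynamic dimension) is to overwrite the -1 with the batch size I am actually feeding before creating the tensor, i.e. changing the tensor-creation part to something like:

// Sketch of a possible workaround: replace the dynamic dimension (-1) with the
// number of samples actually fed per Run() call (1 here, with 5 features)
for (auto& dim : inputShape)
{
    if (dim < 0)
        dim = 1; // assumption: the -1 is the batch dimension
}
auto inputOnnxTensor = Ort::Value::CreateTensor<float>(memoryInfo,
    inputValues.data(), inputValues.size(),
    inputShape.data(), inputShape.size());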
Any suggestions to make the inference code work?