
I am new to onnxruntime and am using my friend's old code to evaluate some data with Torch and CatBoost binary classification models in C++. The code worked nicely with onnxruntime v1.6.0, but after I updated to v1.14.0 the CatBoost model no longer works.

I save the CatBoost model as the documentation describes (same as before):

model.save_model(
    f'{filename}',
    format="onnx",
    export_parameters={
        'onnx_domain': 'ai.catboost',
    }
)

When initialising onnxruntime in C++, it throws an exception with what(): Input is not of type sequence or map. We prepare the output types in this way:

Ort::TypeInfo typeInfo = _session->GetOutputTypeInfo(i);
if (typeInfo.GetONNXType() == ONNX_TYPE_SEQUENCE) { // <- for a catboost model
    const OrtSequenceTypeInfo *sequence{};
    Ort::ThrowOnError(Ort::GetApi().CastTypeInfoToSequenceTypeInfo(typeInfo, &sequence));
    OrtTypeInfo *sequenceTypeInfo{};
    Ort::ThrowOnError(Ort::GetApi().GetSequenceElementType(sequence, &sequenceTypeInfo));
    Ort::TypeInfo sequenceElementInfo = Ort::TypeInfo{sequenceTypeInfo};
    if (sequenceElementInfo.GetONNXType() == ONNX_TYPE_TENSOR) {
        _outputNodeTypes.push_back(ONNX_TYPE_SEQUENCE);
    } else {
        _outputNodeTypes.push_back(ONNX_TYPE_MAP);
    }
} else {
    _outputNodeTypes.push_back(ONNX_TYPE_TENSOR); // <- for a torch model, which is just a tensor and works fine with both ort versions
}

The code actually breaks when accessing output values:

if (_outputNodeTypes.front() == ONNX_TYPE_MAP) {
    Ort::Value &value = outputTensors.front();
    Ort::Value map = value.GetValue(0, allocator);     // <-- this line throws Ort::Exception (first map in the sequence)
    Ort::Value mapValue = map.GetValue(1, allocator);  // index 1 = the map's values tensor (index 0 holds the keys)

    auto probabilities = gsl::span<float>(mapValue.GetTensorMutableData<float>(), 2);
    return probabilities[1];                           // probability of the positive class
}

If I print out the type of value just before the crash, onnxruntime v1.6.0 gives me ONNX_TYPE_SEQUENCE, while onnxruntime v1.14.0 gives me ONNX_TYPE_TENSOR and the exception is raised.
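
For reference, this is roughly how I print those types right after Run (a sketch assuming outputTensors is the std::vector<Ort::Value> returned by Run and <iostream> is included):

for (size_t i = 0; i < outputTensors.size(); ++i) {
    // GetTypeInfo() reports the actual type of each value that Run() returned,
    // which is what differs between the two onnxruntime versions.
    std::cout << "Run output " << i << " has ONNXType "
              << outputTensors[i].GetTypeInfo().GetONNXType() << '\n';
}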

Does anybody have an idea which onnxruntime change causes this issue and how to overcome it? To summarize: the same CatBoost model works fine with onnxruntime v1.6.0 but does not work with onnxruntime v1.14.0.

The CatBoost model is available here.


1 Answer


I think I got the answer: the output of the onnxruntime Run function changed. According to this issue, the first element returned by Run is now indeed of type ONNX_TYPE_TENSOR, so when I changed the value assignment from

Ort::Value &value = outputTensors.front();

to

Ort::Value &value = outputTensors[1];

it worked as it should.
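
A slightly more defensive variant of the same fix is to pick the output by its runtime type instead of hard-coding the index, so the code keeps working even if the output order changes again. This is only a sketch built on the snippets above: outputTensors and the two-class layout come from the question, while positiveClassProbability is a hypothetical helper name.

#include <onnxruntime_cxx_api.h>
#include <stdexcept>
#include <vector>

// Walk over the values returned by Run() and read the positive-class
// probability out of the first sequence-of-maps output found.
float positiveClassProbability(std::vector<Ort::Value> &outputTensors,
                               OrtAllocator *allocator) {
    for (Ort::Value &value : outputTensors) {
        if (value.GetTypeInfo().GetONNXType() != ONNX_TYPE_SEQUENCE)
            continue;                                       // skip the label tensor
        Ort::Value map = value.GetValue(0, allocator);      // first map in the sequence
        Ort::Value mapValues = map.GetValue(1, allocator);  // index 1 = the map's values tensor
        return mapValues.GetTensorMutableData<float>()[1];  // probability of class 1
    }
    throw std::runtime_error("no sequence-of-maps output found");
}

With this model the outputs are a label tensor plus a sequence of maps from class to probability, so selecting the sequence by type ends up reading the same value as the outputTensors[1] fix above.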
