
This sounds like something the documentation should cover; however, it doesn't, and none of the many tutorials I've seen seems to have figured this out.

I'm trying to run ONNX Runtime Web with a BERT model exported from Hugging Face. I have all the steps working and get predictions; however, I'm trying to find a built-in way to apply softmax to my predictions to get probabilities.

From the ONNX Runtime Web documentation I can see that the Softmax operator is supported.

But from the API I have no clue how to invoke it.

Does anyone know how to do this?

Thanks!

1 Answer


ONNX Runtime lets you run a model rather than individual operators, i.e. the Softmax would have to be part of the model itself.

Softmax is relatively easy to implement yourself, though, so you could do that if you don't have a way to add it to the model.

e.g. https://github.com/microsoft/onnxruntime-inference-examples/blob/743a310db7a90db1e8ffddc2768cb48258e2674b/mobile/examples/Maui/MauiVisionSample/MauiVisionSample/Models/Mobilenet/MobilenetSample.cs#L66-L69
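Since the question is about ONNX Runtime Web, here is a minimal sketch of the same idea in TypeScript: a numerically stable softmax applied to the logits tensor returned by `session.run()`. The model path `"model.onnx"` and the output name `"logits"` are assumptions about your exported BERT model; check your model's actual input/output names.

```typescript
import * as ort from "onnxruntime-web";

// Numerically stable softmax: subtract the max before exponentiating
// so large logits don't overflow to Infinity.
function softmax(logits: Float32Array): Float32Array {
  const max = logits.reduce((a, b) => Math.max(a, b), -Infinity);
  const exps = logits.map((v) => Math.exp(v - max));
  const sum = exps.reduce((a, b) => a + b, 0);
  return exps.map((v) => v / sum);
}

async function classify(feeds: Record<string, ort.Tensor>): Promise<Float32Array> {
  // "model.onnx" and "logits" are placeholders for your model's
  // actual file and output name.
  const session = await ort.InferenceSession.create("model.onnx");
  const results = await session.run(feeds);
  const logits = results["logits"].data as Float32Array;
  return softmax(logits); // probabilities summing to 1
}
```

Note the max-subtraction: `exp(v - max)` produces the same probabilities as `exp(v)` but keeps the intermediate values small, which matters with float32 logits.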

Scott McKay