I am trying to get a probability/confidence score from the model in my Android application.
While creating the ONNX model in Python, I can see the confidence/probability score for a given input. However, when I give the same input on Android, OrtSession.run() returns only the class label.
I tried setting the zipmap option while converting the sklearn model to ONNX, but that did not help either.
I do not know whether the issue is on the Android side or in the way the .onnx model is created, so I am attaching the code for both sides.
ONNX code
# sample classification
from sklearn.datasets import load_iris
from sklearn import tree
# from sklearn.linear_model import LogisticRegression
iris = load_iris()
X, y = iris.data, iris.target
clf = tree.DecisionTreeClassifier()
clf = clf.fit(X, y)
# Convert into ONNX format
from skl2onnx import convert_sklearn
from skl2onnx.common.data_types import FloatTensorType
import numpy as np
initial_type = [('float_input', FloatTensorType([None, 4]))]
options = {id(clf): {'zipmap': True}}
onx_dt = convert_sklearn(clf, initial_types=initial_type, options=options)
with open("dt_iris.onnx", "wb") as f:
    f.write(onx_dt.SerializeToString())
# Compute the prediction with ONNX Runtime
import onnxruntime as rt
sess = rt.InferenceSession(onx_dt.SerializeToString())
res = sess.run(None, {'float_input': np.array([[0.9, 0.7, 0.9, 0.4]], dtype=np.float32)})  # X_test.astype(np.float32)
print(res)
print(res[1][:2])
print("probabilities type:", type(res[1]))
print("type for the first observations:", type(res[1][0]))
The output for the above classification is
[array([1], dtype=int64), [{0: 0.09890109300613403, 1: 0.901098906993866}]]
[{0: 0.09890109300613403, 1: 0.901098906993866}]
probabilities type: <class 'list'>
type for the first observations: <class 'dict'>
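For reference, listing the session outputs in Python shows that the converted classifier has two outputs, the label and the probabilities (with zipmap enabled the probability output is a sequence of maps, which is why it prints as a list of dicts above). A minimal check on the same sess object:
# List the model outputs: one entry for the label, one for the probabilities
for out in sess.get_outputs():
    print(out.name, out.type)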
Android code
The logic to get the classification on Android is:
private fun runPrediction(input: String, ortSession: OrtSession, ortEnvironment: OrtEnvironment): String {
    // Get the name of the input node
    // val value = input.toFloat()
    val value = floatArrayOf(0.6f, 0.6f, 0.6f, 0.6f)
    val inputName = ortSession.inputNames?.iterator()?.next()
    // Make a FloatBuffer of the inputs
    val floatBufferInputs = FloatBuffer.wrap(value)
    // Create input tensor with floatBufferInputs of shape (1, 4)
    val inputTensor = OnnxTensor.createTensor(ortEnvironment, floatBufferInputs, longArrayOf(1, 4))
    // Run the model
    val options = OrtSession.RunOptions().apply {
        logLevel = OrtLoggingLevel.ORT_LOGGING_LEVEL_VERBOSE
    }
    val results = ortSession.run(mapOf(inputName to inputTensor), options)
    val rawOutput = results[0].value
    return (rawOutput as LongArray).toList().toString()
}
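For completeness, below is a rough sketch of how I expected to read the probabilities as the second output on the Android side. It assumes the model is exported with options = {id(clf): {'zipmap': False}}, so that the probability output comes back as a plain float tensor; with zipmap enabled the second output is a sequence of maps (an OnnxSequence in the Java API) instead.
// Sketch, not working code: read both outputs, assuming zipmap was disabled
// during conversion so the probabilities are a [1, n_classes] float tensor.
val labels = results[0].value as LongArray           // predicted class label(s)
val probs = results[1].value as Array<FloatArray>    // probabilities, shape [1, n_classes]
val confidences = probs[0].toList()                  // one probability value per class
Is indexing results[1] the right way to access the probability output, or is that output being dropped somewhere during conversion or loading?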
I have referred to the following blog post for the implementation: https://towardsdatascience.com/deploying-scikit-learn-models-in-android-apps-with-onnx-b3adabe16bab
The ONNX Runtime version used is 1.14.1.