We have our ONNX model and we're trying to use it in our app. Running the project from the IDE works just fine, but running the built JAR produces the error below. While debugging we found that supportedEngines at runtime contains both OnnxRuntime and PyTorch when started from the IDE, but only OnnxRuntime when started from the JAR. The failure happens when we reach StackBatchifier.batchify(), on the line String inputName = ((NDArray)inputs[0].get(i)).getName();
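Roughly, the check we ran in both environments looks like this (a simplified sketch, not our exact debugging code; it just prints what DJL has registered):

import ai.djl.engine.Engine

fun main() {
    // Engines are discovered via ServiceLoader; this lists whatever is registered
    // in the current runtime. In our case the IDE run shows both PyTorch and
    // OnnxRuntime, while the JAR run shows OnnxRuntime only.
    println("Registered engines: " + Engine.getAllEngines())
    println("PyTorch available:  " + Engine.hasEngine("PyTorch"))
    println("Default engine:     " + Engine.getDefaultEngineName())
}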
We have the following dependencies in our Gradle build file:

api("org.apache.logging.log4j:log4j-slf4j-impl:2.18.0")
api("ai.djl:model-zoo:0.21.0-SNAPSHOT")
api("ai.djl.huggingface:tokenizers:0.21.0-SNAPSHOT")
api("ai.djl.pytorch:pytorch-model-zoo:0.21.0-SNAPSHOT")
api("ai.djl.onnxruntime:onnxruntime-engine:0.19.0")
api("org.jetbrains.kotlin:kotlin-stdlib:1.7.20")
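One thing we noticed while writing this up: onnxruntime-engine is on 0.19.0 while the other DJL artifacts are on 0.21.0-SNAPSHOT. If aligning them matters here, this is a sketch of what we would try (the DJL BOM coordinates below are our assumption, not something we have verified in this project):

dependencies {
    // Assumed approach: let the DJL BOM pin all DJL artifacts to one version.
    api(platform("ai.djl:bom:0.21.0-SNAPSHOT"))
    api("ai.djl:model-zoo")
    api("ai.djl.huggingface:tokenizers")
    api("ai.djl.pytorch:pytorch-model-zoo")
    api("ai.djl.onnxruntime:onnxruntime-engine")
    api("org.apache.logging.log4j:log4j-slf4j-impl:2.18.0")
    api("org.jetbrains.kotlin:kotlin-stdlib:1.7.20")
}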
Do we need any additional JAR or packaging configuration for this to work?
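To show the kind of configuration we mean: our understanding is that DJL discovers engines through META-INF/services/ai.djl.engine.EngineProvider files, which a naive fat JAR can lose when several jars provide the same service file. The sketch below uses the Gradle Shadow plugin with merged service files; the plugin and version are just an example we are considering, not what we have now:

plugins {
    id("com.github.johnrengelman.shadow") version "7.1.2"
}

tasks.shadowJar {
    // Merge META-INF/services entries so both the OnnxRuntime and PyTorch
    // EngineProvider registrations survive, instead of one jar's file
    // overwriting the other's.
    mergeServiceFiles()
}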
Error Message

Exception in thread "main" ai.djl.translate.TranslateException: java.lang.UnsupportedOperationException: This NDArray implementation does not currently support this operation
	at ai.djl.inference.Predictor.batchPredict(Predictor.java:189)
	at ai.djl.inference.Predictor.predict(Predictor.java:126)
	at ProfanityPredictionModel.predict(ProfanityPredictionModel.kt:30)
	at TestModel.main(TestModel.kt:18)
Caused by: java.lang.UnsupportedOperationException: This NDArray implementation does not currently support this operation
	at ai.djl.ndarray.NDArrayAdapter.getAlternativeArray(NDArrayAdapter.java:1225)
	at ai.djl.ndarray.NDArrayAdapter.getNDArrayInternal(NDArrayAdapter.java:1173)
	at ai.djl.ndarray.NDArrays.stack(NDArrays.java:1825)
	at ai.djl.ndarray.NDArrays.stack(NDArrays.java:1785)
	at ai.djl.translate.StackBatchifier.batchify(StackBatchifier.java:52)
	at ai.djl.inference.Predictor.processInputs(Predictor.java:217)
	at ai.djl.inference.Predictor.batchPredict(Predictor.java:177)
	... 3 more
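Our reading of the trace: OrtNDArray only implements the operations OnnxRuntime needs for inference, and NDArrayAdapter hands anything else (such as stack) to an alternative full engine, which is why the missing PyTorch engine surfaces here. Below is a minimal sketch that we would expect to hit the same exception when only OnnxRuntime is registered (a hypothetical repro, not code from our app):

import ai.djl.engine.Engine
import ai.djl.ndarray.NDArrays
import ai.djl.ndarray.NDList

fun main() {
    val manager = Engine.getEngine("OnnxRuntime").newBaseManager()
    try {
        val a = manager.create(floatArrayOf(1f, 2f, 3f))
        val b = manager.create(floatArrayOf(4f, 5f, 6f))
        // OrtNDArray does not implement stack itself; NDArrayAdapter looks for an
        // alternative engine (e.g. PyTorch) to run it. With only OnnxRuntime on the
        // classpath this should throw the UnsupportedOperationException shown above.
        val stacked = NDArrays.stack(NDList(a, b))
        println(stacked.shape)
    } finally {
        manager.close()
    }
}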