
I learnt that XNNPACK can significantly accelerate computation in general, so I tried it in Xcode/Objective-C. Here's the implementation according to the official docs:

// add XNNPACK
TFLInterpreterOptions *options = [[TFLInterpreterOptions alloc] init];
options.useXNNPACK = YES;

// init interpreter
NSError *error = nil;
NSString *modelPath = [[NSBundle mainBundle] pathForResource:@"mymodel" ofType:@"tflite"];
_modelInterpreter = [[TFLInterpreter alloc] initWithModelPath:modelPath
                                                      options:options
                                                        error:&error];

With CocoaPods, I tried TensorFlowLite 2.3.0, 2.4.0, and the latest x.x.x-nightly version. In all cases, whenever XNNPACK is enabled, the init fails. Internally it fails at this line in TFLInterpreter.mm:

_interpreter = TfLiteInterpreterCreate(model, cOptions);

Am I missing something, or is XNNPACK just not properly implemented in the library yet?
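For now I'm working around it with a fallback: try XNNPACK first, and if the interpreter fails to initialize, retry with default options. This is only a sketch of my workaround, not a fix ("mymodel" stands in for the real model name):

```objectivec
// Attempt XNNPACK-accelerated init, falling back to default options on failure.
NSError *error = nil;
NSString *modelPath = [[NSBundle mainBundle] pathForResource:@"mymodel" ofType:@"tflite"];

TFLInterpreterOptions *options = [[TFLInterpreterOptions alloc] init];
options.useXNNPACK = YES;
TFLInterpreter *interpreter = [[TFLInterpreter alloc] initWithModelPath:modelPath
                                                                options:options
                                                                  error:&error];
if (interpreter == nil) {
    NSLog(@"XNNPACK init failed: %@", error);
    error = nil;
    // Retry without XNNPACK; useXNNPACK defaults to NO.
    TFLInterpreterOptions *fallback = [[TFLInterpreterOptions alloc] init];
    interpreter = [[TFLInterpreter alloc] initWithModelPath:modelPath
                                                    options:fallback
                                                      error:&error];
}
```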

Thanks!

Kitetaka
  • Could you provide more information about the failure? There are other reasons to fail at the TfLiteInterpreterCreate method, for example, an invalid model argument and so on. – Jae sung Chung May 05 '21 at 12:56
  • The logged error is Error Domain=org.tensorflow.lite.interpreter Code=4 "Failed to create the interpreter." UserInfo={NSLocalizedDescription=Failed to create the interpreter.}, and just by setting options.useXNNPACK = NO, everything gets back to normal and the model runs fine. So far I can only assume this flag caused the failure. – Kitetaka May 05 '21 at 13:44
  • 1
    Could you upload this bug report at the TensorFlow github to invite more relevant folks? – Jae sung Chung May 05 '21 at 13:53
  • No problem, will do. – Kitetaka May 06 '21 at 01:11
  • I don't suppose a link to the opened issue for those with the same problem is too much to ask for? – SEMANTICALLY_INVALID Nov 29 '22 at 23:24

0 Answers