
Is there a way to take an MXNet model and deploy it directly to an embedded device? By "embedded" I mean the objective is something super lightweight, optionally optimized for ARM/NEON.

krishnakamathk

1 Answer


Yes, here is a tutorial that explains step by step how to run MXNet inference on an ARM device (a Raspberry Pi in this case): https://mxnet.incubator.apache.org/tutorials/embedded/wine_detector.html

Sergei
  • Thanks, but I was really hoping for something embedded, i.e. C++ code examples which are optimized for the ARM/NEON platform. – krishnakamathk Mar 09 '18 at 22:19
  • I don't think there is any specific guidance on how to write high-performance code for the ARM/NEON platform. But as for being lightweight, take a look at this project: https://github.com/apache/incubator-mxnet/tree/master/amalgamation It combines the whole prediction API into a single file, making it more lightweight (see the sketch after these comments). – Sergei Mar 10 '18 at 01:00
  • But the concern is that the resulting shared object file is huge! Last time I checked it was about 100 MB, which is not a good thing for embedded devices. – krishnakamathk Mar 29 '18 at 17:47
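
To make the amalgamation suggestion concrete, below is a minimal C++ sketch of inference through the C predict API (`mxnet/c_predict_api.h`), which is exactly what the amalgamation bundles into one file. The file names `model-symbol.json` / `model-0000.params`, the input name `"data"`, and the 1x3x224x224 shape are assumptions; substitute whatever your exported network actually uses.

```cpp
// Minimal sketch of inference via MXNet's C predict API.
// Model file names, input name, and shape are assumptions.
#include <cstdio>
#include <vector>
#include "mxnet/c_predict_api.h"

// Read a whole file into memory (error handling elided for brevity).
static std::vector<char> read_file(const char* path) {
    std::FILE* f = std::fopen(path, "rb");
    std::fseek(f, 0, SEEK_END);
    long size = std::ftell(f);
    std::fseek(f, 0, SEEK_SET);
    std::vector<char> buf(size);
    std::fread(buf.data(), 1, buf.size(), f);
    std::fclose(f);
    return buf;
}

int main() {
    std::vector<char> symbol = read_file("model-symbol.json");
    symbol.push_back('\0');  // the symbol JSON is read as a C string
    std::vector<char> params = read_file("model-0000.params");

    const char* input_keys[] = { "data" };
    const mx_uint shape_indptr[] = { 0, 4 };
    const mx_uint shape_data[] = { 1, 3, 224, 224 };

    PredictorHandle pred = nullptr;
    // dev_type 1 = CPU, the usual choice on ARM boards.
    MXPredCreate(symbol.data(), params.data(), (int)params.size(),
                 1 /*cpu*/, 0 /*dev_id*/, 1 /*num inputs*/,
                 input_keys, shape_indptr, shape_data, &pred);

    // Feed a dummy image; fill with real preprocessed pixels in practice.
    std::vector<mx_float> image(1 * 3 * 224 * 224, 0.0f);
    MXPredSetInput(pred, "data", image.data(), (mx_uint)image.size());
    MXPredForward(pred);

    // Query the output shape, then copy the output out.
    mx_uint* out_shape = nullptr;
    mx_uint out_dim = 0;
    MXPredGetOutputShape(pred, 0, &out_shape, &out_dim);
    mx_uint out_size = 1;
    for (mx_uint i = 0; i < out_dim; ++i) out_size *= out_shape[i];

    std::vector<mx_float> out(out_size);
    MXPredGetOutput(pred, 0, out.data(), out_size);
    std::printf("first output value: %f\n", out[0]);

    MXPredFree(pred);
    return 0;
}
```

As I understand it, the amalgamation generates a single source file (`mxnet_predict-all.cc`) that you compile together with a program like the one above, so the resulting binary only carries the prediction path rather than the full ~100 MB libmxnet.so the last comment complains about.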