Questions tagged [tflite]
269 questions
0
votes
0 answers
How to use a GPU with TFlite in Python
We are interested in using TFlite with Python and with GPU support. How can we configure TFlite in Python to enable the GPU delegate? If it cannot be done currently, what should we change in TFLite to allow Python to use the GPU delegate?
It is…

Robert Foley
- 31
- 4
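A minimal sketch of what GPU-delegate usage could look like from Python, assuming a GPU delegate shared library has already been built separately (the standard pip wheels do not bundle one, so the library name below is a placeholder):

import tensorflow as tf

# Assumption: libtensorflowlite_gpu_delegate.so was built from the TensorFlow
# sources for this platform; the stock pip package does not ship it.
gpu_delegate = tf.lite.experimental.load_delegate("libtensorflowlite_gpu_delegate.so")

interpreter = tf.lite.Interpreter(
    model_path="model.tflite",                 # placeholder model path
    experimental_delegates=[gpu_delegate])
interpreter.allocate_tensors()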
0
votes
1 answer
Why is the tflite model's output shape different from that of the original model converted from T5ForConditionalGeneration?
T5ForConditionalGeneration Model to translate English to German
from transformers import T5TokenizerFast, T5ForConditionalGeneration
tokenizer = T5TokenizerFast.from_pretrained("t5-small")
model =…

kolibyte
- 1
- 2
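A small sketch of how the converted model's output signature could be inspected and compared with the original transformers model; "t5_small.tflite" is a hypothetical path for the converted file:

import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="t5_small.tflite")
interpreter.allocate_tensors()

# "shape" is the static shape (dynamic axes show up as 1), while
# "shape_signature" keeps -1 for dimensions that are actually dynamic.
for detail in interpreter.get_output_details():
    print(detail["name"], detail["shape"], detail["shape_signature"], detail["dtype"])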
0
votes
0 answers
How to create a method to display results from an imported tflite model?
I am implementing an image classification app and I imported a tflite model. While trying to call my tflite model, I am having trouble connecting the model's output to the appropriate views.
Here is my tflite model:
val model =…

Yet I fish
- 3
- 2
0
votes
0 answers
Cannot find the right tf_runtime wheel for Raspberry Pi 3B+, Python 3.10.2
Whenever I run the code:
tf.lite.Interpreter(model_path=modelName)
The error is:
AttributeError: module 'tensorflow' has no attribute 'lite'
I followed this guide beforehand.
I tried adding…

Ossas_
- 1
- 1
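One possible explanation is that only the tflite-runtime package (or an incomplete tensorflow install) is present, in which case the Interpreter is imported directly rather than through tf.lite; a sketch under that assumption:

# Assumes the tflite-runtime wheel is installed; it does not provide a
# top-level "tensorflow" module, so tf.lite is unavailable.
from tflite_runtime.interpreter import Interpreter

interpreter = Interpreter(model_path="model.tflite")   # placeholder path
interpreter.allocate_tensors()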
0
votes
0 answers
Need to understand the ML model deployment through MicroMutableOpResolver
I am new to TensorFlow Lite, and I have noticed that many examples use static tflite::MicroMutableOpResolver<> micro_op_resolver;
So, my question is: how many layers can we add here when deploying the model?
Or can the layers be exactly the…

Divya Tripathi
- 13
- 4
0
votes
0 answers
Can't Import TFLite Model To Android Studio, How To Solve It?
I have a problem where I can't import a new TFLite model into Android Studio. The TensorFlow Lite model import feature seems to be disabled. The message says "Requires Android Gradle plugin 4.1.0-alpha04 or newer". How can I import the TFLite model into…

Annisa Lianda
- 83
- 5
0
votes
1 answer
What happened to the TensorFlow Lite documentation for the Interpreter class?
I started working on a project using TensorFlow Lite in C++. I have often looked up information about the API in the official reference. Almost all of the methods I used were listed in the Interpreter class. However, a few days ago I noticed that…

matko031
- 53
- 7
0
votes
0 answers
Is there a way to use a tflite-runtime version higher than 2.5.0 on an armv7 development board
System information
Linux Ubuntu 18.04
armv7 board
python3.6
Question description
I trained a model on my Linux server (Ubuntu 18.04, Intel(R) Xeon(R) W-2145 CPU) and exported it as a tflite model, and the model on the server (tflite-runtime=2.10.0) works…

Xuefei Lv
- 41
- 4
0
votes
0 answers
Predict with a TensorFlow Lite model in C++
I've created a TF model and converted it to be used on an Arduino Nano 33 BLE. I can load the model using the following line:
model = tflite::GetModel(model_tflite);
but I cannot find a way to use the model to make predictions as I can do with the…
0
votes
0 answers
Is there a way to use a Self Organizing Map in an M5Stack?
I'm trying to run a SOM on an M5Stack using minisom (a Python library for SOMs), and I have tried two approaches:
1 - Using tflite;
2 - Using UIFlow (the official IDE for the M5Stack) and MicroPython.
With both approaches I'm at a dead end.
1 - For the first approach I tried to…
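A rough sketch of the tflite route for the question above, assuming a trained MiniSom whose codebook (som.get_weights()) is exported as a tiny model that returns the best-matching unit; the 8x8 map with 4-dimensional inputs is an assumption:

import numpy as np
import tensorflow as tf

# Stand-in for minisom's som.get_weights() reshaped to (units, input_dim).
codebook = tf.constant(np.random.rand(64, 4).astype(np.float32))

@tf.function(input_signature=[tf.TensorSpec([1, 4], tf.float32)])
def best_matching_unit(x):
    # Squared Euclidean distance to every codebook vector, then argmin.
    distances = tf.reduce_sum(tf.square(codebook - x), axis=1)
    return tf.argmin(distances)

converter = tf.lite.TFLiteConverter.from_concrete_functions(
    [best_matching_unit.get_concrete_function()])
open("som_bmu.tflite", "wb").write(converter.convert())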
0
votes
0 answers
Are Dense/Dropout/Flatten layers available in TensorFlow Lite Micro?
I know the available ops are listed in all_ops_resolver.cc, but there are none for Dropout, Flatten or Dense.
The magic_wand example trains a model using these layers.
def build_cnn(seq_length):
"""Builds a convolutional neural network in Keras."""
model…

codeshredder726b
- 53
- 10
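For reference, Dense, Flatten and Dropout are Keras layer names rather than TFLite op names: Dense is lowered to FULLY_CONNECTED, Flatten to RESHAPE, and Dropout is an inference-time no-op that disappears from the converted graph. A sketch that makes this visible, assuming a recent TensorFlow release that includes tf.lite.experimental.Analyzer:

import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.Input(shape=(16, 16, 1)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Dense(4),
])

tflite_model = tf.lite.TFLiteConverter.from_keras_model(model).convert()
# Prints the ops actually present in the flatbuffer (FULLY_CONNECTED, RESHAPE, ...).
tf.lite.experimental.Analyzer.analyze(model_content=tflite_model)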
0
votes
0 answers
Deploying tensorflow RNN models (other than keras LSTM) to a microcontroller without unrolling the network?
Goal
I want to compare different types of RNN tflite-micro models, built using tensorflow, on a microcontroller based on their accuracy, model size and inference time. I have also created my own custom RNN cell that I want to compare with the LSTM…
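A sketch of the trade-off in question, assuming a custom cell wrapped in tf.keras.layers.RNN (SimpleRNNCell stands in for it here): with unroll=False the converter typically emits a WHILE control-flow op for the time loop, while unroll=True expands the steps into a static graph that needs no control-flow support on the microcontroller:

import tensorflow as tf

cell = tf.keras.layers.SimpleRNNCell(16)                     # stand-in for a custom RNN cell
inputs = tf.keras.Input(shape=(20, 8))
outputs = tf.keras.layers.RNN(cell, unroll=False)(inputs)    # flip to True to unroll
model = tf.keras.Model(inputs, outputs)

tflite_model = tf.lite.TFLiteConverter.from_keras_model(model).convert()
# Inspect the resulting ops; a WHILE op indicates the loop was not unrolled.
tf.lite.experimental.Analyzer.analyze(model_content=tflite_model)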
0
votes
1 answer
How to calculate a matrix inverse using tflite
As far as I know, matrix inversion is a common operator,
but tf.raw_ops.MatrixInverse is not supported in tflite and BatchMatrixInverse is not available in GraphDef version 1205.
How can I calculate the inverse of a matrix in tflite?
Best…

jiaocha
- 21
- 5
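One commonly suggested workaround is to keep tf.linalg.inv in the graph by allowing Select TF (Flex) ops at conversion time; the runtime must then link the Flex delegate, which increases binary size. A sketch:

import tensorflow as tf

@tf.function(input_signature=[tf.TensorSpec([3, 3], tf.float32)])
def invert(m):
    return tf.linalg.inv(m)   # no builtin TFLite kernel; kept as a Flex op

converter = tf.lite.TFLiteConverter.from_concrete_functions(
    [invert.get_concrete_function()])
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,   # ordinary builtin ops
    tf.lite.OpsSet.SELECT_TF_OPS,     # fall back to TF ops such as MatrixInverse
]
tflite_model = converter.convert()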
0
votes
1 answer
Which model should I use for object recognition on mobile devices?
I am working on a project where I need the app to recognize a special character, just one small graphic, from a photographed document. Something similar to the example from the picture. More specifically, the app would use this character to…

Slobodan M.
- 31
- 1
- 3
0
votes
0 answers
Flutter Tensorflow Lite image prediction not working as expected
I have created a tensorflow model that has twelve classes for image classification. My Jupyter notebook works as expected and is able to classify most of the images correctly. After that I converted my tensorflow model to a tflite model and tried…

Dominik Hartl
- 101
- 11
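A sketch of a useful isolation step, assuming the converted file is available as "model.tflite" and the model takes a 224x224 float image (both assumptions): running the .tflite file from Python with exactly the notebook's preprocessing shows whether the conversion or the Flutter-side preprocessing is at fault:

import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

image = np.random.rand(1, 224, 224, 3).astype(np.float32)   # replace with a real, preprocessed image
interpreter.set_tensor(inp["index"], image)
interpreter.invoke()
print(interpreter.get_tensor(out["index"]))                  # scores for the twelve classes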