
Hello, I am trying to create a .pb or .ckpt file from a BERT question-answering model so that I can then convert it into a .tflite file, as the official TensorFlow documentation describes, but I haven't managed to do it yet. Thank you very much.

!pip install transformers
from transformers import pipeline

# Build a question-answering pipeline with the default pretrained model.
nlp = pipeline("question-answering")

context = r"""
Extractive Question Answering is the task of extracting an answer from
a text given a question. An example of a question answering dataset is
the SQuAD dataset, which is entirely based on that task. If you would
like to fine-tune a model on a SQuAD task, you may leverage the
`run_squad.py` script.
"""

print(nlp(question="What is extractive question answering?", context=context))
print(nlp(question="What is a good example of a question answering dataset?", context=context))
  • What does it mean that you "can't get it yet"? What have you done? What is the expected result? What is the error you are getting? – Proko Mar 23 '21 at 20:38
  • First of all, thank you very much for the answer. I am trying to build an Android application: I want to generate a question-and-answer model with BERT and TensorFlow Lite that responds to a custom context. My question is the following: I want to generate a .tflite file and I do not know which of the three options below would be best. I believe there are three conversion routes, starting from two API levels: – Diego Norberto Bermudez Castil Apr 10 '21 at 17:38
  • 1) From the high-level tf.keras API, obtain: A) a Keras model, or B) a SavedModel. 2) From the low-level tf.* API, obtain: A) concrete functions, or B) a SavedModel. From these two API levels, if I'm not mistaken, there are three possible options: – Diego Norberto Bermudez Castil Apr 10 '21 at 17:39
  • 1) The model saved in .pb format is converted directly into .tflite with the TFLiteConverter. 2) The Keras model, saved in a format I cannot identify (I think either .pb or .ckpt), should be frozen to generate another .pb file, then optimized for inference to produce a frozen graph (.pb), and finally converted with TFLiteConverter to obtain the .tflite format. – Diego Norberto Bermudez Castil Apr 10 '21 at 17:39
  • 3) The concrete functions, saved in a format I cannot identify (I think either .pb or .ckpt), should be frozen to generate another .pb file, then optimized for inference to produce a frozen graph (.pb), and finally converted with TFLiteConverter to obtain the .tflite format. I would appreciate guidance on which of the three options is most appropriate, and, if possible, a script for the particular case of question answering with BERT. – Diego Norberto Bermudez Castil Apr 10 '21 at 17:39
  • I attach the following images to summarize it: – Diego Norberto Bermudez Castil Apr 10 '21 at 17:43
  • https://i.stack.imgur.com/kIEG1.png – Diego Norberto Bermudez Castil Apr 10 '21 at 17:43
  • https://github.com/tensorflow/tensorflow/blob/master/tensorflow/lite/g3doc/images/convert/convert.png – Diego Norberto Bermudez Castil Apr 10 '21 at 17:43
  • this is an example of the expected result: https://www.tensorflow.org/lite/examples/bert_qa/images/screenshot.gif – Diego Norberto Bermudez Castil Apr 10 '21 at 18:01
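For reference, the three routes described in the comments correspond, in TF 2.x, to the three `tf.lite.TFLiteConverter` entry points: `from_saved_model`, `from_keras_model`, and `from_concrete_functions`. In TF 2.x no manual freeze/optimize-for-inference step is needed; the converter handles that. Below is a minimal sketch using a tiny stand-in model rather than BERT (converting the real model works the same way, just with a much larger graph; for the BERT QA model specifically, one option is to load it with transformers' `TFAutoModelForQuestionAnswering` and export it as a SavedModel, then use route 1 — an assumption about the best route for this case, not a confirmed recipe):

import os
import tempfile
import tensorflow as tf

# Tiny stand-in model (NOT BERT) just to illustrate the three TF 2.x
# conversion routes; swap in the real QA model in practice.
class TinyModule(tf.Module):
    @tf.function(input_signature=[tf.TensorSpec(shape=[1, 4], dtype=tf.float32)])
    def __call__(self, x):
        return x + 1.0

module = TinyModule()

# Route 1: SavedModel (.pb) -> TFLiteConverter.from_saved_model
saved_model_dir = tempfile.mkdtemp()
tf.saved_model.save(module, saved_model_dir)
tflite_from_saved = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir).convert()

# Route 2: in-memory Keras model -> TFLiteConverter.from_keras_model
keras_model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(2),
])
tflite_from_keras = tf.lite.TFLiteConverter.from_keras_model(keras_model).convert()

# Route 3: concrete function -> TFLiteConverter.from_concrete_functions
concrete_fn = module.__call__.get_concrete_function()
tflite_from_fn = tf.lite.TFLiteConverter.from_concrete_functions([concrete_fn], module).convert()

# Each route yields the flatbuffer bytes; write any of them out as the .tflite file.
out_path = os.path.join(tempfile.mkdtemp(), "model.tflite")
with open(out_path, "wb") as f:
    f.write(tflite_from_saved)

All three routes produce equivalent .tflite bytes for the same graph, so for a model that already loads as a SavedModel (as a transformers TF model exported with `save_pretrained(..., saved_model=True)` would be, if that argument is available in your transformers version), route 1 is the shortest path.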

0 Answers