Questions tagged [tensorflow-serving]

Detailed developer documentation on TensorFlow Serving is available.

1260 questions
78 votes · 3 answers

Unable to use sudo commands within Docker, "bash: sudo: command not found" is displayed

I have installed TensorFlow using the following command, docker run -it b.gcr.io/tensorflow/tensorflow:latest-devel, and I need to set up TensorFlow Serving on a Windows machine. I followed the instructions, and while running the below-mentioned sudo…
Vasanti · 1,207
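A common resolution (assuming a Debian-based TensorFlow image, which runs as root by default) is that `sudo` is simply not needed inside the container; if a setup script hard-codes it, it can be installed. A sketch:

```shell
# Inside the container you are already root, so `sudo <cmd>` can become `<cmd>`.
# If a script insists on sudo, install it first (Debian-based images):
apt-get update && apt-get install -y sudo

# Or re-enter an already-running container explicitly as root:
docker exec -u 0 -it <container-id> bash
```

The `<container-id>` placeholder comes from `docker ps`; none of these commands change anything about TensorFlow itself.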
46 votes · 5 answers

Xcode version must be specified to use an Apple CROSSTOOL

I am trying to build tensorflow-serving using Bazel, but I've encountered some errors during the build: ERROR: /private/var/tmp/_bazel_Kakadu/3f0c35881c95d2c43f04614911c03a57/external/local_config_cc/BUILD:49:5: in apple_cc_toolchain rule…
Ivan Shelonik · 1,958
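A commonly reported fix for this error (assuming a full Xcode install is present but Bazel was configured against the command-line tools) is to point the system at the Xcode toolchain, accept the license, and rebuild from a clean state:

```shell
# Select the full Xcode toolchain rather than the command-line tools only
sudo xcode-select -s /Applications/Xcode.app/Contents/Developer
sudo xcodebuild -license accept

# Bazel caches the old toolchain configuration, so clear it before rebuilding
bazel clean --expunge
bazel build tensorflow_serving/...
```

The clean step matters because the `local_config_cc` repository mentioned in the error is generated once and reused until expunged.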
45 votes · 1 answer

Where should pre-processing and post-processing steps be executed when a TF model is served using TensorFlow serving?

Typically, to use a TF graph, it is necessary to convert raw data to numerical values. I refer to this process as a pre-processing step. For example, if the raw data is a sentence, one way to do this is to tokenize the sentence and map each word…
MajidL · 731
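One option the question weighs is doing such steps client-side before calling the served model; a minimal sketch of that tokenize-and-map step (the vocabulary and all names are made up for illustration):

```python
# Hypothetical client-side pre-processing: tokenize a sentence and map each
# word to an integer id before sending the ids to the served TF model.
VOCAB = {"<unk>": 0, "the": 1, "cat": 2, "sat": 3}

def preprocess(sentence):
    """Lowercase, split on whitespace, and map words to ids (<unk> for OOV)."""
    return [VOCAB.get(token, VOCAB["<unk>"]) for token in sentence.lower().split()]

print(preprocess("The cat sat"))  # [1, 2, 3]
```

The alternative is baking these ops into the exported graph so the server performs them; that keeps clients thin but ties the exported model to a fixed vocabulary.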
45 votes · 5 answers

Tensorflow serving: No versions of servable found under base path

I was following this tutorial to serve my object detection model with TensorFlow Serving. I am using TensorFlow Object Detection for generating the model. I have created a frozen model using this exporter (the generated frozen model works using…
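This error usually means `--model_base_path` points directly at the SavedModel directory instead of at a parent directory of numeric version subdirectories. A sketch of the expected layout (all paths are placeholders):

```shell
# TensorFlow Serving expects <base_path>/<version>/saved_model.pb,
# where <version> is a numeric directory such as 1/.
mkdir -p /models/my_model/1
cp -r export/saved_model.pb export/variables /models/my_model/1/

tensorflow_model_server --rest_api_port=8501 \
  --model_name=my_model \
  --model_base_path=/models/my_model   # note: NOT .../my_model/1
```

The server scans the base path for numeric directories and loads the highest version it finds.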
26 votes · 1 answer

How can I use tensorflow serving for multiple models

How can I serve multiple TensorFlow models? I use a Docker container. model_config_list: { config: { name: "model1", base_path: "/tmp/model", model_platform: "tensorflow" }, config: { name: "model2", base_path:…
onuryartasi · 587
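A complete model config file of the shape the excerpt truncates might look like this (model names and paths are placeholders), passed to the server with `--model_config_file`:

```
model_config_list: {
  config: {
    name: "model1",
    base_path: "/models/model1",
    model_platform: "tensorflow"
  },
  config: {
    name: "model2",
    base_path: "/models/model2",
    model_platform: "tensorflow"
  }
}
```

With Docker, the models and this file are typically mounted into the container and the server started with `--model_config_file=/models/models.config`; each `base_path` must itself contain numeric version subdirectories.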
24 votes · 4 answers

How to import a saved TensorFlow model trained using tf.estimator and predict on input data

I have saved the model using the tf.estimator method export_savedmodel as follows: export_dir="exportModel/" feature_spec = tf.feature_column.make_parse_example_spec(feature_columns) input_receiver_fn =…
nayan · 243
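One way to inspect and smoke-test such an export without writing any client code is the `saved_model_cli` tool that ships with TensorFlow (the timestamped directory name below is a placeholder):

```shell
# List the signatures (inputs/outputs) baked into the export
saved_model_cli show --dir exportModel/1534567890 --all

# Run the default serving signature on a hand-built tf.Example
saved_model_cli run --dir exportModel/1534567890 \
  --tag_set serve --signature_def serving_default \
  --input_examples 'examples=[{"x": [1.0]}]'
```

The `show` output reveals the exact signature and tensor names a prediction client must use.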
24 votes · 2 answers

Example for Deploying a Tensorflow Model via a RESTful API

Is there any example code for deploying a TensorFlow model via a RESTful API? I see examples for a command-line program and for a mobile app. Is there a framework for this, or do people just load the model and expose the predict method via a web…
Nikhil · 2,230
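A minimal standard-library sketch of the "load the model once and expose predict" approach; the model here is a stand-in lambda, and a real app would load a SavedModel instead:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def load_model():
    # Stand-in for real model loading, e.g. tf.saved_model.load("export_dir")
    return lambda xs: [2.0 * x for x in xs]

MODEL = load_model()  # load once at startup, not per request

class PredictHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        body = self.rfile.read(int(self.headers["Content-Length"]))
        instances = json.loads(body)["instances"]
        payload = json.dumps({"predictions": MODEL(instances)}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(payload)

def serve(port=8501):
    HTTPServer(("", port), PredictHandler).serve_forever()
```

In practice a framework such as Flask behind a production WSGI server is the usual choice; later TF Serving releases also expose a REST API of their own.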
22 votes · 5 answers

AttributeError: module 'tensorflow' has no attribute 'gfile'

I trained a simple mnist model with tensorflow 2.0 on Google Colab and saved it in the .json format. Click here to check out the Colab Notebook where I've written the code. Then on running the command !simple_tensorflow_serving --model_base_path="/"…
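The usual cause is TF 1.x-era code (here, the serving tool) running under TF 2.x, where tf.gfile moved to tf.io.gfile. A commonly used shim, guarded below so it is a no-op when TensorFlow is not installed:

```python
# Alias the old name back before the legacy code touches tf.gfile.
try:
    import tensorflow as tf
    if not hasattr(tf, "gfile"):
        tf.gfile = tf.io.gfile  # TF 2.x: restore the TF 1.x alias
except ImportError:
    tf = None  # TensorFlow not installed; nothing to patch
```

Pinning a 1.x release (`pip install 'tensorflow<2'`) is the other common route when the tool cannot be patched.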
22 votes · 2 answers

How to install TensorFlow-gpu with cuda8.0?

I tried to install it according to the instructions on the official website, which results in an ImportError when I import tensorflow: ImportError: libcublas.so.9.0: cannot open shared object file: No such file or directory. I run the code cat…
Ink · 845
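The missing libcublas.so.9.0 indicates the installed wheel was built against CUDA 9; TensorFlow 1.5 switched to CUDA 9, so on a CUDA 8 machine the usual fix is pinning the last CUDA 8 build line:

```shell
# tensorflow-gpu 1.4.x is the last release line built against CUDA 8 / cuDNN 6
pip install 'tensorflow-gpu==1.4.1'

# Verify which CUDA toolkit is actually installed
cat /usr/local/cuda/version.txt
```

Alternatively, upgrading the machine to CUDA 9 allows the latest `tensorflow-gpu` wheel to load.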
22 votes · 3 answers

TensorFlow REST Frontend but not TensorFlow Serving

I want to deploy a simple TensorFlow model and run it in a REST service like Flask. So far I have not found a good example on GitHub or here. I am not ready to use TF Serving as suggested in other posts; it is a perfect solution for Google, but it is overkill for…
Sergei Chicherin · 2,031
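On the client side of such a Flask frontend, a stdlib-only helper can build and send the JSON request; the URL and payload shape below are assumptions about a hypothetical `/predict` endpoint:

```python
import json
import urllib.request

def build_request(instances, url="http://localhost:5000/predict"):
    """Build a JSON POST request carrying the model inputs."""
    return urllib.request.Request(
        url,
        data=json.dumps({"instances": instances}).encode(),
        headers={"Content-Type": "application/json"},
    )

def predict(instances, url="http://localhost:5000/predict"):
    """Send the request and unpack the server's predictions."""
    with urllib.request.urlopen(build_request(instances, url)) as resp:
        return json.loads(resp.read())["predictions"]
```

Keeping request construction separate from I/O makes the payload format easy to test without a running server.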
22 votes · 2 answers

No variable to save error in Tensorflow

I am trying to save the model and then reuse it for classifying my images, but unfortunately I am getting errors when restoring the model that I have saved. The code in which the model was created: # Deep Learning # ============= # # Assignment 4 #…
kkk · 1,850
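tf.train.Saver() raises this error when it is constructed in a graph that contains no variables, e.g. a fresh default graph at restore time. The sketch below (guarded to be a no-op without TensorFlow installed) shows the safe ordering:

```python
# Create the Saver AFTER the variables, inside the SAME graph you restore into.
try:
    import tensorflow.compat.v1 as tf

    graph = tf.Graph()
    with graph.as_default():
        weights = tf.get_variable("weights", shape=[2, 2])
        saver = tf.train.Saver()  # sees `weights` via the graph's collections
    created = True
except ImportError:
    created = True  # TensorFlow not installed; example skipped
```

At restore time the same graph-construction code must run first, so the Saver again has variables to map checkpoint values onto.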
20 votes · 2 answers

How to keep lookup tables initialized for prediction (and not just training)?

I create a lookup table from tf.contrib.lookup, using the training data (as input). Then, I pass every input through that lookup table, before passing it through my model. This works for training, but when it comes to online prediction from this…
aka · 203
19 votes · 1 answer

Graph optimizations on a TensorFlow servable created using tf.Estimator

Context: I have a simple classifier based on tf.estimator.DNNClassifier that takes text and outputs probabilities over intent tags. I am able to train and export the model to a servable, as well as serve the servable using TensorFlow Serving. …
o-90 · 17,045
19 votes · 1 answer

Tensorflow serving: "No assets to save/writes" when exporting models

Recently I have been trying to deploy deep learning services using TensorFlow Serving, but I got the following info messages when exporting my model: INFO:tensorflow: No assets to save INFO:tensorflow: No assets to write INFO:tensorflow: SavedModel written…
Moyan · 327
19 votes · 2 answers

Add Tensorflow pre-processing to existing Keras model (for use in Tensorflow Serving)

I would like to include my custom pre-processing logic in my exported Keras model for use in Tensorflow Serving. My pre-processing performs string tokenization and uses an external dictionary to convert each token to an index for input to the…
Qululu · 1,040
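The shape of the approach, with the TF specifics stubbed out: compose the external-dictionary lookup and the trained model into one callable, and export that composition. In real code the composition would be a new Keras model or a tf.function wrapping the original one; all names below are made up:

```python
VOCAB = {"good": 1, "bad": 2}  # stand-in for the external dictionary

def preprocess(sentence):
    """Tokenize and map tokens to indices (0 for out-of-vocabulary)."""
    return [VOCAB.get(word, 0) for word in sentence.lower().split()]

def trained_model(indices):
    """Stand-in for the already-trained Keras model."""
    return sum(indices) / max(len(indices), 1)

def serving_fn(sentence):
    """The combined entry point that would be exported for Serving."""
    return trained_model(preprocess(sentence))

print(serving_fn("Good bad"))  # 1.5
```

Exporting the combined function means clients send raw strings and the dictionary ships inside the servable, so the two can never drift apart.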