I want to be able to use a DNNClassifier (estimator) on top of IIS using TensorFlowSharp. The model was previously trained in Python. I have gotten as far as generating the .pb files and identifying the correct inputs/outputs, but I am stuck on feeding string inputs through TensorFlowSharp.

I can create a valid .pb file of the iris dataset. It uses the following feature_spec:

{'SepalLength': FixedLenFeature(shape=(1,), dtype=tf.float32, default_value=None), 'SepalWidth': FixedLenFeature(shape=(1,), dtype=tf.float32, default_value=None), 'PetalLength': FixedLenFeature(shape=(1,), dtype=tf.float32, default_value=None), 'PetalWidth': FixedLenFeature(shape=(1,), dtype=tf.float32, default_value=None)}

I have created a simple C# console application to try to spin it up. The input should be an "input_example_tensor" and the output is located at "dnn/head/predictions/probabilities". I discovered this after alex_zu provided help using the saved_model_cli command here.

As far as I am aware, all TensorFlow estimator APIs work like this.

Here comes the problem: the input_example_tensor should be a string, which is parsed internally by the ParseExample function. This is where I am stuck. I have found TFTensor.CreateString, but that doesn't solve the problem.

using System;
using TensorFlow;

namespace repository
{
    class Program
    {
        static void Main(string[] args)
        {
            using (TFGraph tfGraph = new TFGraph()){
                using (var tmpSess = new TFSession(tfGraph)){
                    using (var tfSessionOptions = new TFSessionOptions()){
                        using (var metaGraphUnused = new TFBuffer()){

                            //generating a new session based on the pb folder location with the tag serve
                            TFSession tfSession = tmpSess.FromSavedModel(
                                tfSessionOptions,
                                null,
                                @"path/to/model/pb", 
                                new[] { "serve" }, 
                                tfGraph, 
                                metaGraphUnused
                            );

                            //generating a new runner, which will fetch the tensorflow results later
                            var runner = tfSession.GetRunner();

                            //this is in the actual tensorflow documentation, how to implement this???
                            string fromTensorflowPythonExample = "{'SepalLength': [5.1, 5.9, 6.9],'SepalWidth': [3.3, 3.0, 3.1],'PetalLength': [1.7, 4.2, 5.4],'PetalWidth': [0.5, 1.5, 2.1],}";

                            //this is the problem, it's not working...
                            TFTensor rawInput = new TFTensor(new float[4]{5.1f,3.3f,1.7f,0.5f});
                            byte[] serializedTensor = System.Text.Encoding.ASCII.GetBytes(rawInput.ToString());
                            TFTensor inputTensor = TensorFlow.TFTensor.CreateString (serializedTensor);

                            runner.AddInput(tfGraph["input_example_tensor"][0], inputTensor);
                            runner.Fetch("dnn/head/predictions/probabilities", 0);

                            //start the run and get the results of the iris example
                            var output = runner.Run();
                            TFTensor result = output[0];

                            //printing response to the client
                            Console.WriteLine(result.ToString());
                            Console.ReadLine();
                        } 
                    }
                }
            }
        }
    }
}

This example will give the following error:

An unhandled exception of type 'TensorFlow.TFException' occurred in TensorFlowSharp.dll: 'Expected serialized to be a vector, got shape: []
 [[Node: ParseExample/ParseExample = ParseExample[Ndense=4, Nsparse=0, Tdense=[DT_FLOAT, DT_FLOAT, DT_FLOAT, DT_FLOAT], dense_shapes=[[1], [1], [1], [1]], sparse_types=[], _device="/job:localhost/replica:0/task:0/device:CPU:0"](_arg_input_example_tensor_0_0, ParseExample/ParseExample/names, ParseExample/ParseExample/dense_keys_0, ParseExample/ParseExample/dense_keys_1, ParseExample/ParseExample/dense_keys_2, ParseExample/ParseExample/dense_keys_3, ParseExample/Const, ParseExample/Const, ParseExample/Const, ParseExample/Const)]]'

How can I serialize tensors in such a way that I can use the pb file correctly?

I also posted the issue on GitHub; there you can find the iris example Python file, the .pb file, and the console applications. In my opinion, solving this creates a neat solution for all TensorFlow users with ancient production environments (like me).

Klaas
  • @Klass did you ever solve this? Currently I am attempting to load a `pb` file generated using `simple_save` (python) and am running into similar issues. – Questioning Aug 02 '18 at 14:18

1 Answer


The Expected serialized to be a vector, got shape: [] error can be fixed by using an overload of the TFTensor.CreateString function: instead of directly taking a string, the model apparently expects a vector containing a single string:

TFTensor inputTensor = TFTensor.CreateString(new byte[][] { bytes }, new TFShape(1));

The input_example_tensor in your case now expects a serialized Example protobuf message (see also the docs and the example.proto file).
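To make the expected payload concrete: an Example is an ordinary protobuf message, so the bytes fed to input_example_tensor follow the standard protobuf wire format (Example.features is field 1, Features.feature is a map at field 1, Feature.float_list is field 2, FloatList.value is field 1). As a rough illustration only — a hand-rolled encoder with hypothetical helper names, not an official API — this Python sketch serializes one float feature per key the same way example.ToByteArray() does in the C# code below:

```python
import struct

def _varint(n: int) -> bytes:
    # Protobuf base-128 varint encoding (sufficient for small lengths).
    out = bytearray()
    while True:
        b = n & 0x7F
        n >>= 7
        out.append(b | (0x80 if n else 0))
        if not n:
            return bytes(out)

def _len_delimited(field_number: int, payload: bytes) -> bytes:
    # Wire type 2 (length-delimited): tag varint, length varint, payload.
    return _varint((field_number << 3) | 2) + _varint(len(payload)) + payload

def encode_float_feature(value: float) -> bytes:
    # Feature { float_list (field 2) { value (field 1, packed float32) } }
    float_list = _len_delimited(1, struct.pack("<f", value))
    return _len_delimited(2, float_list)

def encode_example(features: dict) -> bytes:
    # Example { features (field 1) { feature (field 1, map<string, Feature>) } }
    # A protobuf map entry is a nested message: key = field 1, value = field 2.
    entries = b""
    for name, value in features.items():
        entry = (_len_delimited(1, name.encode("utf-8"))
                 + _len_delimited(2, encode_float_feature(value)))
        entries += _len_delimited(1, entry)
    return _len_delimited(1, entries)

serialized = encode_example({
    "SepalLength": 5.1, "SepalWidth": 3.3,
    "PetalLength": 1.7, "PetalWidth": 0.5,
})
print(serialized.hex())
```

In practice you should let a real protobuf library do this (as the C# code below does); the sketch just shows that the tensor's string payload is nothing more exotic than these length-delimited bytes.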

Using the protobuf compiler, I've generated a C# file containing the Example class. You can download it from here: https://pastebin.com/iLT8MUdR. Specifically, I used this online tool with CSharpProtoc and replaced the import "tensorflow/core/example/feature.proto"; line by the messages defined in that file.

Once you've added the file to your project, you'll need a package reference to Google.Protobuf. Then, you can pass serialized examples to the model like this:

// Helper: wrap a single float value in a tf.train.Feature message.
Func<float, Tensorflow.Feature> makeFeature = (float x) => {
    var floatList = new Tensorflow.FloatList();
    floatList.Value.Add(x);
    return new Tensorflow.Feature { FloatList = floatList };
};

// Build an Example whose feature names match the model's feature_spec.
var example = new Tensorflow.Example { Features = new Tensorflow.Features() };
example.Features.Feature.Add("SepalLength", makeFeature(5.1f));
example.Features.Feature.Add("SepalWidth",  makeFeature(3.3f));
example.Features.Feature.Add("PetalLength", makeFeature(1.7f));
example.Features.Feature.Add("PetalWidth",  makeFeature(0.5f));

TFTensor inputTensor = TFTensor.CreateString(
    new [] { example.ToByteArray() }, new TFShape(1));

runner.AddInput(tfGraph["input_example_tensor"][0], inputTensor);
runner.Fetch("dnn/head/predictions/probabilities", 0);

//start the run and get the results of the iris example
var output = runner.Run();
TFTensor result = output[0];