I'm trying to use TensorFlowSharp in a project in Unity.
The problem I'm facing is that the usual approach uses a second graph to transform the input into a tensor, but the functions that graph relies on, DecodeJpeg and DecodePng, are not supported on Android. So how can I transform that input into a tensor?
// Builds a small graph that decodes an encoded image, resizes it to W x H
// and normalizes it as (pixel - Mean) / Scale in the requested data type.
private static void ConstructGraphToNormalizeImage(out TFGraph graph, out TFOutput input, out TFOutput output, TFDataType destinationDataType = TFDataType.Float)
{
    const int W = 224;
    const int H = 224;
    const float Mean = 117;
    const float Scale = 1;

    graph = new TFGraph();
    input = graph.Placeholder(TFDataType.String); // the encoded image bytes go in here

    output = graph.Cast(
        graph.Div(
            x: graph.Sub(
                x: graph.ResizeBilinear(
                    images: graph.ExpandDims(
                        input: graph.Cast(
                            graph.DecodeJpeg(contents: input, channels: 3), DstT: TFDataType.Float),
                        dim: graph.Const(0, "make_batch")),
                    size: graph.Const(new int[] { W, H }, "size")),
                y: graph.Const(Mean, "mean")),
            y: graph.Const(Scale, "scale")),
        destinationDataType);
}
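For context, this is how that graph is usually driven (a minimal sketch following the TensorFlowSharp inception example): the encoded image bytes are wrapped in a string tensor and run through the graph once.

public static TFTensor CreateTensorFromImageBytes(byte[] encodedImage)
{
    TFGraph graph;
    TFOutput input, output;
    ConstructGraphToNormalizeImage(out graph, out input, out output);

    using (var session = new TFSession(graph))
    {
        // Feed the raw encoded bytes as a string tensor, fetch the normalized float tensor.
        TFTensor[] normalized = session.Run(
            new[] { input },
            new[] { TFTensor.CreateString(encodedImage) },
            new[] { output });
        return normalized[0];
    }
}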
Other solutions I have tried seem to produce inaccurate results.
Could it be done somehow with a Mat object?
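Something like this is what I have in mind (a rough sketch, assuming OpenCvSharp or a comparable OpenCV wrapper is available in the project; the Mean/Scale constants just mirror the graph above, and the class and method names are placeholders):

using OpenCvSharp;
using System.Runtime.InteropServices;
using TensorFlow;

public static class MatTensorSketch
{
    // Decode + resize + normalize with OpenCV instead of the in-graph DecodeJpeg op.
    public static TFTensor FromEncodedImage(byte[] encodedImage, int w = 224, int h = 224,
                                            float mean = 117f, float scale = 1f)
    {
        using (Mat bgr = Cv2.ImDecode(encodedImage, ImreadModes.Color)) // JPEG/PNG decoded on-device
        using (Mat rgb = new Mat())
        using (Mat resized = new Mat())
        using (Mat floats = new Mat())
        {
            Cv2.CvtColor(bgr, rgb, ColorConversionCodes.BGR2RGB);       // OpenCV decodes as BGR
            Cv2.Resize(rgb, resized, new Size(w, h));
            // (pixel - mean) / scale in one pass: alpha = 1/scale, beta = -mean/scale
            resized.ConvertTo(floats, MatType.CV_32FC3, 1.0 / scale, -mean / scale);

            var data = new float[w * h * 3];
            Marshal.Copy(floats.Data, data, 0, data.Length);            // contiguous HWC float layout
            return TFTensor.FromBuffer(new TFShape(1, h, w, 3), data, 0, data.Length);
        }
    }
}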
EDIT: I implemented something comparable in C# in Unity and it partially works, but it is not accurate at all. How do I find out the Mean? And I could not find anything about the RGB order; I'm really new to this, so maybe I have just overlooked it on tensorflow.org. I'm using MobileNet trained in 1.4.
// Builds the input tensor directly from Unity pixel data.
// Note: this assumes the texture is already 224x224; otherwise the buffer
// length does not match the declared shape.
public TFTensor transformInput(Color32[] pic, int texturewidth, int textureheight)
{
    const int W = 224;
    const int H = 224;
    const float imageMean = 128;
    const float imageStd = 128;

    float[] floatValues = new float[texturewidth * textureheight * 3];
    for (int i = 0; i < pic.Length; ++i)
    {
        var color = pic[i];
        var index = i * 3;
        // Normalize each channel as (value - mean) / std, written in R, G, B order.
        floatValues[index] = (color.r - imageMean) / imageStd;
        floatValues[index + 1] = (color.g - imageMean) / imageStd;
        floatValues[index + 2] = (color.b - imageMean) / imageStd;
    }

    TFShape shape = new TFShape(1, W, H, 3);
    return TFTensor.FromBuffer(shape, floatValues, 0, floatValues.Length);
}
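Roughly, this is how the tensor then gets fed into the model (a sketch that assumes it lives in the same class as transformInput; the node names "input" and "MobilenetV1/Predictions/Reshape_1" are assumptions and depend on how the MobileNet .pb was exported):

public float[,] Classify(byte[] modelBytes, Color32[] pixels, int width, int height)
{
    var graph = new TFGraph();
    graph.Import(modelBytes); // the frozen MobileNet graph loaded as a byte[]

    using (var session = new TFSession(graph))
    {
        var runner = session.GetRunner();
        runner.AddInput(graph["input"][0], transformInput(pixels, width, height));
        runner.Fetch(graph["MobilenetV1/Predictions/Reshape_1"][0]);
        // Result is the probability vector, shape [1, numClasses].
        return (float[,])runner.Run()[0].GetValue();
    }
}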