
After running the neural network in the browser, I get the error 'The dtype of dict['input_tensor'] provided in model.execute(dict) must be int32, but was float32'. It seems to me, though, that the problem is not with the input tensor but with the neural network itself: either I trained it incorrectly or I converted it incorrectly.

I fine-tuned the pre-trained neural network 'ssd_mobilenet_v2_fpnlite_320x320_coco17' in Google Colab and saved it in TensorFlow's SavedModel format:

MetaGraphDef with tag-set: 'serve' contains the following SignatureDefs:

signature_def['__saved_model_init_op']:
  The given SavedModel SignatureDef contains the following input(s):
  The given SavedModel SignatureDef contains the following output(s):
    outputs['__saved_model_init_op'] tensor_info:
        dtype: DT_INVALID
        shape: unknown_rank
        name: NoOp
  Method name is: 

signature_def['serving_default']:
  The given SavedModel SignatureDef contains the following input(s):
    inputs['input_tensor'] tensor_info:
        dtype: DT_UINT8
        shape: (1, -1, -1, 3)
        name: serving_default_input_tensor:0
  The given SavedModel SignatureDef contains the following output(s):
    outputs['detection_anchor_indices'] tensor_info:
        dtype: DT_FLOAT
        shape: (1, 100)
        name: StatefulPartitionedCall:0
    outputs['detection_boxes'] tensor_info:
        dtype: DT_FLOAT
        shape: (1, 100, 4)
        name: StatefulPartitionedCall:1
    outputs['detection_classes'] tensor_info:
        dtype: DT_FLOAT
        shape: (1, 100)
        name: StatefulPartitionedCall:2
    outputs['detection_multiclass_scores'] tensor_info:
        dtype: DT_FLOAT
        shape: (1, 100, 249)
        name: StatefulPartitionedCall:3
    outputs['detection_scores'] tensor_info:
        dtype: DT_FLOAT
        shape: (1, 100)
        name: StatefulPartitionedCall:4
    outputs['num_detections'] tensor_info:
        dtype: DT_FLOAT
        shape: (1)
        name: StatefulPartitionedCall:5
    outputs['raw_detection_boxes'] tensor_info:
        dtype: DT_FLOAT
        shape: (1, 130944, 4)
        name: StatefulPartitionedCall:6
    outputs['raw_detection_scores'] tensor_info:
        dtype: DT_FLOAT
        shape: (1, 130944, 249)
        name: StatefulPartitionedCall:7
  Method name is: tensorflow/serving/predict

Concrete Functions:
  Function Name: '__call__'
    Option #1
      Callable with:
        Argument #1
          input_tensor: TensorSpec(shape=(1, None, None, 3), dtype=tf.uint8, name='input_tensor')

I checked the quality of the model (MobileNetV2, detecting a single class in the image), and it is wonderful!

After that, I converted the saved model to TensorFlow.js with the following command:

tensorflowjs_converter \
  --input_format=tf_saved_model \
  --output_node_names='detection_boxes','detection_classes','detection_features','detection_multiclass_scores','num_detections','raw_detection_boxes','raw_detection_scores' \
  --output_format=tfjs_graph_model \
  /content/gdrive/MyDrive/model_scoarbord/export/inference_graph/saved_model \
  /content/gdrive/MyDrive/model_scoarbord/web_model

To check that the network works in the browser, I loaded it and created a zero tensor for a test prediction:

<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="UTF-8">
    <meta http-equiv="X-UA-Compatible" content="IE=edge">
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
    <script src="https://cdn.jsdelivr.net/npm/@tensorflow/tfjs@3.12.0/dist/tf.min.js"></script>
    <title>Document</title>
</head>
<body>
    <script>
    async function loadModel() {
        const modelUrl = 'model.json';
        const model = await tf.loadGraphModel(modelUrl);
        console.log('Model loaded');

        // create a zero tensor to test the model
        const zeros = tf.zeros([1, -1, -1, 3]);
        const zeros2 = zeros.toInt();
        // checking that the model produces output
        model.predict(zeros).print();
        return model;
    }
    loadModel();
    </script>
    
</body>
</html>

Accordingly, my directory looks like this: group1-shard1of3.bin, group1-shard2of3.bin, group1-shard3of3.bin, index.html, model.json

After starting Live Server in VS Code, I see the following error:

util_base.js:153 Uncaught (in promise) Error: The dtype of dict['input_tensor'] provided in model.execute(dict) must be int32, but was float32

I tried to explicitly specify the tensor type with const zeros2 = zeros.toInt() and made a test prediction with zeros2, but got a different error:

graph_executor.js:166 Uncaught (in promise) Error: This execution contains the node 'StatefulPartitionedCall/map/while/exit/_435', which has the dynamic op 'Exit'. Please use model.executeAsync() instead. Alternatively, to avoid the dynamic ops, specify the inputs [StatefulPartitionedCall/map/TensorArrayV2Stack_1/TensorListStack]

Please tell me, what am I doing wrong? How else can I check that a neural network in the tfjs_graph_model format works correctly?

Moseich