
I know the available ops are listed in all_ops_resolver.cc, but there are none for Dropout, Flatten, or Dense. The magic_wand example trains a model using these layers.

import tensorflow as tf

def build_cnn(seq_length):
  """Builds a convolutional neural network in Keras."""
  model = tf.keras.Sequential([
      tf.keras.layers.Conv2D(
          8, (4, 3),
          padding="same",
          activation="relu",
          input_shape=(seq_length, 3, 1)),  # output_shape=(batch, 128, 3, 8)
      tf.keras.layers.MaxPool2D((3, 3)),  # (batch, 42, 1, 8)
      tf.keras.layers.Dropout(0.1),  # (batch, 42, 1, 8)
      tf.keras.layers.Conv2D(16, (4, 1), padding="same",
                             activation="relu"),  # (batch, 42, 1, 16)
      tf.keras.layers.MaxPool2D((3, 1), padding="same"),  # (batch, 14, 1, 16)
      tf.keras.layers.Dropout(0.1),  # (batch, 14, 1, 16)
      tf.keras.layers.Flatten(),  # (batch, 224)
      tf.keras.layers.Dense(16, activation="relu"),  # (batch, 16)
      tf.keras.layers.Dropout(0.1),  # (batch, 16)
      tf.keras.layers.Dense(4, activation="softmax")  # (batch, 4)
  ])
  return model

When loading the model I don't see these layers anywhere. Searching over the whole codebase also didn't bring much clarity.

  static tflite::MicroMutableOpResolver<5> micro_op_resolver;  // NOLINT
  micro_op_resolver.AddConv2D();
  micro_op_resolver.AddDepthwiseConv2D();
  micro_op_resolver.AddFullyConnected();
  micro_op_resolver.AddMaxPool2D();
  micro_op_resolver.AddSoftmax();
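
For what it's worth, one way to see which builtin ops the converted flatbuffer actually contains is to convert the Keras model and run the TFLite model analyzer on the result. This is only a minimal sketch: it assumes a recent TensorFlow release where tf.lite.experimental.Analyzer is available, and seq_length=128 as the shape comments above suggest.

  import tensorflow as tf

  # Build and convert the model from the question (seq_length assumed to be 128).
  model = build_cnn(128)
  converter = tf.lite.TFLiteConverter.from_keras_model(model)
  tflite_model = converter.convert()

  # Print the operators that actually ended up in the flatbuffer. Dropout
  # disappears entirely (it is a no-op at inference time), Flatten is either
  # dropped (FULLY_CONNECTED can flatten its input itself) or lowered to
  # RESHAPE, and Dense becomes FULLY_CONNECTED.
  tf.lite.experimental.Analyzer.analyze(model_content=tflite_model)

Each operator printed there needs a matching Add...() call on the MicroMutableOpResolver (or the blanket AllOpsResolver) on the TFLM side.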
  • Lite is only for inference, and dropout is only for training, right? Dense is the same thing as fully connected. – user253751 Nov 17 '22 at 09:37
  • TFLite Micro is used for inference (C++), Lite for training. I think I understand now: during training these layers only affect the model shape, but the converted model does not contain them as layers (easy to see in the case of dropout). – codeshredder726b Nov 17 '22 at 09:47
  • A layer is not an "op"; layers are just Python wrappers around defining variables, applying ops, etc. As long as the layers use supported ops, it should be fine. – xdurch0 Nov 17 '22 at 09:53
  • Is it documented anywhere which ops a certain layer uses? I couldn't find it in the Keras class descriptions. If I want to use e.g. GlobalAveragePooling1D or BatchNormalization, how can I check whether these layers are supported for inference in TFLM? – codeshredder726b Nov 22 '22 at 14:01
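
Regarding the last comment: there does not appear to be a layer-to-op table in the Keras documentation, but a practical check is to convert a tiny probe model containing only the layer in question, list the resulting ops, and compare them against the Add...() methods in micro_mutable_op_resolver.h. A hedged sketch (the probe shape and the GlobalAveragePooling1D example are made up for illustration, and the analyzer call assumes a recent TensorFlow release):

  import tensorflow as tf

  # Hypothetical probe: wrap only the layer of interest so the converted
  # flatbuffer shows which builtin ops that single layer lowers to.
  probe = tf.keras.Sequential([
      tf.keras.layers.GlobalAveragePooling1D(input_shape=(128, 3)),
  ])

  converter = tf.lite.TFLiteConverter.from_keras_model(probe)
  tflite_model = converter.convert()

  # Lists the operators in the converted model (GlobalAveragePooling1D usually
  # lowers to MEAN); each one needs a corresponding Add...() call on the
  # MicroMutableOpResolver to run under TFLM.
  tf.lite.experimental.Analyzer.analyze(model_content=tflite_model)

BatchNormalization is typically folded into the neighbouring Conv/Dense weights during conversion, so it usually does not contribute an op of its own.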

0 Answers