
I'm trying to re-implement a research paper's code in tf.keras. In the init block it is written as:

import tensorflow as tf
import tensorflow.contrib.slim as slim

# separable_conv appears to be an alias defined elsewhere in the paper's code
with slim.arg_scope([slim.conv2d, separable_conv], activation_fn=tf.nn.relu6, normalizer_fn=slim.batch_norm):
    with slim.arg_scope([slim.batch_norm], is_training=is_training, activation_fn=None):
        with tf.variable_scope(name):
            net = slim.conv2d(inputs, num_outputs=depth, kernel_size=3, stride=2, scope="conv")  # padding='SAME' is slim's default
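
If I read the slim source correctly, arg_scope just injects default keyword arguments into the listed ops, so the block above should collapse to a single call like this (my own expansion, not from the paper):

net = slim.conv2d(
    inputs, num_outputs=depth, kernel_size=3, stride=2,
    activation_fn=tf.nn.relu6,       # outer arg_scope default, applied after the normalizer
    normalizer_fn=slim.batch_norm,   # outer arg_scope default
    normalizer_params={"is_training": is_training, "activation_fn": None},  # inner arg_scope
    scope="conv")

In other words: conv (bias dropped because a normalizer_fn is set) -> batch norm -> relu6.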

I didn't find an equivalent among the tf.keras.layers.Conv2D arguments for normalizer_fn=slim.batch_norm. How can I achieve this in Keras?

I tried:

model.add(Conv2D(...))  # some arguments
model.add(BatchNormalization())

Is this a valid equivalent to the above tf.contrib.slim code? With the limited documentation of tf.contrib.slim, I'm really confused.
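
Based on that reading, here is the fuller sequence I'm considering (a sketch, assuming slim really drops the conv bias when a normalizer is set and applies relu6 after the batch norm; depth and is_training come from the snippet above):

import tensorflow as tf
from tensorflow.keras import layers

def conv_bn_relu6(inputs, depth, is_training):
    # use_bias=False because slim omits the conv bias when normalizer_fn is set (my assumption)
    x = layers.Conv2D(depth, kernel_size=3, strides=2, padding="same", use_bias=False, name="conv")(inputs)
    # activation_fn=None on slim.batch_norm -> plain BatchNormalization with no activation
    x = layers.BatchNormalization()(x, training=is_training)
    # activation_fn=tf.nn.relu6 from the arg_scope, applied after normalization
    return layers.ReLU(max_value=6.0)(x)

Does my Sequential attempt above miss anything, e.g. the use_bias=False part or the relu6 ordering?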

