I've been trying to figure out how to implement a skip/residual connection using tf-slim, but I can't seem to get it right.

I know that a skip connection basically feeds a layer's input not only to the immediate next layer, but also to a layer further down the network. I've also read a couple of papers on skip layers and recall them mentioning F(x) + x, which I assume is the result of an activation function on x added back to x itself. I'm still not sure how that helps with implementing skip/residual layers in tf-slim, though.
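
To check my understanding, here's a minimal sketch of what I think F(x) + x would look like as a single block in tf-slim (the `residual_block` name and the two-layer F are just my guesses, not something from the papers):

    import tensorflow as tf
    import tensorflow.contrib.slim as slim

    def residual_block(x, units):
        # F(x): a small stack of layers; x must already have `units` features
        # so the shapes line up for the element-wise addition below
        f_x = slim.fully_connected(x, units, activation_fn=tf.nn.relu)
        f_x = slim.fully_connected(f_x, units, activation_fn=None)
        # the skip connection itself: F(x) + x, followed by a final ReLU
        return tf.nn.relu(f_x + x)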

Below is the code I've written for it, but I'm not 100% sure I'm doing it correctly. The model runs, but I'm not sure it's actually taking advantage of the skip connection.

    import tensorflow as tf
    import tensorflow.contrib.slim as slim

    # first fully connected layer (this is the "x" I try to skip over)
    input_layer = slim.fully_connected(input, 6000, activation_fn=tf.nn.relu)
    drop_layer_1 = slim.dropout(input_layer, 0.5)
    hidden_layer_1 = slim.fully_connected(drop_layer_1, 6000, activation_fn=tf.nn.relu)
    drop_layer_2 = slim.dropout(hidden_layer_1, 0.5)
    # skip connection: add the earlier layer's output back in, then apply ReLU
    skip_layer = tf.add(input_layer, drop_layer_2)
    activate_layer = tf.nn.relu(skip_layer)
    output = slim.fully_connected(activate_layer, num_classes, activation_fn=tf.nn.softmax)

Any help would be greatly appreciated. Thanks in advance!
