
In tf.slim, I'd like to create a stack of fully-connected layers with dropout.

I'd like to add dropout to the example from the documentation: slim.stack(x, slim.fully_connected, [32, 64, 128], scope='fc').

Is it possible to use slim.stack or do I have to go back to the verbose approach?

(pseudo-code) for every layer:
    slim.dropout(slim.fully_connected(...))
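For concreteness, the verbose version I'm trying to avoid would look roughly like this (just a sketch; layer_sizes and keep_prob are placeholder names):

net = x
for i, size in enumerate(layer_sizes):
    net = slim.fully_connected(net, size, scope='fc%d' % (i + 1))
    net = slim.dropout(net, keep_prob=keep_prob, scope='dropout%d' % (i + 1))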
– mkbubba

2 Answers


Just look at the source: https://github.com/tensorflow/tensorflow/blob/master/tensorflow/contrib/layers/python/layers/layers.py#L1976. slim.stack calls the layer you pass it like this, so you need a local function (or any callable) that accepts this calling convention:

outputs = layer(outputs, *layer_args, **kwargs)
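For example, a minimal sketch of such a wrapper (the function name is arbitrary; keep_prob is left at slim.dropout's default here):

def fc_with_dropout(inputs, num_outputs, **kwargs):
    # kwargs (activation_fn, scope, ...) are whatever slim.stack forwards
    net = slim.fully_connected(inputs, num_outputs, **kwargs)
    return slim.dropout(net)

outputs = slim.stack(x, fc_with_dropout, [32, 64, 128], scope='fc')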
– guinny

Based on @guinny's answer, I can create a local function and pass it to slim.stack:

layer = lambda inputs, num_outputs, **kwargs: slim.dropout(
    slim.fully_connected(inputs, num_outputs, **kwargs))

with slim.arg_scope([slim.dropout], keep_prob=dropout_keep_prob):
    layers = slim.stack(inputs,
                        layer,
                        layer_sizes,
                        activation_fn=nonlinearity,
                        scope=scope)
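For reference, the names above assume something along these lines (illustrative values only):

import tensorflow as tf
import tensorflow.contrib.slim as slim

inputs = tf.placeholder(tf.float32, [None, 784])  # assumed input shape
layer_sizes = [32, 64, 128]                       # sizes from the question's example
dropout_keep_prob = 0.5                           # assumed value
nonlinearity = tf.nn.relu
scope = 'fc'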
– mkbubba
    Looks great! Although, for the sake of readability, it's probably better to write out the layers in most cases. Only use this if you have a very deep net. – guinny Jan 27 '17 at 06:53