Let's say I am constructing a neural network like so:
# early layers: keep_prob=1. keeps every activation, so this dropout is a no-op
x = tf.nn.conv2d(input, ...)
x = tf.nn.max_pool(x, ...)
x = tf.nn.dropout(x, keep_prob=1.)
# the one layer that should actually get dropout
x = tf.nn.thislayershallhavedropout(x, ...)
x = tf.nn.dropout(x, keep_prob=.5)  # drop half of this layer's activations
Would this be an effective way to tell TensorFlow to apply dropout only to the layer thislayershallhavedropout?
Basically, what I am trying to do is tell TensorFlow to use dropout on a single layer only, and not have it cascade back into the earlier layers.
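To make the intent concrete, here is a fuller sketch of what I mean (a minimal sketch assuming the TF 1.x API, where tf.nn.dropout takes keep_prob; the shapes, variable names, and the keep_prob placeholder are illustrative, not from my actual model). Dropout is applied only to the second block's output, and nothing touches the first block:

import tensorflow as tf  # assuming TensorFlow 1.x

# hypothetical input: a batch of 28x28 grayscale images
inputs = tf.placeholder(tf.float32, [None, 28, 28, 1])
keep_prob = tf.placeholder(tf.float32)  # feed 0.5 while training, 1.0 while evaluating

# first conv block: no dropout applied here at all
w1 = tf.Variable(tf.truncated_normal([5, 5, 1, 32], stddev=0.1))
x = tf.nn.conv2d(inputs, w1, strides=[1, 1, 1, 1], padding='SAME')
x = tf.nn.relu(x)
x = tf.nn.max_pool(x, ksize=[1, 2, 2, 1], strides=[1, 2, 2, 1], padding='SAME')

# second conv block: the only layer whose activations get dropout
w2 = tf.Variable(tf.truncated_normal([5, 5, 32, 64], stddev=0.1))
x = tf.nn.conv2d(x, w2, strides=[1, 1, 1, 1], padding='SAME')
x = tf.nn.relu(x)
x = tf.nn.dropout(x, keep_prob=keep_prob)  # dropout on this layer's output only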