
I am training a neural network using dropout regularization. I save the weights and biases the network is initialized with, so that I can repeat the experiment when I get good results.

However, the use of dropout introduces some randomness into the network: since dropout drops units randomly, different units are dropped each time I rerun the network, even though I initialize it with the exact same weights and biases (if I understand this correctly).

Is there a way to make the dropout deterministic?

– rdv

1 Answer


There are two primary ways to perform dropout in TensorFlow:

  • tf.nn.dropout
  • tf.layers.dropout

Both functions accept a seed parameter that is used to generate the random mask. By default, seed=None, which means a random seed, i.e. non-deterministic behavior. To make the result deterministic, either set the seed at the per-op level, or call tf.set_random_seed (which sets the graph-level random seed), or, better, do both.

Example:

import tensorflow as tf

tf.InteractiveSession()
tf.set_random_seed(0)  # graph-level seed

x = tf.ones([10])

# tf.nn.dropout with an op-level seed: each eval() still draws a different
# mask, but the sequence of masks is reproducible across runs.
y = tf.nn.dropout(x, keep_prob=0.5, seed=0)
for i in range(5):
  print(y.eval())

# tf.layers.dropout with an op-level seed
z = tf.layers.dropout(inputs=x, rate=0.5, training=True, seed=0)
for i in range(5):
  print(z.eval())

Caveat: in general, there are other sources of randomness in training scripts, so you also have to set the pure Python seed (random.seed) and the NumPy seed (numpy.random.seed).
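
For example, a minimal sketch of seeding all three random number generators at the top of a training script (the seed value 0 here is just a placeholder):

import random

import numpy as np
import tensorflow as tf

# Seed the pure-Python, NumPy and TensorFlow (graph-level) RNGs so that
# data shuffling, augmentation and dropout masks are all reproducible.
random.seed(0)
np.random.seed(0)
tf.set_random_seed(0)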

– Maxim
  • Perfect, worked like a charm! I hoped it would be this simple. – rdv Mar 08 '18 at 17:51
  • Won't setting both the graph seed and the dropout seed force the same inputs to always be dropped out at each iteration? – Xema Sep 12 '18 at 10:09
  • @Xema did you ever find the answer? Does seeding the dropout layer ensure that the same masks will be randomly sampled? Maxim, would you know? – Rylan Schaeffer Nov 20 '22 at 06:45
  • Didn't even remember the question! I moved to pytorch at that time, and never looked back. – Xema Dec 17 '22 at 14:08