The default beta_initializer for layers.batch_normalization is tf.zeros_initializer().
Is it possible to create a new initializer with an arbitrary value?
See the list of built-in initializers. The one that interests you is tf.constant_initializer: "Initializer that generates tensors with constant values."
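As a minimal sketch using the TF 1.x tf.layers API (the input shape and the constant value 0.1 are arbitrary choices for illustration), you can pass it directly as beta_initializer:

```python
import tensorflow as tf  # TF 1.x API

# Arbitrary placeholder input for illustration.
x = tf.placeholder(tf.float32, shape=[None, 64])

# Initialize beta with an arbitrary constant (0.1) instead of the default zeros.
y = tf.layers.batch_normalization(
    x,
    beta_initializer=tf.constant_initializer(0.1),
    training=True)
```

The same initializer can be passed to gamma_initializer or any other *_initializer argument that accepts an Initializer instance.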