
It's possible to customize the output layer's activation function for TensorFlow's DNNClassifier canned estimator, but there is no documentation about the hidden layers' default activation function. Is it ReLU? Is it customizable?

GerardL

1 Answer


You can change the hidden layers' activation function by passing a new activation function to the activation_fn argument when you instantiate DNNClassifier. The default is tf.nn.relu.

classifier = tf.estimator.DNNClassifier(feature_columns=feature_columns,
                                        hidden_units=[10, 20, 10],
                                        activation_fn=tf.nn.sigmoid,  # default value is relu
                                        n_classes=3,
                                        model_dir="/tmp/model")
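For intuition, here is a minimal sketch (plain Python, no TensorFlow required) of what the two activations in question compute element-wise: the default ReLU zeroes out negative inputs, while the sigmoid passed via activation_fn above squashes inputs into (0, 1):

```python
import math

def relu(x):
    # Default hidden-layer activation (tf.nn.relu): max(0, x)
    return max(0.0, x)

def sigmoid(x):
    # Alternative used in the snippet above (tf.nn.sigmoid): 1 / (1 + e^-x)
    return 1.0 / (1.0 + math.exp(-x))

for v in (-2.0, 0.0, 2.0):
    print(f"x={v:+.1f}  relu={relu(v):.4f}  sigmoid={sigmoid(v):.4f}")
```

The real activation_fn operates on tensors, so in practice you pass a TensorFlow op such as tf.nn.sigmoid or tf.nn.leaky_relu rather than a plain Python function like these.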
Kaushik Roy