It's possible to customize the output layer's activation function for TensorFlow's DNNClassifier canned estimator, but there is no documentation about the hidden layers' default activation function... Is it ReLU? Is it customizable?
Is it not only the output layer? – GerardL Nov 05 '19 at 09:41
1 Answer
You can set the hidden layers' activation function by passing a new activation function to the activation_fn argument when you instantiate DNNClassifier. The default is tf.nn.relu.
import tensorflow as tf

# feature_columns is assumed to be defined elsewhere
classifier = tf.estimator.DNNClassifier(
    feature_columns=feature_columns,
    hidden_units=[10, 20, 10],
    activation_fn=tf.nn.sigmoid,  # default value is tf.nn.relu
    n_classes=3,
    model_dir="/tmp/model")
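To see what switching the activation actually changes, here is a minimal plain-Python sketch (no TensorFlow required) of the two functions involved: ReLU, the DNNClassifier default, and the sigmoid passed above. The function names here are illustrative, not part of the estimator API.

```python
import math

def relu(x):
    # ReLU clamps negative inputs to zero and passes positives through
    return max(0.0, x)

def sigmoid(x):
    # Sigmoid squashes any real input into the range (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

print(relu(-2.0), relu(3.0))   # 0.0 3.0
print(sigmoid(0.0))            # 0.5
```

Any callable with this shape (tensor in, tensor out) can be passed as activation_fn, including the built-ins under tf.nn such as tf.nn.tanh or tf.nn.elu.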

Kaushik Roy