While studying TensorFlow, I ran into a question.
There seem to be two ways to specify an activation function:
activation = 'relu' and activation = tf.nn.relu
I want to know the difference between them.
(I think the same question applies to the other activation functions as well.)
I tried both ways.
The first is:
model = tf.keras.models.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation='relu'),
    tf.keras.layers.Dense(10, activation=tf.nn.softmax)
])
The second is:
model = tf.keras.models.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation=tf.nn.relu),
    tf.keras.layers.Dense(10, activation=tf.nn.softmax)
])
As far as I can tell, they gave me the same result.
What is the difference between them?
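For context, here is a minimal sketch of how I understand the string form (assuming TensorFlow 2.x, where tf.keras.activations.get is the lookup Keras uses to resolve activation names to functions):

```python
import numpy as np
import tensorflow as tf

# Resolve the string 'relu' the way Keras does internally:
# tf.keras.activations.get maps the name to a callable activation.
from_string = tf.keras.activations.get('relu')

x = tf.constant([-3.0, -1.0, 0.0, 2.0, 5.0])

# Compare its output against passing the callable tf.nn.relu directly.
print(np.allclose(from_string(x).numpy(), tf.nn.relu(x).numpy()))
```

Is there any case where the two forms behave differently?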