I have read this article: https://www.tensorflow.org/api_docs/python/tf/keras/layers/Dropout. Dropout helps prevent overfitting by deactivating neurons in the ANN. But the next question is:
why must we drop out neurons when we can simply adjust how many neurons the ANN has? For example, what is the difference between these two pieces of code?
FIRST
model = tf.keras.models.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(100, activation='relu'),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(10)
])
SECOND
model = tf.keras.models.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(80, activation='relu'),
    tf.keras.layers.Dense(10)
])
In the second model we use 80 neurons instead of 100, which seems equivalent to dropping out 20 of those neurons.
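To make the premise of the question concrete: Dropout does not remove a fixed set of neurons; it zeroes a *different* random 20% on every training step (and scales the survivors by 1/(1-rate)), while a Dense(80) layer permanently has fewer neurons. Here is a minimal NumPy sketch of inverted dropout (my own illustration, not the Keras internals) showing that the zeroed subset changes between calls:

```python
import numpy as np

def dropout(x, rate=0.2, rng=None):
    """Inverted dropout: zero a random `rate` fraction of units,
    scale the kept units by 1/(1-rate) so the expected sum is unchanged."""
    rng = rng or np.random.default_rng()
    mask = rng.random(x.shape) >= rate   # True = neuron kept this step
    return x * mask / (1.0 - rate)

activations = np.ones(100)               # pretend these are 100 ReLU outputs
rng = np.random.default_rng(0)
step1 = dropout(activations, 0.2, rng)   # some neurons zeroed, rest become 1.25
step2 = dropout(activations, 0.2, rng)   # a *different* subset is zeroed
```

Because the mask is resampled each step, all 100 neurons still get trained over time, just never all at once, which is quite different from training only 80 fixed neurons.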