Can someone please explain in simple terms and examples on how these work after performing the conv2d forward pass.
Let me add to this question - What is the difference between conv2d_backprop_filter and tf.nn.conv2d_backprop_input?
For an explanation of conv2d_transpose I would look at other stack overflow questions such as this one: conv2d_transpose
As for tf.nn.conv2d_backprop_input: this computes the gradient of the convolution with respect to its input, which is what is passed back to the previous layer during backpropagation (tf.nn.conv2d_backprop_filter is its counterpart for the gradient with respect to the filter). The input gradient has been used for things such as Deep Dream and the creation of adversarial examples.
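As a rough illustration (not from the linked answer, and assuming TensorFlow 1.x graph mode with made-up shapes and a random filter), a Deep-Dream-style loop just takes the gradient of some scalar score of the conv activations with respect to the input image and nudges the image along it; for the conv layer, that input gradient is exactly what conv2d_backprop_input computes under the hood of tf.gradients:

    import numpy as np
    import tensorflow as tf

    # Start from a random "image" we are allowed to modify (shapes are hypothetical).
    image = tf.Variable(np.random.rand(1, 28, 28, 1).astype(np.float32))
    w = tf.constant(np.random.rand(3, 3, 1, 8), dtype=tf.float32)

    activations = tf.nn.conv2d(image, w, strides=[1, 1, 1, 1], padding='SAME')
    score = tf.reduce_mean(activations)            # scalar we want to increase

    d_image = tf.gradients(score, image)[0]        # gradient w.r.t. the input image
    ascent_step = image.assign_add(0.1 * d_image)  # gradient-ascent update

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        for _ in range(10):
            sess.run(ascent_step)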
Please see this answer for a detailed example of how tf.nn.conv2d_backprop_input and tf.nn.conv2d_backprop_filter are used.
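In the meantime, here is a minimal sketch (assuming TensorFlow 1.x graph mode, NHWC layout, and the toy shapes below) that checks the two backprop ops against the gradients TensorFlow's autodiff produces:

    import numpy as np
    import tensorflow as tf

    x = tf.constant(np.random.rand(1, 5, 5, 1), dtype=tf.float32)   # input, NHWC
    w = tf.constant(np.random.rand(3, 3, 1, 2), dtype=tf.float32)   # filter, HWIO

    out = tf.nn.conv2d(x, w, strides=[1, 1, 1, 1], padding='SAME')

    # Pretend the loss is sum(out), so the incoming gradient d_out is all ones.
    d_out = tf.ones_like(out)

    # Gradients computed explicitly with the backprop ops.
    d_w_manual = tf.nn.conv2d_backprop_filter(
        x, tf.shape(w), d_out, strides=[1, 1, 1, 1], padding='SAME')
    d_x_manual = tf.nn.conv2d_backprop_input(
        tf.shape(x), w, d_out, strides=[1, 1, 1, 1], padding='SAME')

    # The same gradients via TensorFlow's autodiff.
    d_x_auto, d_w_auto = tf.gradients(tf.reduce_sum(out), [x, w])

    with tf.Session() as sess:
        dw_m, dx_m, dw_a, dx_a = sess.run([d_w_manual, d_x_manual, d_w_auto, d_x_auto])
        print(np.allclose(dw_m, dw_a))  # True
        print(np.allclose(dx_m, dx_a))  # True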
A short answer to your question:
In tf.nn, there are 4 closely related 2d conv functions:
tf.nn.conv2d
tf.nn.conv2d_backprop_filter
tf.nn.conv2d_backprop_input
tf.nn.conv2d_transpose
Given out = conv2d(x, w) and the output gradient d_out:
Use tf.nn.conv2d_backprop_filter to compute the filter gradient d_w.
Use tf.nn.conv2d_backprop_input to compute the input gradient d_x.
tf.nn.conv2d_backprop_input can be implemented by tf.nn.conv2d_transpose (a sketch of this equivalence is below).
All 4 functions above can be implemented by tf.nn.conv2d.
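Here is a minimal sketch of the conv2d_transpose equivalence mentioned above (assuming TensorFlow 1.x graph mode, NHWC layout, stride 1, SAME padding, and toy shapes):

    import numpy as np
    import tensorflow as tf

    x_shape = [1, 5, 5, 1]                                              # input shape, NHWC
    w = tf.constant(np.random.rand(3, 3, 1, 2), dtype=tf.float32)       # filter, HWIO
    d_out = tf.constant(np.random.rand(1, 5, 5, 2), dtype=tf.float32)   # gradient of out

    # Input gradient via the dedicated backprop op.
    d_x_backprop = tf.nn.conv2d_backprop_input(
        x_shape, w, d_out, strides=[1, 1, 1, 1], padding='SAME')

    # The same tensor via conv2d_transpose.
    d_x_transpose = tf.nn.conv2d_transpose(
        d_out, w, output_shape=x_shape, strides=[1, 1, 1, 1], padding='SAME')

    with tf.Session() as sess:
        a, b = sess.run([d_x_backprop, d_x_transpose])
        print(np.allclose(a, b))  # True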