
Can someone please explain in simple terms and examples on how these work after performing the conv2d forward pass.

Let me add to this question - What is the difference between conv2d_backprop_filter and tf.nn.conv2d_backprop_input?

  • have you done any research from your end? show your findings? – Coder Mar 15 '17 at 19:20
  • My understanding so far is that conv2d_transpose performs deconvolution operation which kind of returns original image back (with loss). I do not understand its difference to the backprop_filter. I am new to Tensorflow and want to get a deeper understanding of convolutions in Tensorflow – Srujana Gattupalli Mar 15 '17 at 19:26

2 Answers


For an explanation of conv2d_transpose I would look at other stack overflow questions such as this one: conv2d_transpose

As for conv2d_backprop_filter: this computes the gradient of the convolution with respect to the filter during backpropagation (its counterpart, conv2d_backprop_input, computes the gradient that is passed back to the previous layer). Gradients like these have been used for things such as Deep Dream and the creation of adversarial examples.
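To make this concrete, here is a minimal sketch (mine, not from the answer) of calling the filter-gradient op directly, assuming TF 2.x with the `compat.v1` API available:

```python
import numpy as np
import tensorflow as tf

x = tf.ones([1, 4, 4, 1])            # NHWC: one 4x4 single-channel image
w = tf.ones([3, 3, 1, 1])            # HWIO: one 3x3 filter
out = tf.nn.conv2d(x, w, strides=[1, 1, 1, 1], padding='VALID')  # shape 1x2x2x1

# Pretend the upstream gradient is all ones (as if the loss were sum(out)).
d_out = tf.ones_like(out)

# Gradient of that loss with respect to the filter weights.
d_w = tf.compat.v1.nn.conv2d_backprop_filter(
    x, filter_sizes=[3, 3, 1, 1], out_backprop=d_out,
    strides=[1, 1, 1, 1], padding='VALID')

# Every entry is 4.0: each filter tap overlaps four all-ones input patches.
print(d_w.numpy().reshape(3, 3))
```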

Steven
  • Ok so for me to perform a convolution and back prop should I use conv2d and then tf.nn.conv2d_backprop_filter or should I perform conv2d and then do tf.nn.conv2d_backprop_input? – Srujana Gattupalli Mar 15 '17 at 19:30
  • If you use one of the standard optimizers that tensorflow provides then you don't need either of those at all. You only need them if you specifically need the gradients for some other purpose in your network, like visualization, or if you want to modify some particular gradients before applying them, etc. Meaning you can just set up your network to perform convolution and allow tensorflow to perform the backpropagation on its own using one of the provided optimizers. – Steven Mar 15 '17 at 19:32
  • My goal is to time a convolution operation. A forward and backward pass through a convolution. So which one should I use, conv2d_backprop_filter or tf.nn.conv2d_backprop_input? – Srujana Gattupalli Mar 15 '17 at 19:36
  • To do proper back propagation you would actually have to use both as one is used to update the previous layer and the other is used to update the filters. – Steven Mar 15 '17 at 21:39
  • Great! Thank you! – Srujana Gattupalli Mar 15 '17 at 22:00
  • If you don't update the filter then you're essentially taking in an image and adding random noise to it, then passing it onto the next layer. By updating the filter you are "learning" what is important about the input to be passed onto the next layer. Convolution itself is also a layer and the filter can be considered its weights. – Steven Mar 16 '17 at 18:30
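To illustrate Steven's point about letting TensorFlow do the backpropagation itself, here is a hedged sketch (assuming TF 2.x eager mode; the shapes are illustrative, not from the thread) that times a forward and backward pass while autodiff produces both gradients in one call:

```python
import time
import tensorflow as tf

x = tf.random.normal([8, 32, 32, 3])              # a batch of inputs
w = tf.Variable(tf.random.normal([3, 3, 3, 16]))  # the filter as a trainable weight

t0 = time.perf_counter()
with tf.GradientTape() as tape:
    tape.watch(x)                                 # x is a constant, so watch it
    out = tf.nn.conv2d(x, w, strides=1, padding='SAME')
    loss = tf.reduce_sum(out)
# One tape.gradient call runs the whole backward pass, producing both the
# input gradient and the filter gradient (internally via the two backprop ops).
d_x, d_w = tape.gradient(loss, [x, w])
print(f"forward+backward: {time.perf_counter() - t0:.4f}s")
```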

Please see this answer for a detailed example of how tf.nn.conv2d_backprop_input and tf.nn.conv2d_backprop_filter are used.

A short answer to your question:

In tf.nn, there are 4 closely related 2d conv functions:

  • tf.nn.conv2d
  • tf.nn.conv2d_backprop_filter
  • tf.nn.conv2d_backprop_input
  • tf.nn.conv2d_transpose

Given out = conv2d(x, w) and the output gradient d_out:

  • Use tf.nn.conv2d_backprop_filter to compute the filter gradient d_w
  • Use tf.nn.conv2d_backprop_input to compute the input gradient d_x
  • tf.nn.conv2d_backprop_input can be implemented by tf.nn.conv2d_transpose
  • All 4 functions above can be implemented by tf.nn.conv2d
  • Actually, using TF's autodiff is the fastest way to compute these gradients
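The relationships in the bullets above can be checked directly. A sketch assuming TF 2.x with the `compat.v1` ops available (this code is mine, not from the answer):

```python
import numpy as np
import tensorflow as tf

tf.random.set_seed(0)
x = tf.random.normal([1, 4, 4, 1])   # NHWC input
w = tf.random.normal([3, 3, 1, 1])   # HWIO filter
strides = [1, 1, 1, 1]

with tf.GradientTape() as tape:
    tape.watch([x, w])
    out = tf.nn.conv2d(x, w, strides=strides, padding='VALID')
    loss = tf.reduce_sum(out)        # so that d_out is all ones

d_out = tf.ones_like(out)

# d_x two ways: the raw backprop op, and conv2d_transpose.
d_x_bp = tf.compat.v1.nn.conv2d_backprop_input(
    input_sizes=tf.shape(x), filter=w, out_backprop=d_out,
    strides=strides, padding='VALID')
d_x_tr = tf.nn.conv2d_transpose(
    d_out, w, output_shape=tf.shape(x), strides=strides, padding='VALID')

# d_w via the raw backprop op.
d_w_bp = tf.compat.v1.nn.conv2d_backprop_filter(
    x, filter_sizes=tf.shape(w), out_backprop=d_out,
    strides=strides, padding='VALID')

# Autodiff gives the same answers without calling either op by hand.
d_x_ad, d_w_ad = tape.gradient(loss, [x, w])
print(np.allclose(d_x_bp, d_x_tr), np.allclose(d_x_bp, d_x_ad),
      np.allclose(d_w_bp, d_w_ad))
```

All three comparisons print `True`: `conv2d_transpose` reproduces `conv2d_backprop_input`, and `tape.gradient` reproduces both backprop ops.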
Yixing Lao