
I am attempting to use tf.nn.conv3d_transpose; however, I am getting an error indicating that my filter and output shapes are not compatible.

  • I have a tensor of size [1,16,16,4,192]
  • I am attempting to use a filter of [1,1,1,192,192]
  • I believe that the output shape would be [1,16,16,4,192]
  • I am using 'SAME' padding and a stride of 1.

Eventually, I want to have an output shape of [1,32,32,7,"does not matter"], but I am attempting to get a simple case to work first.

Since these tensors are compatible in a regular convolution, I believed that the opposite, a deconvolution, would also be possible.

Why is it not possible to perform a deconvolution on these tensors? Could I get an example of a valid filter size and output shape for a deconvolution on a tensor of shape [1,16,16,4,192]?

Thank you.

Devin Haslam

1 Answer

  • I have a tensor of size [1,16,16,4,192]
  • I am attempting to use a filter of [1,1,1,192,192]
  • I believe that the output shape would be [1,16,16,4,192]
  • I am using "same" padding and a stride of 1.

Yes, the output shape will be [1, 16, 16, 4, 192].

Here is a simple example showing that the dimensions are compatible:

import tensorflow as tf

# Input tensor: [batch, depth, height, width, in_channels]
i = tf.Variable(tf.constant(1., shape=[1, 16, 16, 4, 192]))

# Filter for conv3d_transpose: [depth, height, width, out_channels, in_channels]
w = tf.Variable(tf.constant(1., shape=[1, 1, 1, 192, 192]))

# 'SAME' padding is the default; strides of 1 preserve the spatial size
o = tf.nn.conv3d_transpose(i, w, output_shape=[1, 16, 16, 4, 192],
                           strides=[1, 1, 1, 1, 1])

print(o.get_shape())  # (1, 16, 16, 4, 192)

The problem must therefore be somewhere else in your implementation, not in the dimensions.
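As for the eventual target shape of [1, 32, 32, 7, "does not matter"]: with 'SAME' padding, conv3d_transpose accepts any output_shape whose spatial dimensions satisfy ceil(output_dim / stride) == input_dim, so strides of [1, 2, 2, 2, 1] with output_shape [1, 32, 32, 7, C] should be accepted (the stride choice here is my assumption, not something stated in the question). A quick shape-only check in plain Python:

```python
import math

def same_padding_compatible(input_dim, output_dim, stride):
    # With 'SAME' padding, conv3d_transpose allows any output size whose
    # forward strided convolution would reproduce the input size.
    return math.ceil(output_dim / stride) == input_dim

# Input spatial dims [16, 16, 4], target [32, 32, 7], stride 2 everywhere
for i, o in zip([16, 16, 4], [32, 32, 7]):
    print(i, "->", o, same_padding_compatible(i, o, 2))
```

All three dimensions pass the check, including the awkward 4 -> 7 case, since ceil(7 / 2) == 4.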

BlueSun
  • Thank you for your help. This information helped me solve my problem almost instantly. My problem was that I was calling conv3d_transpose like this: o = tf.nn.conv3d_transpose(i, [1, 1, 1, 192, 192], [1, 16, 16, 4, 192], strides=[1, 1, 1, 1, 1]), passing the filter's shape instead of the filter tensor itself. A very dumb problem, but I am new to TensorFlow and never would have recognized this issue myself. I can't thank you enough. – Devin Haslam Oct 13 '17 at 18:42
  • Using the implementation that you provided here, I have run into some problems. Sometimes I have a filter with a large number of channels, for example [16,16,7,3298], also described here: https://stackoverflow.com/questions/46955515/initializing-a-large-tf-variable-produces-an-error. Some people have suggested that I split the filter ("w" in your implementation) into smaller tensors. Does that make sense here, or does it not make sense to have a filter with 3298 channels in the first place? Also, do you ever do hired tutoring/help? – Devin Haslam Oct 26 '17 at 15:47
  • @DevinHaslam I would definitely not use 3298 channels in the first place. Why did you want to use so many? I never did hired tutoring and I don't really have time for it, sorry. – BlueSun Oct 29 '17 at 22:18
  • This is the architecture I am replicating: i.stack.imgur.com/1qLP2.png As you can see, there are many layers and several indications of concatenation (the orange arrows are shortcuts for residual learning). The reason there are so many channels is the concatenation I am doing: I use tf.concat and concatenate along the channel axis. Does this sound correct? Is there another way to concatenate that wouldn't make my channels so large? Thanks! – Devin Haslam Oct 30 '17 at 13:21
  • In the image I provided, the number of feature maps can be seen in parentheses. I have constructed my architecture with the assumption that feature maps are the same as channels. Is this a correct assumption? – Devin Haslam Oct 30 '17 at 15:21
  • @DevinHaslam The number of feature maps is the same as the number of channels. But residual connections (as defined in https://arxiv.org/pdf/1512.03385.pdf) are not concatenations; they are a sum of the skipped and non-skipped channels. However, the channel counts of the orange lines don't match in Inception B and C: the channel count at the beginning and end of an orange line must be the same to make the addition possible. Either the numbers in the picture are wrong or the orange lines are not residual connections. – BlueSun Oct 30 '17 at 16:20
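A quick sketch of the distinction made in the last comment, with NumPy arrays standing in for feature maps (the shapes are illustrative, matching the question's [1, 16, 16, 4, 192]): concatenation grows the channel axis, while a residual addition requires and preserves matching channel counts.

```python
import numpy as np

# Two feature maps with identical shapes, 192 channels each
x = np.ones((1, 16, 16, 4, 192))
y = np.ones((1, 16, 16, 4, 192))

# tf.concat along the channel axis: channel counts add up (192 + 192 = 384)
concat = np.concatenate([x, y], axis=-1)

# A residual (skip) connection is an element-wise sum: channels stay at 192,
# and the two operands must already have matching shapes
residual = x + y

print(concat.shape)    # (1, 16, 16, 4, 384)
print(residual.shape)  # (1, 16, 16, 4, 192)
```

This is why the channel counts at both ends of a residual connection must agree, whereas concatenation is exactly what inflates channel counts toward numbers like 3298.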