I'm looking for a way in TensorFlow to, given two inputs:

- `input1`, a 3D tensor of shape `(batch_size, x, y)`,
- `input2`, a 1D tensor of shape `(batch_size,)` whose values are all in the range `[0, y - 1]` (inclusive),

return a 2D tensor of shape `(batch_size, x)` such that the ith element of the output is the `input2[i]`-th column of the ith element of `input1`.
Example: if `input1 = [[[1,2], [3,4]], [[5,6], [7,8]], [[9,10], [11,12]]]` (so the shape of `input1` is `(3, 2, 2)`) and `input2 = [0, 1, 1]`, then the output I want is `[[1,3], [6,8], [10,12]]`.
Explanation: the 0th element of the output is `[1,3]` because the 0th element of `input2` is 0, so the output takes the 0th column of the 0th element of `input1`. The last element of the output is `[10,12]` because the last element of `input2` is 1, so the output takes the 1st column of the last element of `input1`.
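To pin down the semantics, here is the same operation as a plain NumPy loop (just a reference implementation, obviously not what I want to run in the graph):

```python
import numpy as np

input1 = np.array([[[1, 2], [3, 4]],
                   [[5, 6], [7, 8]],
                   [[9, 10], [11, 12]]])  # shape (3, 2, 2)
input2 = np.array([0, 1, 1])              # shape (3,)

# For each batch element i, take column input2[i] of input1[i].
output = np.stack([input1[i, :, input2[i]] for i in range(input1.shape[0])])
print(output)  # [[ 1  3] [ 6  8] [10 12]]
```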
Attempts:

I tried using `tf.one_hot` to accomplish this, with `tf.reduce_sum(input1 * tf.one_hot(input2, y), 2)`, but TensorFlow became unhappy when doing the multiplication, saying: "ValueError: Dimensions must be equal, but are 2 and 3 for 'mul' (op: 'Mul') with input shapes: [3,2,2], [3,2]."
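My guess is that this is purely a broadcasting problem: `tf.one_hot(input2, y)` has shape `(batch_size, y)`, while `input1` has shape `(batch_size, x, y)`, so the elementwise multiply can't line the axes up. Inserting a middle axis into the mask seems to fix it on the toy example above (sketch only; `y` is hard-coded to 2 here):

```python
import tensorflow as tf

input1 = tf.constant([[[1, 2], [3, 4]],
                      [[5, 6], [7, 8]],
                      [[9, 10], [11, 12]]])  # (batch_size, x, y) = (3, 2, 2)
input2 = tf.constant([0, 1, 1])              # (batch_size,) = (3,)
y = 2

# One-hot mask of shape (batch_size, y), expanded to (batch_size, 1, y)
# so it broadcasts against input1's (batch_size, x, y) in the multiply.
mask = tf.expand_dims(tf.one_hot(input2, y, dtype=input1.dtype), axis=1)
output = tf.reduce_sum(input1 * mask, axis=2)  # (batch_size, x)
```

but I'm not sure this is the idiomatic way to do it.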
Any help would be super appreciated, thanks!