I have 2 tensors, each of shape [13, 2]. I am trying to combine them into a 3-dimensional tensor of shape [2, 13, 2], so that they are stacked on top of each other but kept separate as batches.
Here is an example of one of the tensors of shape [13, 2]:
tensor([[-1.8588, 0.3776],
[ 0.1683, 0.2457],
[-1.2740, 0.5683],
[-1.7262, 0.4350],
[-1.7262, 0.4350],
[ 0.1683, 0.2457],
[-1.0160, 0.5940],
[-1.3354, 0.5565],
[-0.7497, 0.5792],
[-0.2024, 0.4251],
[ 1.0791, -0.2770],
[ 0.3032, 0.1706],
[ 0.8681, -0.1607]])
I would like to maintain the shape, but have them as 2 groups in the same tensor. Below is an example of the format I am after:
tensor([[[-1.8588, 0.3776],
[ 0.1683, 0.2457],
[-1.2740, 0.5683],
[-1.7262, 0.4350],
[-1.7262, 0.4350],
[ 0.1683, 0.2457],
[-1.0160, 0.5940],
[-1.3354, 0.5565],
[-0.7497, 0.5792],
[-0.2024, 0.4251],
[ 1.0791, -0.2770],
[ 0.3032, 0.1706],
[ 0.8681, -0.1607]],
[[-1.8588, 0.3776],
[ 0.1683, 0.2457],
[-1.2740, 0.5683],
[-1.7262, 0.4350],
[-1.7262, 0.4350],
[ 0.1683, 0.2457],
[-1.0160, 0.5940],
[-1.3354, 0.5565],
[-0.7497, 0.5792],
[-0.2024, 0.4251],
[ 1.0791, -0.2770],
[ 0.3032, 0.1706],
[ 0.8681, -0.1607]]])
Does anyone have any ideas on how to do this using concatenation? I have tried using .unsqueeze with torch.cat((a, b.unsqueeze(0)), dim=-1),
however it changed the shape to [13, 4, 1], which is not the shape I am after.
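For reference, here is a minimal sketch of the kind of combination I am after, with random stand-in data for my two [13, 2] tensors. I believe either torch.stack, or torch.cat along a new leading dimension, should give the [2, 13, 2] result:

    import torch

    # Two stand-in tensors of shape [13, 2]
    a = torch.randn(13, 2)
    b = torch.randn(13, 2)

    # Option 1: torch.stack creates the new batch dimension itself
    y = torch.stack((a, b), dim=0)
    print(y.shape)  # torch.Size([2, 13, 2])

    # Option 2: give each tensor a leading dimension, then concatenate along it
    y = torch.cat((a.unsqueeze(0), b.unsqueeze(0)), dim=0)
    print(y.shape)  # torch.Size([2, 13, 2])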
The solution below works; however, my idea was to keep stacking onto y via a loop without being restricted by the shape. Sorry for not explaining my idea clearly enough.
They will all be of shape [13, 2], so the result will grow in the form [1, 13, 2], [2, 13, 2], [3, 13, 2], [4, 13, 2], etc.
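To illustrate, this is roughly how I imagine the loop would go; the tensors list here is just a hypothetical stand-in for however the [13, 2] tensors are produced in my actual code:

    import torch

    # Stand-in for wherever the [13, 2] tensors come from in my real code
    tensors = [torch.randn(13, 2) for _ in range(4)]

    # Accumulate them and stack into a growing batch
    batch = []
    for t in tensors:
        batch.append(t)
        y = torch.stack(batch, dim=0)
        print(y.shape)  # grows as [1, 13, 2], [2, 13, 2], [3, 13, 2], [4, 13, 2]

    # Alternatively, grow y directly with cat inside the loop
    y = tensors[0].unsqueeze(0)                       # start as [1, 13, 2]
    for t in tensors[1:]:
        y = torch.cat((y, t.unsqueeze(0)), dim=0)     # [2, 13, 2], [3, 13, 2], ...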