
I have a tensor inps, which has a size of [64, 161, 1] and I have some new data d which has a size of [64, 161]. How can I add d to inps such that the new size is [64, 161, 2]?

Shamoon
    As a side note, I could not find an answer that actually goes about the problem in exactly the same way, although this is a very common issue. Feel free to link to a duplicate question if you can find one. – dennlinger Apr 08 '20 at 14:25

3 Answers


There is a cleaner way by using .unsqueeze() and torch.cat(), which makes direct use of the PyTorch interface:

import torch

# create two sample vectors
inps = torch.randn([64, 161, 1])
d = torch.randn([64, 161])

# bring d into the same format, and then concatenate tensors
new_inps = torch.cat((inps, d.unsqueeze(2)), dim=-1)
print(new_inps.shape)  # [64, 161, 2]

Essentially, unsqueezing d along dimension 2 brings the two tensors into compatible shapes; you just have to be careful to unsqueeze along the right dimension. torch.cat is unfortunately named differently from NumPy's np.concatenate, but behaves the same way. Note that instead of addressing the last dimension with the negative index dim=-1, you can also spell out the dimension to concatenate along explicitly, in this case dim=2.

Keep in mind the difference between concatenation and stacking, which is helpful for similar problems with tensor dimensions.
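To see that difference at a glance, here is a minimal sketch with two fresh example tensors: cat joins along an existing dimension, while stack inserts a new one:

import torch

a = torch.randn([64, 161])
b = torch.randn([64, 161])

# cat joins along an existing dimension, stack creates a new one
print(torch.cat((a, b), dim=-1).shape)    # torch.Size([64, 322])
print(torch.stack((a, b), dim=-1).shape)  # torch.Size([64, 161, 2])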

dennlinger
    Changing `2` to `-1` would make it more generic if you know it is always the last dimension. Also, I don't know why, I prefer `d[..., None]` than using `unsqueeze` :) – Berriel Apr 08 '20 at 14:25
    Good catch! I rarely use it myself (mostly for explicit clarity of what dimension is altered), but for the sake of generality, I'll edit the answer. – dennlinger Apr 08 '20 at 14:26
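A minimal sketch of the indexing alternative mentioned in the comments, assuming the same d as in the answer above: indexing with None inserts a size-1 dimension, just like .unsqueeze():

import torch

d = torch.randn([64, 161])

# indexing with None adds a size-1 axis, equivalent to unsqueeze
print(d[..., None].shape)     # torch.Size([64, 161, 1])
print(d.unsqueeze(-1).shape)  # torch.Size([64, 161, 1])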

You first have to reshape d so that it has a third dimension along which the concatenation can happen. Once both tensors have the same number of dimensions, you can concatenate them along dimension 2 with torch.cat:

old_shape = tuple(d.shape)    # (64, 161)
new_shape = old_shape + (1,)  # (64, 161, 1)
inps_new = torch.cat((inps, d.view(new_shape)), 2)
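As a side note, .view requires the underlying memory to be contiguous; .reshape is a drop-in alternative that also handles the non-contiguous case. A minimal sketch using the question's shapes:

import torch

inps = torch.randn([64, 161, 1])
d = torch.randn([64, 161])

# .reshape behaves like .view but also copes with non-contiguous tensors
inps_new = torch.cat((inps, d.reshape(64, 161, 1)), 2)
print(inps_new.shape)  # torch.Size([64, 161, 2])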
Conor

Alternatively, you can achieve this by squeezing the size-1 dimension out of the larger tensor and stacking:

import torch

inps = torch.randn([64, 161, 1])
d = torch.randn([64, 161])

# drop the trailing size-1 dimension, then stack along a new last dimension
res = torch.stack((inps.squeeze(), d), dim=-1)

>>> res.shape
torch.Size([64, 161, 2])
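One caveat: a bare .squeeze() removes every size-1 dimension, so it can also drop the batch dimension. A minimal sketch of the safer, explicit variant, under an assumed batch size of 1:

import torch

inps = torch.randn([1, 161, 1])  # batch size of 1
d = torch.randn([1, 161])

# squeeze only the trailing dimension, so the batch dimension survives
res = torch.stack((inps.squeeze(-1), d), dim=-1)
print(res.shape)  # torch.Size([1, 161, 2])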
iacob