How do I use torch.stack to stack two tensors with shapes a.shape = (2, 3, 4) and b.shape = (2, 3) without an in-place operation?

Mateen Ulhaq

Suho Cho
3 Answers
torch.stack requires all tensors to have the same shape, so it cannot combine these two directly. What works here is to unsqueeze b so both tensors have the same number of dimensions, then concatenate with torch.cat along an existing dimension. For example:
a.size() # 2, 3, 4
b.size() # 2, 3
b = torch.unsqueeze(b, dim=2) # 2, 3, 1
# torch.unsqueeze(b, dim=-1) does the same thing
torch.cat([a, b], dim=2) # 2, 3, 5

arjoonn
- What you want is to use [torch.cat](https://pytorch.org/docs/stable/torch.html#torch.cat) with `unsqueeze` as you've done. [torch.stack](https://pytorch.org/docs/stable/torch.html#torch.stack) creates a NEW dimension, and all provided tensors must be the same size. – drevicko Aug 28 '19 at 07:47
- This answer is incorrect with `torch.stack([a, b], dim=2)`; instead you want to use `torch.cat([a, b], dim=2)` as correctly mentioned by @drevicko. `torch.cat` concatenates the sequences in a given dimension, while `torch.stack` concatenates the sequences in a new dimension, as mentioned here: https://stackoverflow.com/questions/54307225/whats-the-difference-between-torch-stack-and-torch-cat-functions/54307331 – warriorUSP May 10 '20 at 16:53
- This won't run. Instead you will receive 'RuntimeError: stack expects each tensor to be equal size, but got [2, 3, 4] at entry 0 and [2, 3, 1] at entry 1'. – JP Zhang Aug 31 '20 at 21:18
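A minimal check of what these comments describe (variable names are illustrative): torch.stack on unequal shapes raises, while torch.cat along an existing dimension works.

```python
import torch

a = torch.randn(2, 3, 4)
b = torch.randn(2, 3).unsqueeze(dim=2)  # (2, 3, 1)

# torch.stack needs identically shaped tensors, so (2, 3, 4) vs (2, 3, 1) raises.
stack_raised = False
try:
    torch.stack([a, b], dim=2)
except RuntimeError:
    stack_raised = True

# torch.cat only needs the non-concatenated dimensions to match.
c = torch.cat([a, b], dim=2)
print(stack_raised, c.shape)  # True torch.Size([2, 3, 5])
```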
With PyTorch 1.2 and 1.4, arjoonn's answer did not work for me. Instead of torch.stack I used torch.cat:
>>> import torch
>>> a = torch.randn([2, 3, 4])
>>> b = torch.randn([2, 3])
>>> b = b.unsqueeze(dim=2)
>>> b.shape
torch.Size([2, 3, 1])
>>> torch.cat([a, b], dim=2).shape
torch.Size([2, 3, 5])
If you want to use torch.stack, the tensors have to have the same shape:
>>> a = torch.randn([2, 3, 4])
>>> b = torch.randn([2, 3, 4])
>>> torch.stack([a, b]).shape
torch.Size([2, 2, 3, 4])
Here is another example:
>>> t = torch.tensor([1, 1, 2])
>>> stacked = torch.stack([t, t, t], dim=0)
>>> t.shape, stacked.shape, stacked
(torch.Size([3]),
 torch.Size([3, 3]),
 tensor([[1, 1, 2],
         [1, 1, 2],
         [1, 1, 2]]))
With torch.stack, the dim parameter lets you specify at which position the new dimension is inserted when stacking tensors of equal shape.
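To make the effect of dim concrete, a small sketch showing where the new dimension lands (tensor values are arbitrary, only the shapes matter):

```python
import torch

t1 = torch.randn(3, 4)
t2 = torch.randn(3, 4)

# The new dimension is inserted at position `dim` of the result.
s0 = torch.stack([t1, t2], dim=0)  # (2, 3, 4)
s1 = torch.stack([t1, t2], dim=1)  # (3, 2, 4)
s2 = torch.stack([t1, t2], dim=2)  # (3, 4, 2)
print(s0.shape, s1.shape, s2.shape)
```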

Henry L

gil.fernandes
Suppose you have two tensors a and b with identical shapes, i.e. a is (A, B, C) and b is (A, B, C). For example:
a = torch.randn(2, 3, 4)
b = torch.randn(2, 3, 4)
print(a.size()) # 2, 3, 4
print(b.size()) # 2, 3, 4
f = torch.stack([a, b], dim=2) # 2, 3, 2, 4
f
This won't work if the shapes are not the same. Be careful!
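Relatedly, a sketch that makes the connection between the two functions explicit: torch.stack at dim is equivalent to unsqueezeing each tensor at dim and then torch.cat along it.

```python
import torch

a = torch.randn(2, 3, 4)
b = torch.randn(2, 3, 4)

stacked = torch.stack([a, b], dim=2)                         # (2, 3, 2, 4)
catted = torch.cat([a.unsqueeze(2), b.unsqueeze(2)], dim=2)  # (2, 3, 2, 4)
print(torch.equal(stacked, catted))  # True
```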

pourya