
I am trying to compute the exhaustive pairwise concatenation of the rows of a tensor. For example, I have the tensor:

a = torch.randn(3, 512)

Treating the rows as tensors `t1`, `t2`, `t3`, I want every ordered pair: `concat(t1, t1)`, `concat(t1, t2)`, `concat(t1, t3)`, `concat(t2, t1)`, `concat(t2, t2)`, and so on.

As a naive solution, I have used a for loop:

result = []
# Split into 3 row tensors, each of shape (1, 512)
ans = list(torch.split(a, 1, dim=0))

for t1 in ans:
    for t2 in ans:
        result.append(torch.cat((t1, t2), dim=1))  # each of shape (1, 1024)

The issue is that each epoch takes a very long time and the code is slow. I tried the solution posted in the question PyTorch: How to implement attention for graph attention layer, but it gives a memory error.

# Repeat each row n times consecutively: rows 0,0,0,1,1,1,2,2,2
t1 = a.repeat(1, a.shape[0]).view(a.shape[0] * a.shape[0], -1)
# Tile the whole tensor n times: rows 0,1,2,0,1,2,0,1,2
t2 = a.repeat(a.shape[0], 1)
result.append(torch.cat((t1, t2), dim=1))  # shape (9, 1024)

I am sure there is a faster way, but I was unable to figure it out.
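For comparison, here is the same pairing written as a single vectorized expression (a sketch; it assumes `torch.repeat_interleave` is available, which I believe was added in PyTorch 1.1). It produces the rows in the same order as the nested loop above:

```python
import torch

torch.manual_seed(0)
a = torch.randn(3, 512)
n = a.shape[0]

# Left halves: each row repeated n times consecutively (outer loop index).
# Right halves: the whole tensor tiled n times (inner loop index).
pairs = torch.cat((a.repeat_interleave(n, dim=0), a.repeat(n, 1)), dim=1)
print(pairs.shape)  # torch.Size([9, 1024])
```

Note this still materializes an (n², 2d) tensor, so the quadratic memory cost is unchanged; it only removes the Python-level loop overhead.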

amy
  • What is the context? What are you going to do with the `result`? Right now you're creating 9 1x1024 matrices, which requires plenty of memory; if in your actual use case you're concatenating even more sub-tensors, your memory consumption grows quadratically, so it's not surprising this is slow. I don't see how the answer from attention for graph layers relates to your problem. Can you elaborate? – Jatentaki Jan 14 '19 at 19:34
  • @Jatentaki I am trying to concatenate each tensor with all the other tensors; that is all I want to achieve. I know it is slow. I was just trying to say that maybe converting the tensors using `repeat` might help, as illustrated in the other Stack Overflow post. But it still gives a memory error. Is there any simpler way to achieve the concatenation? `result` will be converted to a tensor and passed through a linear layer. – amy Jan 14 '19 at 20:58
  • Can you post the code which results in memory error so we can reproduce? – Jatentaki Jan 14 '19 at 23:18
  • @Jatentaki Edited. Hope this helps. – amy Jan 15 '19 at 01:53
  • @Jatentaki The code which gives the memory error also consumes a lot of memory, and it grows quadratically. But I am not sure how to do this any other way. – amy Jan 15 '19 at 15:29
  • This does not result in memory errors on my machine. If by memory error you mean that you're running out of memory, perhaps you just have to look for a different approach to what you're trying to do. – Jatentaki Jan 16 '19 at 09:07
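Since the comments mention that `result` is eventually passed through a linear layer, one memory-saving direction (a sketch of a standard trick, not something proposed in the thread) is to split the linear layer's weight in two and use broadcasting: for `W = [W1 | W2]`, `linear(cat(t1, t2)) = t1 @ W1.T + t2 @ W2.T + bias`, so the (n², 1024) concatenated tensor never needs to be materialized:

```python
import torch

torch.manual_seed(0)
n, d, out = 3, 512, 64
a = torch.randn(n, d)
linear = torch.nn.Linear(2 * d, out)

# Split the weight into the halves that multiply t1 and t2 respectively.
w1, w2 = linear.weight[:, :d], linear.weight[:, d:]
left = a @ w1.t()    # (n, out): contribution of each row as the first element
right = a @ w2.t()   # (n, out): contribution of each row as the second element

# Broadcast the sum over all ordered pairs, then flatten to (n*n, out).
scores = (left[:, None, :] + right[None, :, :] + linear.bias).reshape(n * n, out)

# Equivalent to concatenating every pair first and applying the layer:
pairs = torch.cat((a.repeat_interleave(n, dim=0), a.repeat(n, 1)), dim=1)
assert torch.allclose(scores, linear(pairs), atol=1e-5)
```

The peak memory here is O(n² · out) instead of O(n² · 2d), which is a large saving when the layer's output dimension is much smaller than 1024.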

0 Answers