
I have the following four tensors:

  1. H (h, r)
  2. A (a, r)
  3. D (d, r)
  4. T (a, t, r)

For each i in a, there is a corresponding T[i] of the shape (t, r).

I need to do an np.einsum to produce the following result (pred):

import numpy as np

# Initialize pred with the target (h, a, d, t) shape; the loop then overwrites each a-slice
pred = np.einsum('hr,ar,dr,tr->hadt', H, A, D, T[0])
for i in range(a):
    pred[:, i:i+1, :, :] = np.einsum('hr,ar,dr,tr->hadt', H, A[i:i+1], D, T[i])

However, I want to do this computation without using a for loop, because I'm using autograd, which doesn't currently work with item assignment!


1 Answer


One way would be to use all the dimensions of T -

np.einsum('Hr,Ar,Dr,ATr->HADT', H, A, D, T)

Since we need to sum-reduce axis r across all inputs while keeping all other axes in the output, I don't see any intermediate way of doing it or of bringing in any dot-based tools to leverage BLAS.

– Divakar
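As a quick sanity check, here is a minimal sketch with randomly generated arrays (the concrete sizes for h, a, d, t, r below are made up purely for illustration) comparing the single-call einsum against the loop from the question:

import numpy as np

# Made-up sizes, just for the comparison
h, a, d, t, r = 2, 3, 4, 5, 6
H = np.random.rand(h, r)
A = np.random.rand(a, r)
D = np.random.rand(d, r)
T = np.random.rand(a, t, r)

# Loop version from the question
pred_loop = np.empty((h, a, d, t))
for i in range(a):
    pred_loop[:, i:i+1, :, :] = np.einsum('hr,ar,dr,tr->hadt', H, A[i:i+1], D, T[i])

# Single-call version from the answer
pred_vec = np.einsum('Hr,Ar,Dr,ATr->HADT', H, A, D, T)

print(np.allclose(pred_loop, pred_vec))  # True

The repeated 'A' label on the LHS is what keeps A and T aligned along their shared first axis, which is exactly what the loop was doing slice by slice.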
  • Thanks! I somehow (wrongly) thought it might be bad practice to have the same character repeated in the einsum string. Like, we have two As on the LHS of `->`. I think this restriction applies only to the RHS of `->`? – Nipun Batra Nov 30 '17 at 21:31
  • @NipunBatra Repetition on the LHS for different variables means keep them aligned. Repetition for one variable is not allowed on the LHS or RHS for obvious reasons. – Divakar Nov 30 '17 at 21:33
  • 'ii' is the trace, 'ii->i' is the diagonal. 'i,i' and 'i,i->i' are also ok. Repeating a label in the output, as in '->ii', is wrong. – hpaulj Nov 30 '17 at 22:04
  • @hpaulj Ah yes for the special case of same length axes. Thanks for the reminder! – Divakar Nov 30 '17 at 22:08
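For reference, a small sketch illustrating the label-repetition cases mentioned in the comments above (trace, diagonal, and the invalid repeated output label):

import numpy as np

M = np.arange(9).reshape(3, 3)
x = np.arange(3)

np.einsum('ii', M)         # trace: repeated label within one operand, summed
np.einsum('ii->i', M)      # diagonal: repeated label kept once in the output
np.einsum('i,i', x, x)     # dot product: label repeated across operands, summed
np.einsum('i,i->i', x, x)  # elementwise product: repeated label kept in the output
# np.einsum('i->ii', x)    # invalid: a label may not repeat in the output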