First of all, you can't put assignments in a list comprehension.
Second of all: you're in luck, because Vjunk and V readily broadcast when you subtract them. Here is an example with non-trivial shapes to make it easier to spot bugs:
import numpy as np
Vjunk = np.random.rand(2, 3, 4, 5)
V = np.random.rand(4, 5)
# naive version: explicit loop over every index
res1 = Vjunk.copy()
for i in range(2):
    for j in range(3):
        for l in range(4):
            for m in range(5):
                res1[i, j, l, m] -= V[l, m]
# vectorized, broadcasting version:
res2 = Vjunk - V
print(np.array_equal(res1, res2))
# True
Here Vjunk has shape (2, 3, 4, 5), and V has shape (4, 5). The latter is compatible with shape (1, 1, 4, 5) for the purposes of broadcasting, which is in turn compatible with the shape of Vjunk.
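If you want to check this compatibility explicitly, here is a minimal sketch (np.broadcast_shapes needs NumPy 1.20+, and the expanded view is only there for illustration):
print(np.broadcast_shapes((2, 3, 4, 5), (4, 5)))
# (2, 3, 4, 5): (4, 5) is padded with leading ones to (1, 1, 4, 5),
# then matched against (2, 3, 4, 5) dimension by dimension from the right
print((Vjunk - V[np.newaxis, np.newaxis, :, :]).shape)
# (2, 3, 4, 5): writing the padding out by hand gives the same result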
Performing the broadcasting subtraction Vjunk - V does exactly what you want: for each index along the first two dimensions, the corresponding (4, 5) slice of Vjunk is decreased by V.
It's then trivial to throw in a scalar factor:
res = Vjunk - beta * V
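For example (beta = 0.5 and the result names here are just placeholders for this sketch), the same loop-based comparison as above confirms the scaled version:
beta = 0.5
res3 = Vjunk - beta * V
res4 = Vjunk.copy()
for i in range(2):
    for j in range(3):
        res4[i, j] -= beta * V   # subtract the scaled (4, 5) slice
print(np.allclose(res3, res4))
# True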