New answer, considering layer2 as (50,49)
Here, you want a scalar multiplication for each row in layer2. So we treat the 50 as part of the batch and actually perform multiplications of shape (1,1) with shape (49,1). To keep the 50 separate from the true batch dimension in the batch_dot, we reshape things inside the lambda function using -1 as a wildcard:
out = Lambda(myMultiplication, output_shape=(50,49))([layer1,layer2])
Where
import keras.backend as K
def myMultiplication(x):
    # Inside lambda functions there is an additional axis: the batch axis.
    # Normally we use -1 for this dimension. We can take advantage of that
    # and simply hide the unwanted 50 inside this -1.
    L1 = K.reshape(x[0], (-1, 1, 1))
    L2 = K.reshape(x[1], (-1, 49, 1))

    result = K.batch_dot(L1, L2, axes=[1, 2])

    # Here we bring the 50 out again, keeping the batch dimension as it was originally.
    return K.reshape(result, (-1, 50, 49))
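To sanity-check the reshape trick, here is a NumPy sketch (an assumption of mine, not from the original answer: it emulates K.batch_dot with einsum and uses a toy batch size of 2) showing that folding the 50 into the batch axis gives the same result as a plain per-row scalar multiplication:

```python
import numpy as np

batch, n, m = 2, 50, 49
layer1 = np.random.rand(batch, n)      # one scalar per row of layer2
layer2 = np.random.rand(batch, n, m)

# Fold the 50 into the batch axis, exactly like the Keras reshapes above
L1 = layer1.reshape(-1, 1, 1)          # (batch*50, 1, 1)
L2 = layer2.reshape(-1, m, 1)          # (batch*50, 49, 1)

# K.batch_dot(L1, L2, axes=[1, 2]) contracts axis 1 of L1 with axis 2 of L2;
# einsum expresses the same contraction
result = np.einsum('bci,bkc->bik', L1, L2)   # (batch*50, 1, 49)

# Bring the 50 back out of the batch axis
out = result.reshape(-1, n, m)               # (batch, 50, 49)

# The whole dance is equivalent to broadcasting a scalar over each row
expected = layer1[:, :, None] * layer2
print(np.allclose(out, expected))            # True
```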
Old answer, written when I assumed layer2 was (49,) instead of (50,49)
You need a lambda layer (for custom functions) with a batch_dot.
Batch dot is an actual matrix multiplication, while multiply is an elementwise multiplication. For that, you should reshape your vectors into matrices, with one of them transposed, in order to achieve the multiplication you want.
So:
layer1 = Reshape((1,50))(layer1)
layer2 = Reshape((49,1))(layer2)
out = Lambda(myMultiplication, output_shape=(50,49))([layer1,layer2])
Where
import keras.backend as K
def myMultiplication(x):
    return K.batch_dot(x[0], x[1], axes=[1, 2])
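As a sanity check, a NumPy sketch of the same computation (my assumption, not from the original answer: batch_dot is emulated with einsum, with per-sample shapes (50,) and (49,) as in this old answer) shows the result is just the per-sample outer product of the two vectors:

```python
import numpy as np

batch = 2
v1 = np.random.rand(batch, 50)
v2 = np.random.rand(batch, 49)

# Mirror the Reshape layers above
L1 = v1.reshape(batch, 1, 50)
L2 = v2.reshape(batch, 49, 1)

# K.batch_dot(L1, L2, axes=[1, 2]) contracts axis 1 of L1 with axis 2 of L2
out = np.einsum('bci,bkc->bik', L1, L2)   # (batch, 50, 49)

# Which is just the per-sample outer product
expected = np.einsum('bi,bk->bik', v1, v2)
print(np.allclose(out, expected))         # True
```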