I am implementing a custom Keras layer. The call method in my class is as follows.
def call(self, inputs, mask=None):
    if type(inputs) is not list or len(inputs) <= 1:
        raise Exception('Merge must be called on a list of tensors '
                        '(at least 2). Got: ' + str(inputs))
    e1 = inputs[0]
    e2 = inputs[1]
    f = K.transpose(K.batch_dot(e1, K.dot(e2, self.W), axes=1))  # Removing K.transpose also works, why?
    return f
I verified that the code works, but I am trying to find better ways to debug while implementing a custom layer in Keras. Assuming e1 and e2 are (batch_size x d) and W is (d x d), how can I find the dimensions of each sub-part of my expression, e.g. K.dot(e2, self.W), the result of batch_dot, and so on?
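One idea I had is to evaluate each sub-expression on small dummy tensors outside the layer and print their shapes, roughly like the sketch below (the sizes are made up just for the check, and K.variable / K.int_shape / K.eval are simply the backend functions I guessed would do the job). Is this a reasonable approach, or does Keras offer something better for inspecting intermediate shapes?

import numpy as np
from keras import backend as K

batch_size, d = 4, 3  # made-up sizes just for this shape check

# Dummy tensors with known shapes, standing in for the layer's inputs and weight
e1 = K.variable(np.random.rand(batch_size, d))
e2 = K.variable(np.random.rand(batch_size, d))
W = K.variable(np.random.rand(d, d))

step1 = K.dot(e2, W)                    # expecting (batch_size, d)
step2 = K.batch_dot(e1, step1, axes=1)  # what shape does this give?
step3 = K.transpose(step2)

for name, t in [('K.dot', step1), ('batch_dot', step2), ('transpose', step3)]:
    # Static shape info vs. the shape of the actually evaluated array
    print(name, K.int_shape(t), K.eval(t).shape)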