From the example data below I need sum_product(x, y) / sum(y), with x and y as inputs. This part of the model can probably even be trainable=False, but is there a simpler way to do such a calculation, either from a single tensor or at least from separate tensors for values and weights? Is there a more elegant graph for this task than the one I've created? The only thing I could come up with is this rather long code for such a simple operation:
import numpy as np
import tensorflow as tf
from tensorflow.keras import backend as K  # use tf.keras's backend, not standalone keras

x = np.array([[1100, 1200, 1300, 1400]])  # values
y = np.array([[10, 50, 30, 5]])           # weights

inpS = tf.keras.layers.Input(shape=(4,), batch_size=1, name='inp1', dtype='float32')
inpW = tf.keras.layers.Input(shape=(4,), batch_size=1, name='inp2', dtype='float32')

# sum_product(x, y): dot product of values and weights
dot_product = tf.keras.layers.Dot(axes=1, normalize=False, trainable=False)([inpS, inpW])
# sum(y): total of the weights
wsum = tf.keras.layers.Lambda(lambda z: K.sum(z, axis=1, keepdims=True))(inpW)
# concatenate, so the final Lambda layer receives both values in one tensor
con = tf.keras.layers.Concatenate(axis=-1)([dot_product, wsum])
# weighted average = dot_product / weight_sum
wa = tf.keras.layers.Lambda(lambda t: t[0][0] / t[0][1])(con)

model = tf.keras.Model([inpS, inpW], wa)
model.predict([x, y])
The expected result is 117000 / 95 = 1231.5789794921875 (in float32).
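
For reference, a more compact graph seems possible: a Lambda layer can take the list of input tensors directly, so the Dot and Concatenate layers are not strictly needed. A minimal sketch, assuming TF 2.x and the same two separate inputs for values and weights:

import numpy as np
import tensorflow as tf

x = np.array([[1100, 1200, 1300, 1400]], dtype='float32')  # values
y = np.array([[10, 50, 30, 5]], dtype='float32')           # weights

inpS = tf.keras.layers.Input(shape=(4,), name='inp1', dtype='float32')
inpW = tf.keras.layers.Input(shape=(4,), name='inp2', dtype='float32')

# weighted average per sample: sum(x * y) / sum(y), keeping the batch dimension
wa = tf.keras.layers.Lambda(
    lambda t: tf.reduce_sum(t[0] * t[1], axis=1, keepdims=True)
              / tf.reduce_sum(t[1], axis=1, keepdims=True)
)([inpS, inpW])

model = tf.keras.Model([inpS, inpW], wa)
model.predict([x, y])  # ≈ [[1231.5789]]

Outside of a model, the same number also comes straight from np.dot(x, y.T) / y.sum().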