
I'm a beginner in Keras. I've run into a simple problem, but I've searched for a long time and cannot find how to deal with it.

Briefly, I have 2 vectors, [a1, a2, a3] and [b1, b2, b3], and I want to combine them with learned weights, using the following formula to get a new vector [y1, y2, y3]. How can I implement this in Keras?

[Formula: y_i = w_i * a_i + w_{i+3} * b_i, for i = 1, 2, 3]

My situation is this: I build 2 models separately, and each one has its own prediction output (i.e. 3 values representing the predicted scores of 3 categories). The output of the first model is [a1, a2, a3], and the output of the second model is [b1, b2, b3].

Now I want to merge these 2 outputs to get a new prediction [y1, y2, y3], so each y is a combination of the corresponding a and b, and the model should learn the weights by itself. That is, y1 = w1*a1 + w4*b1, y2 = w2*a2 + w5*b2, and y3 = w3*a3 + w6*b3, where w1 through w6 are the weights to be trained, so I have 6 weights in total. It is like an element-wise multiplication, except that the multipliers are trainable weights. I have tried the following code, but I find that it's not what I want.

# output_a & output_b each have shape (3, 1)
merge = concatenate([output_a, output_b], axis=2)
# merge has shape (3, 2)
output_y = Dense(1, use_bias=False)(merge)
# output_y has shape (3, 1)

I find that the Dense layer has only 2 trainable parameters. I think this means it has just 2 weights shared across all rows (i.e. w1=w2=w3 and w4=w5=w6). I'm not sure how to fix it. Thank you all for helping me!
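To make the shared-weight behavior concrete, here is a minimal runnable sketch of the attempt above (assuming `tf.keras`; the variable names mirror the snippet, and the stand-alone inputs stand in for the two models' outputs):

```python
from tensorflow.keras.layers import Input, Dense, concatenate
from tensorflow.keras.models import Model

# Two (3, 1) tensors standing in for the two models' outputs.
output_a = Input(shape=(3, 1))
output_b = Input(shape=(3, 1))

# Concatenate along the last axis: shape (3, 2).
merge = concatenate([output_a, output_b], axis=2)

# Dense acts on the last axis only, so its kernel has shape (2, 1):
# the SAME two weights are applied to every one of the 3 rows.
output_y = Dense(1, use_bias=False)(merge)   # shape (3, 1)

model = Model([output_a, output_b], output_y)
print(model.count_params())  # 2
```

This confirms the diagnosis: `Dense` shares its kernel across the leading axis, which is why only 2 parameters appear instead of 6.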

Edit: Someone told me that maybe I should define the layer I want myself. Isn't there any built-in function I can use to achieve this?

Edit: I have added a figure below that I think describes my question.

A figure that describes my question

Doraemon

1 Answer

from keras.layers import Input, Dense, multiply, add
from keras.models import Model

output_a = Input(shape=(3,))
output_b = Input(shape=(3,))
# each Dense layer produces a weight vector computed from its own input
weights_a = Dense(3, use_bias=False)(output_a)
weights_b = Dense(3, use_bias=False)(output_b)
a_weighted = multiply([output_a, weights_a])
b_weighted = multiply([output_b, weights_b])
output_y = add([a_weighted, b_weighted])

The weights are learned from the input itself.
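Wrapping the answer's layers in a `Model` makes its parameter count concrete (a sketch using `tf.keras` imports in place of stand-alone `keras`):

```python
from tensorflow.keras.layers import Input, Dense, multiply, add
from tensorflow.keras.models import Model

output_a = Input(shape=(3,))
output_b = Input(shape=(3,))
weights_a = Dense(3, use_bias=False)(output_a)  # 3x3 kernel = 9 parameters
weights_b = Dense(3, use_bias=False)(output_b)  # 3x3 kernel = 9 parameters
a_weighted = multiply([output_a, weights_a])
b_weighted = multiply([output_b, weights_b])
output_y = add([a_weighted, b_weighted])

model = Model([output_a, output_b], output_y)
print(model.count_params())  # 18
```

Note this trains 18 parameters, not the 6 fixed weights the question asks for, since each "weight vector" is itself computed from the input, which is the point raised in the comments.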

Birol Kuyumcu
  • Thanks for responding. But I think your code has 3*3 + 3*3 = 18 weights to be trained, right? I want only 6 weights. I have edited my post and added a figure to describe my question. Kindly have a look at the picture, thanks! – Doraemon May 24 '19 at 08:13
  • The Dense layer produces 3 weight values – Birol Kuyumcu May 24 '19 at 08:42
  • Do you mean that I can create the "fictitious" weights, weights_a & weights_b, to be the training weights of vectors a & b? I'm thinking about whether that is equivalent to my problem. In your version, each element of weights_a (`w1`, `w2` & `w3`) is calculated from `a1`, `a2` & `a3`, even though it just multiplies its corresponding value to get the final prediction (e.g. `w1` times `a1` gives `y1`). – Doraemon May 24 '19 at 09:03
  • The weights are learned from training data. For instance: `a = Input(shape=(3,))` `b = Dense(3, use_bias=False)(a)` `model = Model(inputs=a, outputs=b)` This means I can use training data to train the weights of the fully-connected layer between `a` and `b` to get the best prediction result, and it has nine weights to be learned. The weights come from the connections between the neurons, so I don't have to create a "weight" tensor like `weights_a`. – Doraemon May 26 '19 at 07:02
  • I just found the solution to my question in another post: [Custom connections between layers Keras](https://stackoverflow.com/questions/47265412/custom-connections-between-layers-keras). It describes how to customize the connections (weights) between layers. – Doraemon May 27 '19 at 11:41