I want to train a model with a shared layer in the following form:

x --> F(x)
          ==> G(F(x),F(y))
y --> F(y) 

x and y are two separate input layers and F is a shared layer. G is the last layer after concatenating F(x) and F(y).

Is it possible to model this in Keras? How?


1 Answer

You can use the Keras functional API for this purpose:

from keras.layers import Input, concatenate

x = Input(shape=...)
y = Input(shape=...)

# create the shared layer once, then apply it to both inputs
# so that out_x and out_y are produced by the same weights
shared_layer = MySharedLayer(...)
out_x = shared_layer(x)
out_y = shared_layer(y)

concat = concatenate([out_x, out_y])

# pass concat to other layers ...

Note that x and y could be the output tensors of any layer and not necessarily input layers.
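To make the pattern above concrete, here is a minimal runnable sketch. It uses a `Dense` layer as a stand-in for the shared layer F and another `Dense` layer as G; the layer types, sizes, and input shapes are all assumptions for illustration (the imports use `tensorflow.keras`; with standalone Keras, import from `keras` instead):

```python
import numpy as np
from tensorflow.keras.layers import Input, Dense, concatenate
from tensorflow.keras.models import Model

# two separate input layers for x and y
x = Input(shape=(4,))
y = Input(shape=(4,))

# F: a single Dense layer instance, applied to both inputs,
# so F(x) and F(y) are computed with the same weights
shared_layer = Dense(8, activation='relu')
out_x = shared_layer(x)
out_y = shared_layer(y)

# G: the final layer after concatenating F(x) and F(y)
concat = concatenate([out_x, out_y])
output = Dense(1, activation='sigmoid')(concat)

model = Model(inputs=[x, y], outputs=output)
model.compile(optimizer='adam', loss='binary_crossentropy')
```

Because `shared_layer` is created once and called twice, it appears only once in `model.layers`, and its weights receive gradients from both branches during training.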

  • Are both out_x and out_y using the same weights? – Amirhessam Aug 27 '18 at 16:43
  • @Amirhessam They are outputs of the same layer with the same weights, though given different inputs (i.e. `x` and `y`). – today Aug 27 '18 at 17:38
  • Is it necessary to use the concatenate operation here? Can't out_x be given to another layer? – vampiretap Dec 03 '18 at 18:22
  • @vampiretap Sure, you can feed `out_x` to different layers. The concatenation layer is used here because the OP explicitly mentioned it in their question: "G is the last layer after concatenating F(x) and F(y)". – today Dec 03 '18 at 18:45
  • What happens during learning? Is it well-documented? – Yan King Yin Jul 07 '20 at 07:00