
I have the following models that I want to train (see image below):

[Image: diagram of models A, B, and C]

The overall model has an input of size 20. Model A takes an input of size 10 (the first 10 elements of the initial input), model B takes an input of size 10 (the last 10 elements of the initial input), and the input of model C is the concatenation of the outputs of models A and B.

How can I train these 3 models at the same time in Keras? Can I merge them into one big model? (I only have data to train the big model.)

krimo
kithuto

2 Answers


Can I merge it in one big model?

Yes!

How can I train this 3 models at the same time in Keras?

I will give you pointers:

  1. Use the functional API. Want to know how it differs from the Sequential API? Look here
  2. Use the concatenate layer - Reference
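Putting the two pointers together, a minimal sketch might look like this. The tiny `Dense` sub-models here are hypothetical stand-ins for your real models A, B, and C; only the wiring matters:

```python
import tensorflow as tf
from tensorflow.keras import layers, Model

# Hypothetical tiny stand-ins for model A and model B, just to show the wiring
inp_a = layers.Input(shape=(10,))
model_a = Model(inp_a, layers.Dense(4, activation="relu")(inp_a))

inp_b = layers.Input(shape=(10,))
model_b = Model(inp_b, layers.Dense(4, activation="relu")(inp_b))

# Functional API: call sub-models like functions, then merge with a concatenate layer
x1 = layers.Input(shape=(10,))
x2 = layers.Input(shape=(10,))
merged = layers.concatenate([model_a(x1), model_b(x2)])  # feature axis: 4 + 4 = 8
out = layers.Dense(1)(merged)                            # stand-in for model C
full = Model(inputs=[x1, x2], outputs=out)
```

Because the sub-models are ordinary layers inside `full`, a single `full.fit(...)` updates the weights of all three at once.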
paradocslover

Let's assume your three models are already defined and named model_A, model_B and model_C. You can then define your complete model somewhat like this (I did not check the exact code):

import tensorflow as tf
from tensorflow.keras import layers, losses, Model

def complete_model(model_A, model_B, model_C):

    input_1 = layers.Input(shape=(10,))
    input_2 = layers.Input(shape=(10,))

    # Call each sub-model on its half of the input
    model_A_output = model_A(input_1)
    model_B_output = model_B(input_2)

    # Concatenate the sub-model outputs along the feature axis
    concatenated = tf.concat([model_A_output, model_B_output], axis=-1)
    model_C_output = model_C(concatenated)

    model = Model(inputs=[input_1, input_2], outputs=model_C_output)
    model.compile(optimizer="adam", loss=losses.MSE)
    model.summary()
    return model

This requires you to pass two separate inputs, so you will have to do some NumPy slicing to preprocess your data.
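The slicing itself is a one-liner per half. Assuming your training data is a single array `X` of shape (N, 20) (the array name and dummy data are illustrative):

```python
import numpy as np

# Dummy (N, 20) training data for illustration
X = np.random.rand(100, 20).astype("float32")

X_first = X[:, :10]   # first 10 features, fed to model_A's input
X_last = X[:, 10:]    # last 10 features, fed to model_B's input

# The combined model is then trained on both slices at once, e.g.:
# model.fit([X_first, X_last], y, epochs=10)
```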

If you still want a single one-dimensional input, you can define a single input layer with shape (20,) and then use the tf.split function to split it in half and feed the halves into the two sub-networks.
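A sketch of that variant, with tf.split wrapped in a Lambda layer so the split happens inside the model graph (the `Dense` layers are hypothetical stand-ins for model_A, model_B and model_C):

```python
import tensorflow as tf
from tensorflow.keras import layers, Model

# Single 20-dimensional input, split into two halves of 10 inside the graph
inp = layers.Input(shape=(20,))
first_half, second_half = layers.Lambda(
    lambda t: tf.split(t, num_or_size_splits=2, axis=-1)
)(inp)

# Stand-ins for model_A / model_B; in practice call your real sub-models here
out_a = layers.Dense(4, activation="relu")(first_half)
out_b = layers.Dense(4, activation="relu")(second_half)

merged = layers.concatenate([out_a, out_b])
single_input_model = Model(inputs=inp, outputs=layers.Dense(1)(merged))
```

With this version you can call `fit(X, y)` directly on the (N, 20) array, with no NumPy slicing needed.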

Marc Felix