EDIT: Please don't do what I've done below. Having used GradCAM to look at which weights are being used in the first convolutional layer, it seems that the input_tensors argument has no effect and all clones of the base_model end up with the same weights despite the different inputs.
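If you want to check this yourself, something like the snippet below (a minimal, self-contained check that isn't part of my original code; the next(...) lookup is just one way to grab the first conv layer) compares the first convolutional kernels of the original model and a clone:

import numpy as np
import tensorflow as tf
x1 = tf.keras.layers.Input(shape=(None, None, 3), name="x1")
x2 = tf.keras.layers.Input(shape=(None, None, 3), name="x2")
base_model = tf.keras.applications.DenseNet169(input_tensor=x1, include_top=False, pooling='avg')
base_model2 = tf.keras.models.clone_model(base_model, input_tensors=x2)
# find the first Conv2D layer in each model and compare their kernels:
conv_a = next(l for l in base_model.layers if isinstance(l, tf.keras.layers.Conv2D))
conv_b = next(l for l in base_model2.layers if isinstance(l, tf.keras.layers.Conv2D))
print(np.array_equal(conv_a.get_weights()[0], conv_b.get_weights()[0]))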
In my case, I am using tf.keras.models.clone_model to clone the base pre-trained neural network so that each of my multiple inputs gets its own path:
import tensorflow as tf
# inputs:
x1 = tf.keras.layers.Input(shape=(None, None, 3), name="x1")
x2 = tf.keras.layers.Input(shape=(None, None, 3), name="x2")
x3 = tf.keras.layers.Input(shape=(None, None, 3), name="x3")
# load base model:
base_model = tf.keras.applications.DenseNet169(input_tensor=x1, input_shape=(224, 224, 3), include_top=False, pooling='avg')
# create copies:
base_model2 = tf.keras.models.clone_model(base_model, input_tensors=x2)
base_model3 = tf.keras.models.clone_model(base_model, input_tensors=x3)
# you have to rename the layers in each model so there aren't any conflicts:
cnt = 0
for mod in [base_model, base_model2, base_model3]:
    cnt += 1
    for layer in mod.layers:
        old_name = layer.name
        layer._name = f"base_model{cnt}_{old_name}"
# this bit isn't necessary unless you want to access weights easily later on:
base1_out = base_model.output
base2_out = base_model2.output
base3_out = base_model3.output
# concatenate the outputs:
concatenated = tf.keras.layers.concatenate([base1_out, base2_out, base3_out], axis=-1)
# add dense layers if you want:
concat_dense = tf.keras.layers.Dense(2048)(concatenated)
out = tf.keras.layers.Dense(class_count, activation='softmax')(concat_dense)  # class_count = number of classes
model = tf.keras.models.Model(inputs=[x1, x2, x3], outputs=[out])
Note that my inputs come in as a dictionary of symbolic tensors produced by TensorFlow's tf.data.Dataset, with the dictionary keys matching the input names above ("x1", "x2", "x3").
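For reference, such a dataset pipeline could look roughly like the sketch below (the path lists, resizing, and label handling are placeholders rather than my actual pipeline):

import tensorflow as tf
def load_example(path1, path2, path3, label):
    def load(path):
        img = tf.io.decode_jpeg(tf.io.read_file(path), channels=3)
        img = tf.image.resize(img, (224, 224))
        return tf.cast(img, tf.float32) / 255.0
    return {"x1": load(path1), "x2": load(path2), "x3": load(path3)}, label
# paths1/paths2/paths3/labels are placeholder Python lists of image file paths and integer labels:
dataset = tf.data.Dataset.from_tensor_slices((paths1, paths2, paths3, labels))
dataset = dataset.map(load_example, num_parallel_calls=tf.data.AUTOTUNE).batch(16).prefetch(tf.data.AUTOTUNE)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
model.fit(dataset, epochs=10)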