
I am trying to compile a model with two outputs using a custom loss function, but I am failing to do so. Any ideas? Let me show you what I have done.

Here is the loss function:

def contrastive_loss(y_true, y_pred1, y_pred2):
    '''Contrastive loss from Hadsell-et-al.'06
    http://yann.lecun.com/exdb/publis/pdf/hadsell-chopra-lecun-06.pdf
    '''
    euclidean_distance = pairwise_dist(y_pred1, y_pred2)
    loss_contrastive = K.mean((1-y_true) * tf.pow(euclidean_distance, 2) + 
                                  (y_true) * tf.pow(tf.clip_by_value(2.0 - euclidean_distance, 0.0, 2.0), 2))

    return loss_contrastive

I tried this:

optimizer = Adam(lr = 0.00006)
model.compile(loss=[lambda y_true,y_pred: contrastive_loss(y_true, y_pred[0], y_pred[1])],optimizer=optimizer)

But I get this error:

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-27-b31099307b2d> in <module>
     15 [lambda y_true,y_pred: Custom_loss(y_true, y_pred, val=0.01)]
     16 
---> 17 model.compile(loss=[lambda y_true,y_pred: contrastive_loss(y_true, y_pred[0], y_pred[1])],optimizer=optimizer)
     18 
     19 print("Starting training process!")

C:\ProgramData\Anaconda3\lib\site-packages\keras\backend\tensorflow_backend.py in symbolic_fn_wrapper(*args, **kwargs)
     73         if _SYMBOLIC_SCOPE.value:
     74             with get_graph().as_default():
---> 75                 return func(*args, **kwargs)
     76         else:
     77             return func(*args, **kwargs)

C:\ProgramData\Anaconda3\lib\site-packages\keras\engine\training.py in compile(self, optimizer, loss, metrics, loss_weights, sample_weight_mode, weighted_metrics, target_tensors, **kwargs)
    117         # Prepare list of loss functions, same size as model outputs.
    118         self.loss_functions = training_utils.prepare_loss_functions(
--> 119             self.loss, self.output_names)
    120 
    121         self._feed_outputs = []

C:\ProgramData\Anaconda3\lib\site-packages\keras\engine\training_utils.py in prepare_loss_functions(loss, output_names)
    825             raise ValueError('When passing a list as loss, it should have one entry '
    826                              'per model outputs. The model has {} outputs, but you '
--> 827                              'passed loss={}'.format(len(output_names), loss))
    828         loss_functions = [get_loss_function(l) for l in loss]
    829     else:

ValueError: When passing a list as loss, it should have one entry per model outputs. The model has 2 outputs, but you passed loss=[<function <lambda> at 0x0000000041EFCB88>]

How to resolve this?

mj1261829
  • your loss is a list containing a (lambda) function. try it without the `[]` around the lambda – Nullman Mar 03 '20 at 13:33
  • not working, I am getting another error – mj1261829 Mar 03 '20 at 13:36
  • which one? also, which model are you using? – Nullman Mar 03 '20 at 13:37
  • i am implementing a siamese model with 2 outputs. the error is currently in the loss function itself although I am using it without problems. Is there another way or the functionality to do loss function on multiple outputs is still unavailable? – mj1261829 Mar 03 '20 at 13:44
  • what error are you getting though? – Nullman Mar 03 '20 at 13:47
  • ---> 12 na = tf.reduce_sum(tf.square(A), 1).........................................ValueError: Invalid reduction dimension 1 for input with 1 dimensions. for 'loss_5/sequential_2_loss/lambda/Sum' (op: 'Sum') with input shapes: [4096], [] and with computed input tensors: input[1] = <1>. – mj1261829 Mar 03 '20 at 13:50

1 Answer


If the two predictions have the same shape, join them into one output:

final_output = Lambda(lambda x: tf.stack(x, axis=0))([output1, output2])
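For intuition, `tf.stack` along `axis=0` adds a leading axis, so the single combined output carries both predictions. A minimal NumPy sketch of the same shape behavior (`np.stack` works the same way here; the shapes are hypothetical):

```python
import numpy as np

# Two hypothetical model outputs, each of shape (batch, features)
output1 = np.zeros((4, 8))
output2 = np.ones((4, 8))

# Stacking along axis 0 adds a leading axis: shape becomes (2, batch, features),
# so stacked[0] and stacked[1] recover the original two outputs
stacked = np.stack([output1, output2], axis=0)
print(stacked.shape)  # (2, 4, 8)
```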

In your loss, unstack them again:

def contrastive_loss(y_true, y_pred):
    '''Contrastive loss from Hadsell-et-al.'06
    http://yann.lecun.com/exdb/publis/pdf/hadsell-chopra-lecun-06.pdf
    '''
    # y_pred is the stacked tensor; recover the two embedding batches
    y_pred1 = y_pred[0]
    y_pred2 = y_pred[1]

    euclidean_distance = pairwise_dist(y_pred1, y_pred2)
    loss_contrastive = K.mean(
        (1 - y_true) * tf.pow(euclidean_distance, 2) +
        y_true * tf.pow(tf.clip_by_value(2.0 - euclidean_distance, 0.0, 2.0), 2))

    return loss_contrastive
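As a sanity check of the math, here is a NumPy sketch of the same loss. `pairwise_dist` is not shown in the question, so its definition below (per-row Euclidean distance between the two embedding batches) is an assumption:

```python
import numpy as np

def pairwise_dist(a, b):
    # Assumed definition: per-row Euclidean distance between embeddings
    return np.sqrt(np.sum((a - b) ** 2, axis=1))

def contrastive_loss_np(y_true, y_pred, margin=2.0):
    # y_pred has shape (2, batch, features), as produced by the stacking Lambda
    y_pred1, y_pred2 = y_pred[0], y_pred[1]
    d = pairwise_dist(y_pred1, y_pred2)
    return np.mean((1 - y_true) * d ** 2
                   + y_true * np.clip(margin - d, 0.0, margin) ** 2)

# Identical embeddings with label 0 ("similar"): distance 0, so loss 0
a = np.ones((4, 8))
print(contrastive_loss_np(np.zeros(4), np.stack([a, a], axis=0)))  # 0.0

# Label 1 ("dissimilar") at distance 1: loss is (margin - 1)^2, approximately 1.0
b = a + 1.0 / np.sqrt(8.0)
print(contrastive_loss_np(np.ones(4), np.stack([a, b], axis=0)))
```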

If the two predictions have different shapes, go here: Keras: Custom loss function with training data not directly related to model

Daniel Möller