I am trying to add a custom loss function for a variational autoencoder. Along with the reconstruction loss and the KL divergence, I wish to add a loss term based on the difference between the pairwise Hamming distances of the inputs and of the outputs.

The problem I am having is that the results are the same with or without this extra loss. Could anyone point out what I should do to correct it? Is it something to do with the dimensions, or something else?
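To make the intent concrete, this is roughly the quantity I want the extra loss to measure, written here in plain NumPy just for illustration (it assumes binary inputs and predicted probabilities; the function name is only a placeholder):

import numpy as np

def intended_hamming_term(y_true, y_pred):
    # Pairwise Hamming distances between all samples of the true batch
    # (y_true is binary, shape (batch_size, n_features))
    dist_true = np.abs(y_true[:, None, :] - y_true[None, :, :]).sum(axis=-1)
    # The same "soft" pairwise distances for the predicted probabilities
    dist_pred = np.abs(y_pred[:, None, :] - y_pred[None, :, :]).sum(axis=-1)
    # Penalty: how much the two distance matrices disagree
    return np.abs(dist_true - dist_pred).sum()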

Here is my code snippet:

from keras import backend as K
from keras.layers import Lambda

def ham_loss(y_true, y_pred):
    # Pairwise "Hamming" distance matrix for the predictions
    # (differences of the y_pred probabilities)
    pairwise_diff_pred = K.expand_dims(y_pred, 0) - K.expand_dims(y_pred, 1)
    pairwise_distance_pred = K.sum(pairwise_diff_pred, axis=-1)

    # Pairwise Hamming distance matrix for the inputs
    pairwise_diff_true = K.expand_dims(y_true, 0) - K.expand_dims(y_true, 1)
    pairwise_distance_true = K.sum(pairwise_diff_true, axis=-1)

    # Difference between the distance matrices of y_true and y_pred
    # (`differences` is a helper defined elsewhere in my code)
    hamm_sum = Lambda(differences)([pairwise_distance_true, pairwise_distance_pred])
    print(hamm_sum)
    return K.sum(hamm_sum, axis=-1)

def vae_loss(y_true, y_pred):
    """Calculate loss = reconstruction loss + KL loss + hamming loss for each sample in the minibatch."""
    # E[log P(X|z)]
    recon = K.sum(K.binary_crossentropy(y_true, y_pred), axis=1)

    # D_KL(Q(z|X) || P(z|X)); calculated in closed form as both distributions are Gaussian
    # (z_mean and z_log_var come from the encoder and are defined in the enclosing scope)
    kl = 0.5 * K.sum(K.exp(z_log_var) + K.square(z_mean) - 1. - z_log_var, axis=1)

    hamming_loss = ham_loss(y_true, y_pred)

    return recon + kl + hamming_loss
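For completeness, the combined loss is passed to Keras when compiling the model; the model, optimizer and data names below are just placeholders for my actual setup:

# `vae` is the Keras Model mapping inputs to reconstructions
vae.compile(optimizer='rmsprop', loss=vae_loss)
vae.fit(x_train, x_train, epochs=50, batch_size=128)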

Any help is much appreciated!

Thanks in advance.
