  1. How can I define my own loss function in Keras that requires the weight and bias parameters from previous layers?

  2. How can I get [W1, b1, W2, b2, Wout, bout] from every layer? Here, we need to pass a few more variables to the loss than the usual (y_true, y_pred). I have attached two images for reference.

I need to implement this loss function (shown in the two attached images).
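For reference, here is the kind of unpacking I mean by question 2 — a sketch assuming a small tensorflow.keras model with two hidden Dense layers (the layer sizes are made up for illustration):

```python
# Sketch of getting [W1, b1, W2, b2, Wout, bout] from a model;
# layer sizes are illustrative only.
from tensorflow import keras

model = keras.Sequential([
    keras.Input(shape=(4,)),
    keras.layers.Dense(8, activation='relu'),  # hidden layer 1 -> W1, b1
    keras.layers.Dense(8, activation='relu'),  # hidden layer 2 -> W2, b2
    keras.layers.Dense(1),                     # output layer   -> Wout, bout
])

# get_weights() returns every parameter array, layer by layer, in order
W1, b1, W2, b2, Wout, bout = model.get_weights()
print([w.shape for w in (W1, b1, W2, b2, Wout, bout)])
# prints [(4, 8), (8,), (8, 8), (8,), (8, 1), (1,)]
```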

1 Answer


To answer your second part, I used the following code to get the norm of every layer in my model for visualization purposes:

import numpy as np
from numpy import linalg

conv_weights, l_weights, weight_per_layer = [], [], []
i = 0
for layer in model.layers:
    if 'Convolution' in str(type(layer)):
        i += 1
        layer_weight = []
        # Norm of each filter, normalized by the filter's size
        for feature_map in layer.get_weights()[0]:
            layer_weight.append(linalg.norm(feature_map) / np.sqrt(np.prod(feature_map.shape)))
        l_weights.append((np.sum(layer_weight) / len(layer_weight), layer.name, i))
        weight_per_layer.append(np.sum(layer_weight) / len(layer_weight))
        conv_weights.append(layer_weight)

Now to use this in a loss function I would try something like this:

def get_loss_function(weights):
    def loss(y_true, y_pred):  # Keras passes the arguments in this order
        return (y_pred - y_true) * weights  # or whatever your loss function should be
    return loss

model.compile(loss=get_loss_function(conv_weights), optimizer=SGD(lr=0.1))
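Note that conv_weights above is a frozen numpy snapshot. If the loss should track the weights as they change during training, one option — a sketch assuming tensorflow.keras with Dense layers; get_loss and lam are names I made up — is to close over the live layer.kernel tensors instead:

```python
# Hedged sketch: a loss combining mean-squared error with an L2 penalty on
# the live kernel tensors, so the penalty follows the weights as they update.
import tensorflow as tf
from tensorflow import keras

model = keras.Sequential([
    keras.Input(shape=(4,)),
    keras.layers.Dense(8, activation='relu'),
    keras.layers.Dense(1),
])

def get_loss(model, lam=0.01):
    # Collect the symbolic weight tensors, not numpy copies
    kernels = [layer.kernel for layer in model.layers]
    def loss(y_true, y_pred):
        mse = tf.reduce_mean(tf.square(y_pred - y_true))
        l2 = tf.add_n([tf.reduce_sum(tf.square(k)) for k in kernels])
        return mse + lam * l2
    return loss

model.compile(loss=get_loss(model), optimizer=keras.optimizers.SGD(learning_rate=0.1))
```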
Thomas Pinetz
  • One more question: say, for example, I am using L2 regularization. TensorFlow has tf.nn.l2_loss. Can I use these interchangeably? 1. K.sum(K.square(K.abs(Weights))) 2. tf.nn.l2_loss – Ayan Kumar Bhunia Jul 18 '17 at 10:55
  • I am pretty sure that the l2 regularization is stored in the layer of the model and can be extracted using the layer interface. – Thomas Pinetz Jul 18 '17 at 19:53
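On the interchangeability question in the comments: by their definitions the two expressions differ by a factor of 2 — the TensorFlow docs define tf.nn.l2_loss(t) as sum(t ** 2) / 2, and the abs is redundant under squaring. A quick numpy check of the two formulas:

```python
# Numpy check of the two expressions from the comment:
#   K.sum(K.square(K.abs(W)))  ->  sum(W**2)
#   tf.nn.l2_loss(W)           ->  sum(W**2) / 2   (per the TF docs)
import numpy as np

W = np.array([[1.0, -2.0], [3.0, 0.5]])

keras_style = np.sum(np.square(np.abs(W)))  # sum of squares
tf_style = np.sum(np.square(W)) / 2.0       # definition of tf.nn.l2_loss

assert np.isclose(keras_style, 2.0 * tf_style)
print(keras_style, tf_style)  # 14.25 7.125
```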