
The Keras documentation introduces separate classes for weight regularization and bias regularization. These can be subclassed to add a custom regularizer. An example from the Keras docs:

import tensorflow as tf

# penalize the sum of squares of the given tensor
def my_regularizer(x):
    return 1e-3 * tf.reduce_sum(tf.square(x))

where x can be either the kernel weights or the bias weights. However, I want to regularize my layer with a function that includes both the layer weights and the layer bias. Is there a way to incorporate both of these into a single function?

For example, I would like to have the following as regularizer:

from keras import backend as K

def l1_special_reg(weight_matrix, bias_vector):
    return 0.01 * K.sum(K.abs(weight_matrix) - K.abs(bias_vector))
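
For reference, this is how the separate per-tensor regularizers attach today (a minimal sketch; the layer size is arbitrary):

import tensorflow as tf

# separate regularizers attach per tensor, so each one sees only a single argument
dense = tf.keras.layers.Dense(
    10,
    kernel_regularizer=my_regularizer,  # called with the kernel only
    bias_regularizer=my_regularizer,    # called with the bias only
)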

Thanks,

YoavEtzioni

2 Answers


You can access model.layers[idx].trainable_weights; it returns both the kernel weights and the bias. After that you can manually add that regularization loss to the model's loss function, as follows:

model.layers[-1].trainable_weights

[<tf.Variable 'dense_2/kernel:0' shape=(100, 10) dtype=float32_ref>,
 <tf.Variable 'dense_2/bias:0' shape=(10,) dtype=float32_ref>]

A complete example with a custom loss function:

import keras
from keras import backend as K

# L1 penalty on a single tensor
def l1_reg(weight_matrix):
    return 0.01 * K.sum(K.abs(weight_matrix))

# apply the penalty to both the kernel and the bias of the last dense layer
wts = model.layers[-1].trainable_weights  # -1 for the last dense layer
reg_loss = l1_reg(wts[0]) + l1_reg(wts[1])

# wrap the base loss so the regularization term is added to it
def custom_loss(reg_loss):
    def orig_loss(y_true, y_pred):
        return K.categorical_crossentropy(y_true, y_pred) + reg_loss
    return orig_loss

model.compile(loss=custom_loss(reg_loss),
              optimizer=keras.optimizers.Adadelta(),
              metrics=['accuracy'])
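
For reference, a minimal end-to-end sketch of the same idea; the toy architecture and random data below are hypothetical, chosen only to match the (100, 10) shapes printed above:

import numpy as np
import keras

# toy model whose last layer matches the shapes printed earlier
model = keras.models.Sequential([
    keras.layers.Dense(100, activation='relu', input_shape=(20,)),
    keras.layers.Dense(10, activation='softmax'),
])

wts = model.layers[-1].trainable_weights
reg_loss = l1_reg(wts[0]) + l1_reg(wts[1])

model.compile(loss=custom_loss(reg_loss),
              optimizer=keras.optimizers.Adadelta(),
              metrics=['accuracy'])

# dummy data, only to demonstrate the training call
x = np.random.rand(64, 20)
y = keras.utils.to_categorical(np.random.randint(10, size=64), 10)
model.fit(x, y, epochs=1)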
Ankish Bansal
  • Doesn't work in TF2 as of now: TypeError: An op outside of the function building code is being passed a "Graph" tensor. It is possible to have Graph tensors leak out of the function building context by including a tf.init_scope in your function building code. – Kristof Sep 01 '19 at 21:05
  • It may be possible; I posted this code after testing on TF1. What is the alternative in TF2 for this? – Ankish Bansal Sep 02 '19 at 13:09

In TensorFlow 2 this can be achieved with the model.add_loss() function. Say you have the weight and bias tensors of some layer:

w, b = layer.trainable_weights  # a property, not a method

Then you can regularize this layer by adding the regularization function as a loss term to the model object, as follows:

from tensorflow.keras import backend as K

def l1_special_reg(weight_matrix, bias_vector):
    return 0.01 * K.sum(K.abs(weight_matrix) - K.abs(bias_vector))

model.add_loss(l1_special_reg(w, b))

Naturally, you can do this for each layer independently.
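
For instance, here is a minimal sketch that adds the combined penalty for every Dense layer of a model (the architecture is hypothetical); the zero-argument lambda is the form tf.keras expects when an added loss depends on layer variables:

import tensorflow as tf

# hypothetical toy model, just for illustration
model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation='relu', input_shape=(20,)),
    tf.keras.layers.Dense(10, activation='softmax'),
])

for layer in model.layers:
    if isinstance(layer, tf.keras.layers.Dense):
        w, b = layer.trainable_weights
        # capture w and b as default arguments so each lambda keeps its own pair
        model.add_loss(lambda w=w, b=b: l1_special_reg(w, b))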

Bas Krahmer