
I'd like to regularise the weights of a network with both L1 and L2 regularisation. However, I can't find a way to vary the strengths of the two regularisations independently, and the Keras documentation doesn't shed any light on this either.

So, is there a way to use different strengths in the l1_l2 regulariser? Or perhaps an alternative method to achieve the same result?

My current model is simply:

from keras.models import Sequential
from keras.layers import Dense
from keras import regularizers as reg

stren = 0.001
model = Sequential()
model.add(Dense(64, input_dim=148, activation='relu', kernel_regularizer=reg.l2(stren)))
model.add(Dense(1, activation='sigmoid', kernel_regularizer=reg.l2(stren)))

And I'd like to be able to have something along the lines of:

kernel_regularizer=reg.l1_l2(l1_str, l2_str)
Felix

2 Answers


Of course you can vary the strengths of regularizers independently:

from keras import regularizers

regularizers.l1_l2(l1=0.001, l2=0.1) # the strength of l1 is set to 0.001 and l2 to 0.1
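Applied to the model from the question, that might look like the following (a minimal sketch; the layer sizes come from the question and the 0.001/0.1 strengths are just illustrative values):

from keras.models import Sequential
from keras.layers import Dense
from keras import regularizers as reg

l1_str, l2_str = 0.001, 0.1  # independent L1 and L2 strengths
model = Sequential()
model.add(Dense(64, input_dim=148, activation='relu',
                kernel_regularizer=reg.l1_l2(l1=l1_str, l2=l2_str)))
model.add(Dense(1, activation='sigmoid',
                kernel_regularizer=reg.l1_l2(l1=l1_str, l2=l2_str)))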
  • Thank you very much! It's odd they didn't mention that in the documentation. – Felix Jun 19 '18 at 20:03

Perhaps you could customise the regularisation to suit your loss function by writing a user-defined regularisation function in the Keras framework. Something like this:

from keras.regularizers import L1L2

def l1_l2(l1=0.01, l2=0.01):
    # wrap the built-in L1L2 class with your own default strengths
    return L1L2(l1=l1, l2=l2)
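If you need a penalty other than L1/L2, Keras also accepts any callable that maps the weight tensor to a scalar as a kernel_regularizer. A minimal sketch (the name custom_reg and the 0.01 coefficient are just for illustration):

from keras import backend as K

def custom_reg(weight_matrix):
    # sum of absolute weights, scaled - equivalent to an L1 penalty
    return 0.01 * K.sum(K.abs(weight_matrix))

You would then pass it like any built-in regularizer, e.g. Dense(64, kernel_regularizer=custom_reg).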

Or add dropout between layers, e.g. Dropout(0.2), as sketched below.
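For instance, the question's model with dropout inserted between the layers (the 0.2 rate is just an example):

from keras.models import Sequential
from keras.layers import Dense, Dropout

model = Sequential()
model.add(Dense(64, input_dim=148, activation='relu'))
model.add(Dropout(0.2))  # randomly zeroes 20% of its inputs during training
model.add(Dense(1, activation='sigmoid'))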

Wendong Zheng