
I'm trying to train a sparse model; that is, some of the model parameters have to remain zero during optimization.

Is it possible in Keras to define a mask for the parameters so that the optimizer does not update the masked ones?

Unfortunately, freezing an entire layer would not work, as I need to mask parameters in a more fine-grained fashion.

soundslikeodd
soroosh.strife

1 Answer


You can use tf.where to select elementwise between the parameters and tf.stop_gradient(parameters). The masked entries keep their current values in the forward pass, but no gradient flows to them, so the optimizer never updates them.
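A minimal sketch of the idea (the variable shapes, the mask, and the toy loss are illustrative assumptions, not from the answer):

```python
import tensorflow as tf

# Hypothetical weight matrix and binary mask: 1 = trainable, 0 = frozen.
w = tf.Variable([[0.0, 1.0], [2.0, 0.0]])
mask = tf.constant([[0.0, 1.0], [1.0, 0.0]])

with tf.GradientTape() as tape:
    # Elementwise select: unmasked entries come from w (gradient flows),
    # masked entries come from tf.stop_gradient(w) (gradient is blocked).
    w_masked = tf.where(mask > 0, w, tf.stop_gradient(w))
    loss = tf.reduce_sum(w_masked ** 2)  # toy loss for demonstration

grad = tape.gradient(loss, w)
# grad is 2*w where the mask is 1, and exactly 0 where the mask is 0,
# so an optimizer applying this gradient leaves the masked weights untouched.
```

In a real model you would build `w_masked` inside the layer's forward pass (e.g. in a custom layer's `call`), so every training step uses the masked weights.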

Alexandre Passos