
I need to design a multilayer perceptron with two hidden layers and one output layer. The weight matrix between the second hidden layer and the output layer contains some non-trainable weights that are fixed at 0 on theoretical grounds. Is there a way to do this in Keras?

user5016984
  • You can freeze the weights of a layer: `layer.trainable = False`. Please do some research before asking and look at the [documentation](https://keras.io/getting-started/faq/#how-can-i-freeze-keras-layers). – today May 14 '18 at 19:54
  • 1
    I in fact researched about layer.trainable = False, but that is not what I want. My understanding is that it freezes the entire layer. I want to only freezes specific weights in one layer, while other weights remain free. – user5016984 May 14 '18 at 22:54

0 Answers
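
---

No answer was posted, but since the comment thread establishes that `layer.trainable = False` is all-or-nothing, a per-weight mask is a natural workaround. Below is a minimal sketch, assuming TensorFlow's `tf.keras` API; the `MaskedDense` layer name and the mask shape are illustrative, not from the thread. It multiplies the kernel by a fixed binary mask inside `call`, so masked entries contribute 0 to the output and receive zero gradient, keeping them fixed at 0 throughout training:

```python
import numpy as np
import tensorflow as tf
from tensorflow import keras

class MaskedDense(keras.layers.Layer):
    """Dense layer whose kernel is multiplied by a fixed binary mask.

    Entries where mask == 0 contribute nothing to the output and
    receive zero gradient, so they stay fixed at 0 during training.
    """
    def __init__(self, units, mask, **kwargs):
        super().__init__(**kwargs)
        self.units = units
        # Fixed (non-trainable) mask: 1 = trainable weight, 0 = fixed at 0.
        self.mask = tf.constant(mask, dtype=tf.float32)

    def build(self, input_shape):
        self.kernel = self.add_weight(
            shape=(input_shape[-1], self.units),
            initializer="glorot_uniform",
            trainable=True,
        )
        self.bias = self.add_weight(
            shape=(self.units,), initializer="zeros", trainable=True
        )

    def call(self, inputs):
        # Masked kernel: zeroed entries are effectively frozen at 0.
        return tf.matmul(inputs, self.kernel * self.mask) + self.bias

# Hypothetical sizes: 8 units in the second hidden layer, 3 outputs.
mask = np.ones((8, 3), dtype="float32")
mask[0, 0] = 0.0  # this particular weight is fixed at 0 by theory

model = keras.Sequential([
    keras.layers.Dense(16, activation="relu", input_shape=(20,)),
    keras.layers.Dense(8, activation="relu"),
    MaskedDense(3, mask),
])
model.compile(optimizer="adam", loss="mse")
```

An alternative would be a custom `kernel_constraint` that multiplies the kernel by the same mask, but constraints are applied only after each optimizer step; masking inside `call` keeps the frozen entries at exactly 0 at all times.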