
I would like to have a layer in a sequential model that has some fixed, non-trainable weights, sparsely distributed inside the layer.

For example, I create a model with a few layers:

model = Sequential()
model.add(Dense(units=n_nodes, activation=activation, kernel_initializer='glorot_uniform', input_shape=(n_nodes,)))
model.add(Dense(...))

Then I compile and fit the model, obtaining the two layers with their trained weights.

Next, I would like, for example, to take `model.layers[0]`, modify and fix some of its weights, and then retrain the network.

The trained layer for example is

a b c
d e f
g h i

and I want it to be like this:

A* b  c
d  e  F*
g  H* I*

where A*, F*, H* and I* are the edited weights, set to be non-trainable, so that after another round of training the layer looks something like this:

A*  b2  c2
d2  e2  F*
g2  H*  I*

My network is built in Keras, and I have not found a way to do this transformation. Is it even possible? I thought about creating a custom layer, but I can't figure out how to make only some of the values non-trainable.
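For reference, the value-editing step on its own (leaving aside the trainability question) can be sketched with `get_weights`/`set_weights`; the fixed values below are hypothetical placeholders for A*, F*, H* and I*:

```python
import numpy as np
import tensorflow as tf

# Hypothetical 3x3 Dense layer, mirroring the question's example.
layer = tf.keras.layers.Dense(3)
model = tf.keras.Sequential([tf.keras.Input(shape=(3,)), layer])

# Pull the kernel out as a NumPy array, overwrite the entries
# to be fixed, and push the result back into the layer.
kernel, bias = layer.get_weights()
kernel[0, 0] = 0.5   # A*
kernel[1, 2] = -0.5  # F*
kernel[2, 1] = 0.25  # H*
kernel[2, 2] = 1.0   # I*
layer.set_weights([kernel, bias])
```

This only edits the values; keeping them fixed during a subsequent `fit` is the remaining problem.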

    You could add constraints to the weights, such as [shown here](https://www.tensorflow.org/api_docs/python/tf/keras/constraints/Constraint) – JE_Muc Sep 14 '21 at 15:14
  • @JE_Muc Thank you, I will look into constraints, but so far I have not been able to set individual weights to be non-trainable – Artem Glukhov Sep 14 '21 at 15:28
  • 1
    Just make a constraint class and assign fixed values to the non trainable tensor cells, f.i. instead of `return w * tf.cast(tf.math.greater_equal(w, 0.), w.dtype)` in [the example](https://www.tensorflow.org/api_docs/python/tf/keras/constraints/Constraint#expandable-1), try something like `w[0, 0] = A`, `w[1, 2] = F` etc. and then `return w`. – JE_Muc Sep 14 '21 at 15:38

1 Answer


I'm not sure if this will work, but you could add custom constraints to your layer weights as shown here.

To fix specific weights, use a custom constraint class (thanks @Artem Glukhov for mentioning tf.keras.backend.set_value):

class FixWeights(tf.keras.constraints.Constraint):
    def __call__(self, w):
        # Overwrite the frozen entries with their fixed values
        # after each optimizer update.
        tf.keras.backend.set_value(w[0, 0], A)
        tf.keras.backend.set_value(w[1, 2], F)
        tf.keras.backend.set_value(w[2, 1], H)
        tf.keras.backend.set_value(w[2, 2], I)
        return w

and add this to your layer with:

tf.keras.layers.Dense(..., kernel_constraint=FixWeights())
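If `set_value` on a sliced variable does not work in your TensorFlow version, an alternative sketch (my own variation, not part of the original answer) uses a boolean mask and `tf.where` inside the constraint, which avoids item assignment entirely:

```python
import numpy as np
import tensorflow as tf

class FreezeEntries(tf.keras.constraints.Constraint):
    """Reset selected kernel entries to fixed values after each
    optimizer step, leaving the remaining entries trainable."""

    def __init__(self, mask, values):
        # mask: boolean array, True where the weight is frozen;
        # values: array of the fixed values; both must match the
        # kernel shape (here 3x3, i.e. 3 inputs -> 3 units).
        self.mask = tf.constant(mask, dtype=tf.bool)
        self.values = tf.constant(values, dtype=tf.float32)

    def __call__(self, w):
        return tf.where(self.mask, self.values, w)

# Freeze A* (0,0), F* (1,2), H* (2,1), I* (2,2) with
# hypothetical fixed values.
mask = np.zeros((3, 3), dtype=bool)
fixed = np.zeros((3, 3), dtype=np.float32)
for (i, j), v in {(0, 0): 0.5, (1, 2): -0.5,
                  (2, 1): 0.25, (2, 2): 1.0}.items():
    mask[i, j] = True
    fixed[i, j] = v

layer = tf.keras.layers.Dense(3, kernel_constraint=FreezeEntries(mask, fixed))
```

Note that constraints are applied after each gradient update, so gradients are still computed for the frozen entries; their values are simply reset each step.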
  • Thank you for the answer, but it turns out that `w[x, y]` are 'ResourceVariable' objects and thus don't support item assignment – Artem Glukhov Sep 15 '21 at 12:13
  • 1
    I solved by using `keras.backend.set_value(w[0,0], a)` and returning w as you did. Thank you very much! – Artem Glukhov Sep 15 '21 at 15:07
  • 1
    You are welcome and thanks from my side for pointing out the correct way to assign values to a weight tensor! I added this to my answer. – JE_Muc Sep 16 '21 at 13:25