
I am trying out something where I am required to freeze some selected weights. Take this example:

from keras.models import Sequential
from keras.layers import Dense

model = Sequential()
model.add(Dense(4, input_shape=(4,), activation='relu'))
model.add(Dense(3, name="hidden", activation='relu'))
model.add(Dense(2, activation='sigmoid'))
model.compile(loss='mse', optimizer='adam')

print(model.layers[1].get_weights()[0])

This prints the weight matrix feeding the hidden layer:

# Weights input x hidden
# Freeze 2Rx3C and 4Rx2C
# 2Rx3C=0.14362943; 4Rx2C=-0.23868048
array([[-0.05557871,  0.10941017, -0.59108734],
       [ 0.37056673,  0.2968588 ,  0.14362943],
       [-0.05471832, -0.21425706,  0.6455065 ],
       [-0.7883829 , -0.23868048, -0.517396  ]], dtype=float32)

From this weight matrix I want to freeze the values at (2nd row, 3rd column) and (4th row, 2nd column), that is, 0.14362943 and -0.23868048 respectively. I don't want these values to be updated during backprop. How can I freeze these selected weights?

Eka

1 Answer


You can use tf.stop_gradient (here wrapped around tf.identity, which copies the tensor) to exclude part of a weight matrix from backprop, then concatenate the frozen and trainable parts back together:

matrix = tf.Variable(...)  # your full weight matrix
trainable_part = matrix[:, :2]                                # columns that keep training
frozen_part = tf.stop_gradient(tf.identity(matrix[:, 2:]))    # columns excluded from backprop
matrix_out = tf.concat((trainable_part, frozen_part), axis=1) # reassemble the full matrix
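Note that slicing like this can only freeze whole rows or columns. To freeze individual entries, such as (2nd row, 3rd column) and (4th row, 2nd column) in the question, a common alternative is a binary mask combined with tf.stop_gradient. A minimal sketch, assuming TensorFlow 2.x (variable names are illustrative):

```python
import numpy as np
import tensorflow as tf

# 0 where a weight stays frozen, 1 where it keeps training.
mask_np = np.ones((4, 3), dtype=np.float32)
mask_np[1, 2] = 0.0  # 2nd row, 3rd column
mask_np[3, 1] = 0.0  # 4th row, 2nd column
mask = tf.constant(mask_np)

w = tf.Variable(tf.random.normal((4, 3)))

with tf.GradientTape() as tape:
    # Gradients flow only through the unmasked entries;
    # the frozen entries still contribute to the forward pass.
    w_effective = w * mask + tf.stop_gradient(w * (1.0 - mask))
    loss = tf.reduce_sum(w_effective)

grad = tape.gradient(loss, w)  # zero at the frozen positions
```

Using w_effective in place of the raw variable keeps the frozen values in the forward pass while their gradient is zero, so an optimizer step leaves them unchanged.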
PC_11