
I am considering creating a customized neural network. The basic structure is the same as usual, but I want to truncate the connections between layers. For example, if I construct a network with two hidden layers, I would like to delete some weights and keep the others, like so:

[image: a network with two hidden layers in which only selected connections between layers are kept]

This is not conventional dropout (used to avoid overfitting), since the remaining weights (connections) should be specified and kept fixed.

Is there any way to do this in Python, with TensorFlow, PyTorch, Theano, or any other module?

Yuan
  • Yes, you can achieve this in PyTorch by [multiplying your weights by a mask](https://stackoverflow.com/questions/53544901/how-to-mask-weights-in-pytorch-weight-parameters). – iacob Mar 12 '21 at 13:08
  • Does this answer your question? [How to mask weights in PyTorch weight parameters?](https://stackoverflow.com/questions/53544901/how-to-mask-weights-in-pytorch-weight-parameters) – iacob Mar 12 '21 at 13:09
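The masking approach suggested in the comments can be sketched in PyTorch. This is a minimal illustration, not code from the linked question: the `MaskedLinear` module name and the mask layout are assumptions, but `register_buffer` and `nn.functional.linear` are standard PyTorch APIs.

```python
import torch
import torch.nn as nn


class MaskedLinear(nn.Module):
    """A linear layer whose weight matrix is element-wise gated by a fixed 0/1 mask."""

    def __init__(self, in_features, out_features, mask):
        super().__init__()
        self.linear = nn.Linear(in_features, out_features)
        # register_buffer stores the mask on the module (moves with .to()/.cuda())
        # but keeps it out of the trainable parameters.
        self.register_buffer("mask", mask.float())

    def forward(self, x):
        # Multiply the weights by the mask on every forward pass, so killed
        # connections contribute nothing and receive zero gradient.
        return nn.functional.linear(x, self.linear.weight * self.mask, self.linear.bias)
```

The mask must have the same shape as the weight, i.e. `(out_features, in_features)`; a `1` keeps a connection and a `0` removes it permanently.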

1 Answer


Yes, you can do this in TensorFlow.

You would have a layer in your TensorFlow code that looks something like this:

m = tf.Variable(tf.random.normal([width, height]), dtype=tf.float32)
b = tf.Variable(tf.zeros([height]), dtype=tf.float32)
h = tf.sigmoid(tf.matmul(x, m) + b)

What you want is a new matrix, let's call it k for kill, that zeroes out specific neural connections. The connections are defined by the weight matrix m. This would be your new configuration:

k = tf.constant(kill_matrix, dtype=tf.float32)
m = tf.Variable(tf.random.normal([width, height]), dtype=tf.float32)
b = tf.Variable(tf.zeros([height]), dtype=tf.float32)
h = tf.sigmoid(tf.matmul(x, tf.multiply(m, k)) + b)

Your kill_matrix is a matrix of 1s and 0s, with the same shape as m. Insert a 1 for every neural connection you want to keep and a 0 for every one you want to kill.
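The arithmetic behind the kill matrix can be checked with plain NumPy, independent of TensorFlow. The shapes and the example kill matrix below are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
width, height = 4, 3

x = rng.normal(size=(2, width))       # batch of 2 input rows
m = rng.normal(size=(width, height))  # full weight matrix
b = rng.normal(size=(height,))
k = np.array([[1, 0, 1],
              [0, 1, 1],
              [1, 1, 0],
              [0, 0, 1]], dtype=float)  # kill matrix: 1 = keep, 0 = kill


def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))


h = sigmoid(x @ (m * k) + b)

# A killed weight has no influence on the output: perturbing m only where
# k == 0 leaves h unchanged, because (1 - k) * k == 0 element-wise.
m2 = m + 100.0 * (1 - k)
h2 = sigmoid(x @ (m2 * k) + b)
assert np.allclose(h, h2)
```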

Anton Codes
  • Thanks. The kill matrix looks good. It inspires me. I will try it. – Yuan May 10 '17 at 15:20
  • May I ask another question? Will this modification affect an optimizer like Adam? As I understand it, backpropagation updates the weights. Now that we truncate the connections, will backpropagation still update the killed weights, which have no effect on the feedforward pass? Or should I interpret it as: the update happens but has no effect on the feedforward pass. – Yuan May 10 '17 at 15:29
  • As an aside, the *kill layer* might better be called a *gate layer* or *suppress layer* for clarity. To answer your question about Adam (or any other optimizer): **this will not negatively affect the optimizer**. The *suppress layer* suppresses both the feedforward and the backpropagation step through the suppressed (or **killed**) connections. – Anton Codes May 10 '17 at 15:39
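The claim in the last comment can be verified directly with PyTorch autograd (a small sketch with made-up shapes, not the asker's actual network): the gradient with respect to a killed weight is exactly zero, so gradient-based optimizers leave it untouched.

```python
import torch

torch.manual_seed(0)
m = torch.randn(4, 3, requires_grad=True)  # trainable weight matrix
k = (torch.rand(4, 3) > 0.5).float()       # fixed 0/1 kill matrix
x = torch.randn(2, 4)

# Forward pass through the masked weights, then backpropagate a scalar loss.
h = torch.sigmoid(x @ (m * k))
h.sum().backward()

# Gradients flow only through the kept connections: wherever k == 0,
# the chain rule multiplies the upstream gradient by 0.
assert torch.all(m.grad[k == 0] == 0)
```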