
I want to know how I can use entropy instead of summation in TensorFlow. I have code for object classification, and I use a summation operation to add two tensors. For example:

# tf.nn.conv2d expects filter *tensors*, not shape lists
w1 = tf.Variable(tf.truncated_normal([1, 1, 3, 32]))
w2 = tf.Variable(tf.truncated_normal([3, 3, 3, 32]))
layer1 = tf.nn.conv2d(inp, w1, [1, 1, 1, 1], 'SAME')
layer2 = tf.nn.conv2d(inp, w2, [1, 1, 1, 1], 'SAME')
res = tf.add(layer1, layer2)

I want to replace the summation (tf.add) with entropy. In other words, I want to combine these two inputs using entropy.

How can I do this in TensorFlow or in plain Python?

It is fine if someone just gives me the equation for doing this. Also, is this called Shannon entropy, or is it something different?
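For reference, the Shannon entropy of a discrete distribution p is H(p) = -Σᵢ pᵢ log pᵢ. One *possible* way to use it to combine two tensors is an entropy-weighted sum; the normalisation and weighting scheme below are my own assumptions for illustration (not from any specific paper, and not a built-in TensorFlow op), sketched in NumPy for clarity:

```python
# Sketch only: Shannon entropy, plus a hypothetical entropy-weighted
# combination of two arrays (lower entropy = more "confident" source
# = larger mixing weight). The scheme is an assumption, not standard.
import numpy as np

def shannon_entropy(p, eps=1e-12):
    """Shannon entropy H(p) = -sum(p * log p), in nats; eps guards log(0)."""
    p = np.asarray(p, dtype=np.float64)
    return float(-np.sum(p * np.log(p + eps)))

def entropy_weighted_sum(x1, x2):
    """Combine two same-shaped arrays: normalise each to a distribution,
    then weight each source by softmax(-entropy)."""
    def neg_entropy(x):
        p = np.abs(x).ravel()
        p = p / (p.sum() + 1e-12)  # normalise to a distribution
        return -shannon_entropy(p)
    s = np.array([neg_entropy(x1), neg_entropy(x2)])
    w = np.exp(s - s.max())       # softmax over the two scores
    w = w / w.sum()
    return w[0] * x1 + w[1] * x2

# A uniform distribution has maximal entropy log(n):
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # ≈ log(4) ≈ 1.386
```

The same ops (`tf.reduce_sum`, `tf.log`, `tf.nn.softmax`) exist in TensorFlow, so the sketch translates directly if this is actually the combination you want.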

EDIT:

I tried to use something approximating this paper for my problem. They used multiple sources and entropy. I could not understand the equation very well, so I have some misunderstandings.

programmer
  • Entropy is a quantity of a probability distribution. It is not an "alternative" to adding two signals. Please either expand on what you are trying to achieve, or explain where the idea of "entropy" in this use case comes from – lejlot Aug 03 '18 at 22:55
  • Please check the edit; I mentioned the source of the idea – programmer Aug 05 '18 at 14:11

0 Answers