
I am using a custom softmax function. I am trying to use the shape of tensor `x` as an element of the shape of a new tensor of zeros, but this cannot be done because the shape entry is a tensor, not an int.

from keras import backend as K

def custom_softmax(x):
    sh = K.shape(x)  # symbolic shape: a tensor, not a tuple of ints
    ...      
    # fails: K.zeros expects integer dimensions, but sh[0] is a tensor
    xc = K.zeros((sh[0] * 16 * 16, 1))
    ...

The next option I tried was evaluating the tensor, which should work but does not:

from keras import backend as K

def custom_softmax(x):
    sh = K.shape(x)
    sess = K.get_session()
    ...      
    # tries to force evaluation of the symbolic batch size
    xc = K.zeros((sh[0].eval(session=sess) * 16 * 16, 1))
    ...

This gives me the error

tensorflow.python.framework.errors_impl.InvalidArgumentError: You must feed a value for placeholder tensor 'image_part_input' with dtype float

which is completely confusing, since it claims the main network input is incorrect. The network works when I hardcode the shape values in `K.zeros`. Is there any other solution?

  • Can you provide more context? For example, is `sh[0]` the batch size, or is it `None` (which indicates it could be any value), or is it a fixed hyperparameter? – Yu-Yang Aug 02 '17 at 11:33
  • @Yu-Yang `sh` is tensor with four values `[batch_size, height, width, channels]`. It is collected with `sh = K.shape(x)` as you can see in the code in my question. `sh[0]` is in this case int that represent `batch_size`, but is still Keras tensor since `sh` is Keras tensor. – Primoz Aug 02 '17 at 12:08
  • If it's the batch size, then why don't you provide it as an argument to the function? Are you dealing with a problem that requires a non-fixed batch size? – Yu-Yang Aug 02 '17 at 13:48
  • If you're using this custom softmax function as a loss, [this](https://stackoverflow.com/a/45450946/1531463) may help. If you're using it in a `Lambda` layer, you can pass it with something like `Lambda(custom_softmax, arguments={'batch_size': batch_size})`. – Yu-Yang Aug 02 '17 at 13:54
  • @Yu-Yang it was my previous solution, but I am making implementation that has not fixed batch size now. Is there any other solution? – Primoz Aug 02 '17 at 14:31
  • 1
    Although not pure Keras, I think you can try something like `tf.fill(tf.stack([sh[0] * 16 * 16, 1]), 0.0)` (I haven't tested, though) – Yu-Yang Aug 02 '17 at 14:45
  • @Yu-Yang Thank you. It fixed my problem. – Primoz Aug 03 '17 at 06:11

1 Answer


I fixed my problem with TensorFlow instead of Keras, as suggested by @Yu-Yang. Instead of the `K.zeros` function I used `tf.fill`, which accepts a tensor as its shape argument.

import tensorflow as tf
from keras import backend as K

def custom_softmax(x):
    sh = K.shape(x)
    ...      
    # tf.fill takes a shape *tensor*, so the batch size can stay symbolic
    xc = tf.fill(tf.stack([sh[0] * 16 * 16, 1]), 0.0)
    ...

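For reference, the same trick works in standalone TensorFlow. This is a minimal sketch (assuming TF 2.x eager mode; the function and variable names are illustrative, not from the original code) showing that `tf.fill` accepts a shape that depends on the runtime batch size:

```python
import tensorflow as tf

def zeros_for_batch(x):
    # tf.shape returns a tensor, so the result shape can depend on
    # the batch size that is only known at runtime.
    sh = tf.shape(x)
    return tf.fill(tf.stack([sh[0] * 16 * 16, 1]), 0.0)

x = tf.ones((3, 16, 16, 4))  # batch of 3 "images"
z = zeros_for_batch(x)
print(tuple(z.shape))  # (768, 1), i.e. 3 * 16 * 16 rows
```

The same call works for any batch size without recompiling, which is what `K.zeros` with hardcoded dimensions could not do.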