
When doing multi-class classification, I usually get the softmax scores and predictions with the code below:

softmax_scores = tf.nn.softmax(logits=self.scores, dim=-1)
prediction = tf.argmax(self.scores, 1, name="predictions")

If the softmax_scores I get are [0.5, 0.2, 0.3], the prediction is [0]. Now I want to apply a threshold of 0.6 to softmax_scores, so that the expected prediction here is [4], which means "others". I did it as below:

threshold=0.6
self.predictions = tf.argmax(self.scores, 1, name="predictions")
x = tf.constant([num_classes], shape=self.predictions.shape, dtype=tf.int64)
self.predictions1 = tf.where(
    tf.reduce_max(tf.nn.softmax(logits=self.scores, dim=-1), 1) >= threshold,
    self.predictions, x)

And got:

File "E:\ai\wide-and-shallow cnn\text_cnn.py", line 102, in __init__
    x = tf.constant([num_classes], shape=self.predictions.shape, dtype=tf.int64)
  File "E:\Python\Python36\lib\site-packages\tensorflow\python\framework\constant_op.py", line 214, in constant
    value, dtype=dtype, shape=shape, verify_shape=verify_shape))
  File "E:\Python\Python36\lib\site-packages\tensorflow\python\framework\tensor_util.py", line 430, in make_tensor_proto
    if shape is not None and np.prod(shape, dtype=np.int64) == 0:
  File "E:\Python\Python36\lib\site-packages\numpy\core\fromnumeric.py", line 2566, in prod
    out=out, **kwargs)
  File "E:\Python\Python36\lib\site-packages\numpy\core\_methods.py", line 35, in _prod
    return umr_prod(a, axis, dtype, out, keepdims)
TypeError: __int__ returned non-int (type NoneType)
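
The traceback points at `shape=self.predictions.shape`: in the project, `self.scores` has an unknown batch dimension, so `self.predictions.shape` contains `None` and `tf.constant` cannot turn it into a concrete shape. Below is a minimal sketch of a dynamic-shape-safe alternative, assuming TF 1.x; `scores` is a hypothetical placeholder standing in for `self.scores`, and the "others" label is `num_classes`, as in the question's own `x`:

import tensorflow as tf

num_classes = 3
threshold = 0.6
scores = tf.placeholder(tf.float32, [None, num_classes])  # unknown batch size, like self.scores

predictions = tf.argmax(scores, 1, name="predictions")

# Build the "others" tensor with a dynamic shape instead of tf.constant(shape=...)
others = tf.fill(tf.shape(predictions), tf.constant(num_classes, dtype=tf.int64))
# (equivalently: others = num_classes * tf.ones_like(predictions))

softmax_scores = tf.nn.softmax(logits=scores, dim=-1)
thresholded = tf.where(tf.reduce_max(softmax_scores, 1) >= threshold,
                       predictions, others)

with tf.Session() as sess:
    print(sess.run(thresholded, {scores: [[0.5, 0.2, 0.3]]}))  # [3] -> "others"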

It worked in this demo, where all the shapes are static:

import tensorflow as tf
import numpy as np
a = tf.constant(np.arange(6), shape=(3, 2))
b = tf.reduce_max(a, 1)            # row-wise max
# c = tf.to_int32(a > 3)
c = tf.argmax(a, 1)                # per-row argmax
d = b >= 3                         # threshold condition
f = tf.constant([5], shape=c.shape, dtype=tf.int64)  # the "others" fill value
e = tf.where(d, c, f)              # argmax where d holds, otherwise 5
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(a.eval(),b.eval(),c.eval(),d.eval(),f.eval(),e.eval())
Yuguang

1 Answer


How about doing it this way, with the use of `tf.where`:

threshold = 0.6

softmax_scores = tf.nn.softmax(logits=self.scores, dim=-1)

# index of the extra "others" class (num_classes + 1), one per example in the batch
other_class_idx = tf.cast(tf.shape(softmax_scores)[1] + 1, tf.int64)
other_class_idx = tf.tile(
    tf.expand_dims(other_class_idx, 0),
    [tf.shape(softmax_scores)[0]]
)

# 1 if some class score clears the threshold, 0 otherwise
is_other = tf.reduce_max(tf.cast(softmax_scores > threshold, tf.int8), axis=1)

predictions = tf.where(
                 is_other > 0,
                 tf.argmax(softmax_scores, 1),
                 other_class_idx
              )  # 4
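
For reference, a minimal standalone run of the snippet above (TF 1.x; the two rows of logits are made-up examples, one confident and one below the threshold, with 3 classes assumed):

import tensorflow as tf

threshold = 0.6
logits = tf.constant([[2.0, 0.1, 0.1],    # confident -> argmax kept
                      [0.5, 0.2, 0.3]])   # max softmax < 0.6 -> "others"
softmax_scores = tf.nn.softmax(logits=logits, dim=-1)

other_class_idx = tf.cast(tf.shape(softmax_scores)[1] + 1, tf.int64)
other_class_idx = tf.tile(tf.expand_dims(other_class_idx, 0),
                          [tf.shape(softmax_scores)[0]])

is_other = tf.reduce_max(tf.cast(softmax_scores > threshold, tf.int8), axis=1)
predictions = tf.where(is_other > 0, tf.argmax(softmax_scores, 1), other_class_idx)

with tf.Session() as sess:
    print(sess.run(predictions))  # [0 4]
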
syltruong
  • The demo worked, but it's not working in the project. ValueError: Shape must be rank 0 but is rank 2 for 'output/cond/Switch' (op: 'Switch') with input shapes: [?,13], [?,13]. – Yuguang Jun 13 '18 at 09:59
  • I assume the `softmax_scores` tensor is of shape [?, num_classes] then. I edited the answer to reflect that. Note that we are now using `tf.where` instead of `tf.cond` – syltruong Jun 13 '18 at 11:33
  • Thank you! It works, but only without lambdas in tf.where. – Yuguang Jun 14 '18 at 01:47
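
On the last two comments: `tf.where` selects element-wise between two tensors, while `tf.cond` expects a scalar (rank-0) predicate and two zero-argument callables (lambdas), which is why the lambda version and the earlier `tf.cond` attempt both failed. A small sketch of the difference, using hypothetical 1x3 scores:

import tensorflow as tf

scores = tf.constant([[0.5, 0.2, 0.3]])
mask = tf.reduce_max(scores, 1) >= 0.6          # shape (batch,), not a scalar
other_idx = tf.constant([4], dtype=tf.int64)

# tf.where picks element-wise between two *tensors* of the same shape
preds_where = tf.where(mask, tf.argmax(scores, 1), other_idx)

# tf.cond needs a rank-0 predicate and two callables; feeding it a non-scalar
# condition is what raised "Shape must be rank 0 but is rank 2" above
preds_cond = tf.cond(tf.reduce_all(mask),
                     lambda: tf.argmax(scores, 1),
                     lambda: other_idx)

with tf.Session() as sess:
    print(sess.run([preds_where, preds_cond]))  # [array([4]), array([4])]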