
I want to know how to do auto-tagging for images.

I have tried using TensorFlow with pre-trained models several times. At first, it was quite good for classification.

But now, I need to do auto-tagging.

Using TensorFlow, the prediction results always sum to 1.

For example, something like this:

xxx.jpg prediction result:

  • Cat = 0.822
  • Dog = 0.177
  • Deer = 0.001

The sum is always 1.

What I want is something like this:

xxx.jpg prediction result:

  • Cat = 0.901
  • Dog = 0.811
  • Deer = 0.991

This is because there might be a cat, a dog, and a deer all in the same picture xxx.jpg (just like Clarifai does).

What is the basic concept needed to achieve that?

Thank you.


1 Answer


Take a look at the last layer of your network. Since, as you say, the sum of your predictions is always one, it sounds like you applied softmax (https://en.wikipedia.org/wiki/Softmax_function). If you remove it, you get independent activations for every object.
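
For a multi-label "auto-tagging" setup, the usual replacement is an independent sigmoid per class, so each score lives in [0, 1] on its own and several tags can be high for the same image. Here is a rough sketch using the Keras API; the feature size, class count and loss choice are illustrative assumptions, not taken from your model:

```python
import tensorflow as tf

# Multi-label head: one independent sigmoid score per class, so the outputs
# do not have to sum to 1 and several tags can be high at the same time.
num_classes = 3       # e.g. cat, dog, deer (placeholder)
feature_size = 2048   # e.g. bottleneck features from a pre-trained CNN (placeholder)

model = tf.keras.Sequential([
    tf.keras.layers.Dense(num_classes, activation='sigmoid',
                          input_shape=(feature_size,)),
])

# With independent 0/1 targets per class, the matching loss is binary
# (sigmoid) cross-entropy rather than categorical (softmax) cross-entropy.
model.compile(optimizer='adam', loss='binary_crossentropy')
```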

Please let me know if this helped you!

rmeertens
  • Actually, I used the tutorial here: https://www.tensorflow.org/versions/r0.11/how_tos/image_retraining/, so I have not written my own training code. Maybe it's time to do that now. I'll try it out and see if your answer really works! Thank you! – Lyn May 03 '17 at 09:21
  • Hi again! I read up on it and decided to edit the retrain.py from the tutorial link I mentioned earlier, since it simplifies things a lot for me. I tried to remove the softmax layer, which is defined in this line of retrain.py: `final_tensor = tf.nn.softmax(logits, name=final_tensor_name)`. But if I just erase it, it won't work, because `final_tensor_name` is missing later on. So I changed it to `final_tensor = tf.nn.relu(logits, name=final_tensor_name)`. The result is that the values can now get past 1, for example: cat = 1.345, dog = 0.234543, deer = 0. – Lyn May 05 '17 at 07:05
  • Anyway, I used relu instead after reading this: http://stackoverflow.com/questions/42697341/how-to-use-softmax-activation-function-at-the-output-layer-but-relus-in-the-mid – Lyn May 05 '17 at 07:08
  • So the solution was to change the softmax layer to a sigmoid layer, not just remove it. – Lyn May 09 '17 at 04:07
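
For reference, the change Lyn describes would look roughly like this inside the tutorial's retrain.py. This is only a sketch of the edit, not a full script: `logits` and `final_tensor_name` are the names from the comment above, `ground_truth` stands for whatever placeholder holds the multi-label 0/1 targets (the name is assumed here), and the loss lines only apply if you also retrain with those multi-label targets.

```python
# Old line in retrain.py (softmax forces the scores to sum to 1):
# final_tensor = tf.nn.softmax(logits, name=final_tensor_name)

# Replacement: sigmoid gives every label its own independent score in [0, 1].
final_tensor = tf.nn.sigmoid(logits, name=final_tensor_name)

# If you retrain with multi-label ground truth (several tags per image),
# the loss has to match the sigmoid output:
cross_entropy = tf.nn.sigmoid_cross_entropy_with_logits(
    labels=ground_truth, logits=logits)
loss = tf.reduce_mean(cross_entropy)
```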