What is the difference between losses/clone_0/softmax_cross_entropy_loss and losses/clone_0/aux_loss/value in inception-v4? I'm currently training a large-scale model with tf-slim and the inception-v4 network on 4 GPUs (--num_clones=4), but these two charts look completely different. After 190K steps with batch_size=128, I get the charts below:

[Image: Losses]
As you can see in the image, the total loss and the aux_loss follow a similar trend, but the softmax_cross_entropy loss behaves completely differently! Which of these losses describes the training progress better?
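For reference, here is roughly how I understand the two losses get registered, a sketch based on slim's train_image_classifier.py (the placeholder shapes, num_classes, and omitted label smoothing here are just illustrative, not my exact configuration):

```python
import tensorflow as tf
from nets import nets_factory  # from the tensorflow/models/research/slim repo

# Build inception-v4; end_points contains the auxiliary classifier output.
network_fn = nets_factory.get_network_fn(
    'inception_v4', num_classes=1001, is_training=True)

images = tf.placeholder(tf.float32, [None, 299, 299, 3])
labels = tf.placeholder(tf.float32, [None, 1001])  # one-hot labels

logits, end_points = network_fn(images)

# Auxiliary head: this is what shows up as losses/clone_0/aux_loss/value,
# down-weighted to 0.4.
if 'AuxLogits' in end_points:
    tf.losses.softmax_cross_entropy(
        onehot_labels=labels, logits=end_points['AuxLogits'],
        weights=0.4, scope='aux_loss')

# Main head: this is losses/clone_0/softmax_cross_entropy_loss.
tf.losses.softmax_cross_entropy(
    onehot_labels=labels, logits=logits, weights=1.0)

# The "total loss" chart is the sum of both plus regularization terms.
total_loss = tf.losses.get_total_loss()
```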