
I am trying to perform multi-class text classification using a deep recurrent neural network. My network incurs a huge loss of 94%, 80% and sometimes 100%, with a certain accuracy. It is surprising that with 64% validation accuracy the incurred loss is 96%. I want to understand whether the incurred loss is directly related to the accuracy, or whether the accuracy is calculated on the correctly predicted data. I am using the categorical cross-entropy function to estimate the loss.

from tensorflow.keras.optimizers import Adam
from tensorflow.keras.callbacks import EarlyStopping

# Compile with categorical cross-entropy as the loss and accuracy as the reported metric.
model.compile(optimizer=Adam(learning_rate=0.001), loss='categorical_crossentropy', metrics=['accuracy'])

print('Train...')
# Stop training once validation accuracy has not improved for 3 consecutive epochs.
early_stopping = EarlyStopping(monitor='val_accuracy', patience=3, mode='max')
model.fit(x_train, y_train,
          batch_size=32,
          epochs=10,
          callbacks=[early_stopping],
          validation_data=(x_test, y_test))
  • There is no "%" in the cross-entropy loss. If you get a loss of 0.94, that is a cross-entropy of 0.94, simple as that, not a "94% loss". – xdurch0 Nov 10 '20 at 18:25
  • @xdurch0 Thanks for your kind reply and correction. If the loss is not a percentage, then what could be the maximum value of the loss function? Consider a ternary classification problem. – Bilal Chandio Nov 10 '20 at 18:37
  • Cross-entropy uses log probabilities and can in theory be infinitely high, so there is no maximum. However, the realistic worst case would be random guessing, which would result in a loss of log(n) with n classes. So with 3 classes that would be about 1.10. – xdurch0 Nov 10 '20 at 21:03
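
To make the log(n) figure from the comment above concrete, here is a minimal sketch (NumPy and the uniform three-class prediction are assumptions used purely for illustration):

import numpy as np

# "Realistic worst case": the model predicts the uniform distribution
# [1/3, 1/3, 1/3] for every sample, so the cross-entropy of the true class
# is -log(1/3) = log(3).
print(-np.log(1.0 / 3.0))  # ~1.0986 for 3 classes, i.e. roughly 1.10

# The loss itself has no upper bound: a confident but wrong prediction
# can drive it arbitrarily high.
print(-np.log(1e-7))  # ~16.12 when the true class is given probability 1e-7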

1 Answer


The answer is no:

Loss can be seen as a distance between the true values of the problem and the values predicted by the model. The greater the loss, the larger the errors you made on the data.

Accuracy, in contrast, measures how many of the predictions are correct, regardless of how far off the wrong ones are. A low accuracy with a large loss means you made large errors on a lot of the data; a low accuracy with a low loss means you made small errors on a lot of the data; and a high accuracy with a low loss means you made small errors on only a few of the data, which is in fact what you are striving for.

Observe that accuracy is a percentage, while loss is not.
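
As a rough illustration of why the two measure different things, here is a small sketch using the Keras functional losses/metrics (the three-class labels and predicted probabilities below are invented for this example): both samples are classified correctly, so accuracy is 100%, yet the second, barely confident prediction still contributes a sizeable cross-entropy.

import numpy as np
from tensorflow.keras.losses import categorical_crossentropy
from tensorflow.keras.metrics import categorical_accuracy

# Two samples, three classes; one-hot true labels (invented for illustration).
y_true = np.array([[1., 0., 0.],
                   [0., 1., 0.]])

# Both predictions pick the correct class, but the second one only barely.
y_pred = np.array([[0.95, 0.03, 0.02],
                   [0.40, 0.45, 0.15]])

print(categorical_accuracy(y_true, y_pred).numpy())      # [1. 1.]      -> 100% accuracy
print(categorical_crossentropy(y_true, y_pred).numpy())  # ~[0.05 0.80] -> mean loss ~0.42

This is why, as in the question, a 64% validation accuracy can coexist with a loss around 0.96: the loss also reflects how confident the model is in its predictions, not just whether the arg-max class is right.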

  • I appreciate your answer, but I want to understand the relation between loss and accuracy. Do they have any *direct relation* or not? Because when we have, say, 65% validation accuracy and 94% loss, how do we interpret the relation between these two states? – Bilal Chandio Nov 10 '20 at 15:24
  • This was my point... although they often appear to be inversely proportional, there is NO mathematical relationship between them, and the apparent inverse proportionality does not always hold. Accuracy and loss have different definitions and measure different things. – Serge de Gosson de Varennes Nov 10 '20 at 15:49
  • I came to know that loss isn't measured as a percentage. For instance, if we have a loss of 1.15, it isn't 115%. In that case, what would be the maximum value of the cross-entropy loss that depicts the error rate? – Bilal Chandio Nov 11 '20 at 06:44