
I am implementing a neural network using a cross-entropy loss function. The cross-entropy error is given by:

error = -np.sum(actual_Y * np.log2(Y_pred) + (1 - actual_Y) * np.log2(1 - Y_pred))

After a few iterations, (1 - Y_pred) becomes 0 and np.log2(1 - Y_pred) starts returning -inf.

Why does this happen, and what is the solution? It is obvious that log2(0) is -inf, but how can I overcome it?
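
One workaround I have seen suggested (a minimal sketch; the eps value here is an arbitrary small constant, not anything from my actual code) is to clip Y_pred away from exactly 0 and 1 before taking the log, but I am not sure whether this is the right approach:

import numpy as np

def cross_entropy(actual_Y, Y_pred, eps=1e-12):
    # Clip predictions into [eps, 1 - eps] so np.log2 never receives 0.
    # log2(0) gives -inf, and 0 * -inf then gives nan, which corrupts the sum.
    Y_pred = np.clip(Y_pred, eps, 1 - eps)
    return -np.sum(actual_Y * np.log2(Y_pred)
                   + (1 - actual_Y) * np.log2(1 - Y_pred))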

Irfan Umar
  • Could you kindly provide the values of y_pred and actual_y for the first few iterations (until inf is reached), i.e. each time the loss is calculated? – alan.elkin Feb 29 '20 at 21:04
  • y_pred is approximately 1 and actual_y is 1 too. The loss is calculated on every iteration, i.e. every epoch. – Irfan Umar Feb 29 '20 at 21:17

0 Answers