I am implementing a neural network and using a cross-entropy loss function. The cross-entropy error is given by:
    error = -np.sum(actual_Y * np.log2(Y_pred) + (1 - actual_Y) * np.log2(1 - Y_pred))
After a few iterations, np.log2(1 - Y_pred) starts returning -inf, apparently because some predictions saturate to exactly 1.0.
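Here is a minimal sketch of what I mean (the array values are made up just to illustrate the failure, they are not my real labels or network outputs):

    import numpy as np

    # Hypothetical values: one prediction has saturated to exactly 1.0
    # while its label is 0.
    actual_Y = np.array([1.0, 0.0, 1.0])
    Y_pred = np.array([0.9, 1.0, 0.8])

    # (1 - Y_pred) contains a 0, so np.log2(0) evaluates to -inf and numpy
    # warns "RuntimeWarning: divide by zero encountered in log2"
    error = -np.sum(actual_Y * np.log2(Y_pred) + (1 - actual_Y) * np.log2(1 - Y_pred))
    print(error)  # inf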
Why does this happen, and what is the correct way to fix it?
I understand that log2(0) is -inf, but how do I avoid it?
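Would clipping the predictions away from exactly 0 and 1 before taking the log, as in this rough sketch, be an acceptable fix (the eps value here is just an arbitrary small number I picked), or is there a better way to handle it?

    import numpy as np

    def cross_entropy(actual_Y, Y_pred, eps=1e-12):
        # Clip predictions into [eps, 1 - eps] so np.log2 never sees exactly 0 or 1
        Y_pred = np.clip(Y_pred, eps, 1 - eps)
        return -np.sum(actual_Y * np.log2(Y_pred) + (1 - actual_Y) * np.log2(1 - Y_pred))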