I want to ask how to implement the cross-entropy loss for a single batch in a neural network, where the equation is:

$$L = -\frac{1}{N}\sum_{n=1}^{N}\sum_{i=1}^{C} y_{n,i}\,\log \hat{y}_{n,i}$$

Here $N$ is the batch size, $C$ is the number of classes, $y$ holds the one-hot targets, and $\hat{y}$ the softmax outputs. This is my code for the cross-entropy of a single example only:
```python
import numpy as np

def softmax_cross_entropy(y_true, y_pred):
    # Cross-entropy for one example: -sum_i y_i * log(yhat_i)
    softmax_cross_entropy_loss = -np.sum(y_true * np.log(y_pred))
    # Gradient w.r.t. the pre-softmax logits, assuming y_pred is a softmax output
    softmax_cross_entropy_grad = y_pred - y_true
    return softmax_cross_entropy_loss, softmax_cross_entropy_grad
```
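For reference, a quick sanity check of the single-example version (the numbers here are made up for illustration):

```python
y_true = np.array([0.0, 1.0, 0.0])        # one-hot target, class 1
y_pred = np.array([0.2, 0.7, 0.1])        # softmax probabilities
loss, grad = softmax_cross_entropy(y_true, y_pred)
# loss = -log(0.7) ≈ 0.357
# grad = y_pred - y_true = [0.2, -0.3, 0.1]
```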
How can I implement this for one batch (using the equation above)?
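In case it helps frame the question, here is a minimal sketch of one way this could be batched, assuming `y_true` and `y_pred` are `(batch_size, num_classes)` arrays, `y_pred` already holds softmax probabilities, and the batch loss is the mean of the per-example losses as in the equation above (the name `softmax_cross_entropy_batch` and the `eps` clipping are my own additions, not from the original code):

```python
import numpy as np

def softmax_cross_entropy_batch(y_true, y_pred, eps=1e-12):
    # y_true, y_pred: shape (batch_size, num_classes); y_pred are softmax outputs.
    n = y_true.shape[0]
    # Sum -y * log(yhat) over classes for each example, then average over
    # the batch (the 1/N in the equation); clip y_pred to avoid log(0).
    loss = -np.sum(y_true * np.log(np.clip(y_pred, eps, 1.0))) / n
    # Gradient w.r.t. the pre-softmax logits; dividing by n matches the
    # mean reduction used for the loss.
    grad = (y_pred - y_true) / n
    return loss, grad
```

Note the gradient is divided by `n` so that it is consistent with averaging the loss over the batch; if you sum the per-example losses instead, drop the division in both places.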