I have been using Encog for a while to create a neural network, and I don't think the default error function (the measure of how far the network's output is from the expected output) suits my approach.
Therefore, I would like to somehow override the error function that Propagation applies on each training iteration. I am using Backpropagation, and I believe that changing the error function will improve the network's performance.
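For context, my training code looks roughly like the standard Encog 3 XOR example below (the topology, learning rate, and momentum are placeholders, not my real values):

```java
import org.encog.engine.network.activation.ActivationSigmoid;
import org.encog.ml.data.MLDataSet;
import org.encog.ml.data.basic.BasicMLDataSet;
import org.encog.neural.networks.BasicNetwork;
import org.encog.neural.networks.layers.BasicLayer;
import org.encog.neural.networks.training.propagation.back.Backpropagation;

public class XorTraining {
    public static void main(String[] args) {
        double[][] input = { {0, 0}, {1, 0}, {0, 1}, {1, 1} };
        double[][] ideal = { {0}, {1}, {1}, {0} };

        // Standard feed-forward network: 2 inputs, 3 hidden, 1 output
        BasicNetwork network = new BasicNetwork();
        network.addLayer(new BasicLayer(null, true, 2));
        network.addLayer(new BasicLayer(new ActivationSigmoid(), true, 3));
        network.addLayer(new BasicLayer(new ActivationSigmoid(), false, 1));
        network.getStructure().finalizeStructure();
        network.reset();

        MLDataSet trainingSet = new BasicMLDataSet(input, ideal);

        // Backpropagation with placeholder learning rate 0.7 and momentum 0.3
        Backpropagation train = new Backpropagation(network, trainingSet, 0.7, 0.3);

        int epoch = 1;
        do {
            train.iteration();
            System.out.println("Epoch " + epoch + ", error: " + train.getError());
            epoch++;
        } while (train.getError() > 0.01); // built-in error measure I want to replace
    }
}
```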
Is there any way to do it?