I have a neural network with a lot of inputs, and I want to train it to realise that only one of the inputs matters. First I train it with input[1] = 1 and a target output of 10, then I train it with the exact same inputs except input[1] = 0 and a target output of 0.
I train on one example until the error is 0 before switching to the other, but the network just keeps nudging various weights up and down until the output matches the current target; it never figures out that only the weights connected to input[1] are the ones that matter. Is this a common problem, so to speak, and is there a way around it?
P.S. I'm using sigmoid activations and their derivatives.
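To make the setup concrete, here is a rough, simplified sketch of what I mean. It's a single sigmoid unit in NumPy rather than my actual network, the sizes and learning rate are placeholders, and the targets are scaled to 1 and 0 since a sigmoid output can't reach 10:

```python
import numpy as np

# Sigmoid and its derivative, as mentioned in the P.S.
def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_prime(z):
    s = sigmoid(z)
    return s * (1.0 - s)

rng = np.random.default_rng(0)

n_inputs = 10                       # "a lot of inputs" (placeholder size)
w = rng.normal(0.0, 0.1, n_inputs)  # weights of a single sigmoid unit
b = 0.0
lr = 0.5

# Two examples that are identical except for input[1].
x_on = rng.random(n_inputs)
x_on[1] = 1.0                       # target 1
x_off = x_on.copy()
x_off[1] = 0.0                      # target 0

def train_until_zero_error(x, target, w, b, tol=1e-3, max_steps=100_000):
    """Keep updating on this single example until its error is (almost) zero."""
    for _ in range(max_steps):
        z = np.dot(w, x) + b
        y = sigmoid(z)
        if abs(y - target) < tol:
            break
        delta = (y - target) * sigmoid_prime(z)  # gradient of 0.5*(y - target)^2 w.r.t. z
        w = w - lr * delta * x
        b = b - lr * delta
    return w, b

# The procedure I described: fit one example completely, then the other,
# back and forth. Each pass undoes much of what the previous one learned.
for _ in range(10):
    w, b = train_until_zero_error(x_on, 1.0, w, b)
    w, b = train_until_zero_error(x_off, 0.0, w, b)

print("weight on input[1]:", w[1])
print("other weights:", np.delete(w, 1))
```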