
I have a neural network with a lot of inputs, and I want to train it to realise that only one of the inputs matters. First I train it with input[1] = 1 and a target output of 10, then I train it with the exact same inputs except input[1] = 0 and a target output of 0.

I train on each sample until the error is 0 before switching to the other one, but the network just keeps shifting various weights up and down until the output matches the current target; it never figures out that only the weights connected to input[1] need to change. Is this a common problem, so to speak, and can it be worked around somehow?

P.S. I'm using sigmoid activations and their derivatives.
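
For reference, a minimal sketch of the training procedure described above (NumPy; the network shape, learning rate, and names are illustrative assumptions, not actual code from the question). The output layer is kept linear so that the target of 10 is reachable at all, since a sigmoid output is bounded by 1:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

n_inputs = 10                              # many inputs; only input[1] matters
W1 = rng.normal(0, 0.1, (n_inputs, 5))     # input -> hidden
W2 = rng.normal(0, 0.1, (5, 1))            # hidden -> linear output
lr = 0.05

def forward(x):
    h = sigmoid(x @ W1)
    return h, h @ W2                       # linear output so target 10 is reachable

def train_until_zero_error(x, target, tol=1e-4, max_steps=10_000):
    """Fit ONE sample until its own error is ~0 (the procedure in the question)."""
    global W1, W2
    for _ in range(max_steps):
        h, y = forward(x)
        err = y - target                   # (1, 1) residual on this sample only
        if abs(err.item()) < tol:
            return
        gW2 = h.T @ err                    # backprop for squared error
        gW1 = x.T @ ((err @ W2.T) * h * (1.0 - h))
        W1 -= lr * gW1
        W2 -= lr * gW2

base = rng.random((1, n_inputs))
x_on, x_off = base.copy(), base.copy()
x_on[0, 1], x_off[0, 1] = 1.0, 0.0         # the samples differ only in input[1]

# Alternating "fit one sample perfectly, then the other":
for _ in range(20):
    train_until_zero_error(x_on, 10.0)
    train_until_zero_error(x_off, 0.0)
```

Each call fits its one sample perfectly, but nothing constrains which weights it uses to do so, which is why the two phases keep undoing each other's adjustments.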

humudu
  • First, this is too broad. Second, I think you need to train the network on all the different samples at the same time, presenting a different sample on each pass. Then at the end you save the weights, which will be the best weights satisfying all the samples – Khalil Khalaf Jun 09 '16 at 14:00

1 Answer


What you are doing is incremental or selective learning. Each time you re-train the network on new data for several epochs, you overfit that new data. If, in your case, you don't care about incremental learning and only care about results on your data set, it is better to train on batches drawn from your data set over several iterations, until the network converges without overfitting the training data.
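
A minimal sketch of that batch-style training, under the same assumptions as the toy setup in the question (illustrative network and names; linear output layer so the target of 10 is reachable):

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

n_inputs = 10
W1 = rng.normal(0, 0.1, (n_inputs, 5))    # input -> hidden
W2 = rng.normal(0, 0.1, (5, 1))           # hidden -> linear output
lr = 0.1

base = rng.random((1, n_inputs))
X = np.vstack([base, base])
X[0, 1], X[1, 1] = 1.0, 0.0               # samples differ only in input[1]
T = np.array([[10.0], [0.0]])

# One gradient step on the WHOLE batch per iteration, so every update
# has to reduce the error on both samples at the same time.
for _ in range(5000):
    H = sigmoid(X @ W1)
    E = H @ W2 - T                        # (2, 1) residuals, both samples
    gW2 = H.T @ E / len(X)
    gW1 = X.T @ ((E @ W2.T) * H * (1.0 - H)) / len(X)
    W1 -= lr * gW1
    W2 -= lr * gW2

print((sigmoid(X @ W1) @ W2).ravel())     # should approach [10, 0]
```

Since the two samples differ only in input[1], the only input whose contribution can separate the two targets is input[1], so the fit should concentrate on the weights connected to it.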

Feras
  • Incremental learning is the only thing that really matters in this case; do you know how I can achieve it? I only receive one set of inputs per day, and the network is supposed to learn from running them individually. It's my exam project and I'm kind of panicking now, so any suggestions would be appreciated – humudu Jun 10 '16 at 10:36
  • There are many papers on this topic. I'm sure that by reading some of them you'll get good intuition for your question – Feras Jun 10 '16 at 13:28
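
For the one-sample-per-day constraint raised in the comments, the simplest approach from that literature is rehearsal (replay): store every sample seen so far and retrain on the whole stored set whenever a new one arrives, which matches the earlier suggestion of training on all samples together. A minimal sketch, using a plain linear model and hypothetical names purely for illustration:

```python
import numpy as np

n_inputs = 10
w = np.zeros(n_inputs)            # a simple linear model keeps the sketch short
b = 0.0
lr = 0.05

X_seen, t_seen = [], []           # replay buffer: every sample seen so far

def train_on_buffer(steps=2000):
    """Batch gradient descent on ALL stored samples, not just the newest."""
    global w, b
    X = np.vstack(X_seen)
    t = np.array(t_seen)
    for _ in range(steps):
        err = X @ w + b - t       # residuals over the full buffer
        w -= lr * X.T @ err / len(t)
        b -= lr * err.mean()

def new_day(x, target):
    """Called once per day with the single new sample."""
    X_seen.append(x)
    t_seen.append(target)
    train_on_buffer()             # rehearse old samples alongside the new one
```

This avoids overfitting each day's sample in isolation, at the cost of storing and re-processing old data; the papers mentioned above cover more memory-efficient variants.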