
I have written a multi-layer, feed-forward, back-propagation neural network in Python. The network has 24 nodes in the input layer, 18 nodes in the hidden layer, and 1 node in the output layer. I get good training results for a small data set, but for larger inputs I am unable to find good values for the constant parameters such as the learning rate and momentum rate. A sample of the input values:

[[1,0,1,0,1,0,1,0,1,1,1,1,0,0,0,0,0,1,1,1,0,1,0,1]
 [1,0,1,0,1,0,1,0,1,1,1,1,0,0,0,0,0,1,1,1,0,1,0,1]
 [1,0,1,0,1,0,1,0,1,1,1,1,0,0,0,0,0,1,1,1,0,1,0,1]
 ...
]

And a sample of the target values:

[[-20.0]
 [-10.0]
 [30.0]
  ...
]

The total number of samples is around 5000. I have trained this network with:

learning_rate = 0.01 
momentum_rate = 0.07

It gives good results, but it takes a long time, around 500,000 iterations. Are there any good suggestions for setting the learning rate and momentum rate so that I can get results faster? Or should I introduce a ratio that increases the learning rate over time, and if so, what should its value be?
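(For context, the momentum-based update that these two parameters control looks roughly like the following minimal sketch; the names are illustrative, not the asker's actual code.)

import numpy as np

def momentum_update(weights, gradient, velocity,
                    learning_rate=0.01, momentum_rate=0.07):
    """One weight update with momentum: part of the previous
    step (the velocity) is carried over into the new step."""
    velocity = momentum_rate * velocity - learning_rate * gradient
    return weights + velocity, velocity

# Example: a single update on a 24x18 input-to-hidden weight matrix.
w = np.random.randn(24, 18) * 0.1
v = np.zeros_like(w)
grad = np.random.randn(24, 18)   # stand-in for a real back-prop gradient
w, v = momentum_update(w, grad, v)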

lkkkk
How do you determine how good your results are? Do you have a validation set? A test set? If you do, and you get better results with a small data set than with plenty of data, there might be something wrong with your data. – cruvadom Oct 16 '14 at 23:50

1 Answer


Achieving faster learning for your network is likely going to be a process of trial and error.

If you want to obtain a result faster, you could increase the learning rate (larger weight adjustments), drop some hidden-layer neurons or inputs (fewer calculations), or reduce the number of iterations, but any of these may also reduce performance on the test and validation sets.
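For example, one common trial-and-error scheme is to start with a larger learning rate, decay it each epoch, and stop early once the error plateaus. A minimal sketch, assuming a hypothetical net.backprop(X, y, lr, momentum) method that performs one training pass and returns the epoch error:

def train(net, X, y, lr=0.1, decay=0.99, momentum=0.07,
          max_epochs=5000, tol=1e-4):
    """Decaying learning rate plus early stopping.
    net.backprop is an assumed interface, not a real library call."""
    prev_error = float('inf')
    for epoch in range(max_epochs):
        error = net.backprop(X, y, lr, momentum)
        if abs(prev_error - error) < tol:  # error has plateaued: stop
            break
        prev_error = error
        lr *= decay                        # shrink the step size each epoch
    return net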

If you have time, test and compare these options to determine whether faster learning can be achieved for your problem.

Matthew Spencer