I hope you are all well. I have two questions. 1) In my deep network, the desired target output is [1,0] for class 1 and [0,1] for class 2. However, after thousands of epochs (2000, 3000) the MSE bottoms out at about 0.234 and then stays there even beyond 3000 epochs; the output neurons settle at roughly [0.498, 0.5123] for both classes and never move towards [1,0] for class 1 and [0,1] for class 2. What should I do to improve the training result? 2) I have tried random weights between -2 and 2, between -0.2 and 0.2, and some fixed weights assigned manually, but training stalls in almost the same place every time. Any suggestions for improving my results would be appreciated. Thank you in advance.
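For concreteness, here is a rough sketch of the setup described above (Python/NumPy; the hidden-layer size of 50 is only illustrative, and the real network may differ):

```python
import numpy as np

rng = np.random.default_rng(0)

# Input is 3x19x19 frames; two output units, one per class
n_in, n_hidden, n_out = 3 * 19 * 19, 50, 2   # hidden size is an assumption

# Desired targets: [1, 0] for class 1, [0, 1] for class 2
targets = {"class1": np.array([1.0, 0.0]), "class2": np.array([0.0, 1.0])}

# Initialization schemes tried so far
W1 = rng.uniform(-2.0, 2.0, size=(n_in, n_hidden))     # range [-2, 2]
# W1 = rng.uniform(-0.2, 0.2, size=(n_in, n_hidden))   # range [-0.2, 0.2]
W2 = rng.uniform(-2.0, 2.0, size=(n_hidden, n_out))
```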
- What is the input to your network? And what optimization process are you using to tune the weights? – lmjohns3 Nov 07 '13 at 20:02
- Multiple frames, 3x19x19, and backpropagation. – khan Nov 07 '13 at 21:16
- @lmjohns3 Can you answer? – khan May 06 '14 at 19:28
- If you're using the sigmoid function, make sure the weights can go above 1 and below -1. If it still doesn't work, it may be because your NN is not complex enough to solve the problem; try adding neurons to the hidden layer and/or adding another hidden layer (a rough sketch follows below). – FrancoisBaveye Mar 25 '15 at 13:22
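As a rough illustration of that last suggestion, a minimal sketch of one sigmoid hidden layer trained with backpropagation on MSE (Python/NumPy; the toy data, hidden size, and learning rate are assumptions standing in for the real frames):

```python
import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

# Toy stand-ins for the real 3x19x19 frames: 20 samples, two classes
X = rng.normal(size=(20, 3 * 19 * 19))
T = np.zeros((20, 2)); T[:10, 0] = 1; T[10:, 1] = 1   # [1,0] for class 1, [0,1] for class 2

n_hidden = 50                                          # try increasing this if outputs stay near 0.5
W1 = rng.uniform(-0.1, 0.1, size=(X.shape[1], n_hidden))
W2 = rng.uniform(-0.1, 0.1, size=(n_hidden, 2))
lr = 0.1

for epoch in range(3000):
    H = sigmoid(X @ W1)                          # hidden activations
    Y = sigmoid(H @ W2)                          # network outputs
    delta_out = (Y - T) * Y * (1 - Y)            # MSE gradient through the output sigmoid
    delta_hid = (delta_out @ W2.T) * H * (1 - H) # backpropagated to the hidden layer
    W2 -= lr * (H.T @ delta_out) / len(X)        # weights are free to grow past +/-1
    W1 -= lr * (X.T @ delta_hid) / len(X)

# With enough epochs the outputs should move towards [1,0] and [0,1]
print(np.round(Y[:3], 3), np.round(Y[-3:], 3))
```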