
I am new to artificial neural networks, but please help me with this question.

I am trying to implement an artificial neural network for character recognition (using an MLP and an SNN). Do I need to have the same number of neurons in the output layer as the number of characters to be identified? For example, do I need 26+26+10 neurons in the output layer if I want my network to identify capital letters, lowercase letters, and digits?

What if I had to identify all the characters in the Unicode character set? How many neurons would I need in the output layer?

Are there any methods (e.g., dynamic thresholds) to reduce this number, or to dynamically add neurons to the output layer?

Please provide links to research articles if possible. Thanks.

rkrara
  • This research article describes character recognition with neural networks where you need not have the same number of output neurons as the number of classes: http://yann.lecun.com/exdb/publis/pdf/lecun-98.pdf ... It basically provides a way where, by a combination of activations of the output neurons, you can predict the output. – StrikeR Feb 14 '14 at 09:23

1 Answer


No, you don't need the output layer's size to match the number of classes.

What you need to understand is that the output layer's activations are simply an encoded representation of the network's prediction, so you can choose any encoding you want. If you want an encoding that is more compact than one node per class, the easiest way to reduce the number of nodes in the layer is to use a binary encoding.

Example: instead of using 8 nodes for 8 classes (1 node per class), you can use 3 neurons:
Class 0 is the output 0-0-0
Class 1 is the output 0-0-1
...
Class 7 is the output 1-1-1

I think you get the idea. Of course, you aren't limited to binary: you can use literally any encoding method you can think of (or google).
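The scheme above can be sketched in a few lines of plain Python. This is my own illustration, not code from the answer: it maps a class index to a binary target vector of ceil(log2(n)) bits and decodes network outputs by thresholding each neuron at 0.5 (the threshold value is an assumption, not part of the original answer).

```python
# Sketch: binary target encoding for n classes using
# ceil(log2(n)) output neurons instead of n.
import math

def encode(label, n_classes):
    """Map a class index to a binary target vector, e.g. 5 of 8 -> [1, 0, 1]."""
    bits = max(1, math.ceil(math.log2(n_classes)))
    return [(label >> i) & 1 for i in reversed(range(bits))]

def decode(outputs):
    """Threshold each output neuron at 0.5 and read back the class index."""
    label = 0
    for o in outputs:
        label = (label << 1) | (1 if o >= 0.5 else 0)
    return label

# The 62 classes from the question (26 + 26 + 10) would need only
# ceil(log2(62)) = 6 output neurons instead of 62.
print(math.ceil(math.log2(62)))   # -> 6
print(encode(5, 8))               # -> [1, 0, 1]
print(decode([0.9, 0.1, 0.8]))    # -> 5
```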

runDOSrun
  • Thanks, I got it. This is what I was expecting. I am surprised I could not think of it myself. – rkrara Feb 18 '14 at 07:18
  • Binary encoding assumes ordinality in the class labels, which probably doesn't exist. – gunes Feb 19 '19 at 21:41
  • @gunes Actually, it assumes nominal labels. A normal MLP wouldn't naturally treat "100" as the continuation of "011", so the specific encoding doesn't need to be ordinal. Anyway, this type of encoding has gone out of fashion since I originally posted this. Nowadays it's all one-hot encodings, softmax, or other non-binary output encodings. – runDOSrun Mar 18 '19 at 13:12
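For comparison, the one-hot encoding mentioned in the last comment can be sketched as follows (again my own illustration, not from the thread): one output neuron per class, with the prediction read off as the index of the largest output, as one would do after a softmax layer.

```python
# Sketch: one-hot target encoding, the common modern alternative to
# the compact binary scheme -- one output neuron per class.
def one_hot(label, n_classes):
    """Class index -> vector with a single 1, e.g. 2 of 4 -> [0, 0, 1, 0]."""
    return [1 if i == label else 0 for i in range(n_classes)]

def argmax_decode(outputs):
    """Predicted class = index of the largest output (argmax)."""
    return max(range(len(outputs)), key=lambda i: outputs[i])

print(one_hot(2, 4))                   # -> [0, 0, 1, 0]
print(argmax_decode([0.1, 0.7, 0.2]))  # -> 1
```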