
I'm trying to get started with neural networks by implementing boolean functions like AND/OR. Instead of using 0 and 1 as binary inputs, they use -1 and +1. Is there a reason why we cannot use (0, 1)? As an example: http://www.youtube.com/watch?v=Ih5Mr93E-2c

A. K.
    see here http://www.faqs.org/faqs/ai-faq/neural-nets/part2/, search for `Subject: Why not code binary inputs as 0 and 1? ` – Zaw Lin Oct 04 '13 at 19:01

2 Answers


If you really mean inputs, there is no restriction on using {-1,1}. You can just as easily use {0,1} or any other pair of real numbers (e.g., {6,42}) to define your True/False input values.

What may be confusing you in the charts is that {-1,1} is used for the outputs of the neurons. The reason for that, as @Memming stated, is the activation function used in the neuron. If tanh is used as the activation function, the neuron's output will be in the range (-1,1), whereas if you use a logistic function, its output will be in the range (0,1). Either will work for a multi-layer perceptron - you just have to define your target values (expected outputs) accordingly.
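A minimal sketch of this point (not from the answer; the weights below are hand-picked for illustration): the same single-neuron AND gate works with either encoding, as long as the target encoding matches the activation's output range.

```python
import numpy as np

def tanh_neuron(x, w, b):
    """Single neuron with tanh activation; output lies in (-1, 1)."""
    return np.tanh(x @ w + b)

def logistic_neuron(x, w, b):
    """Single neuron with logistic activation; output lies in (0, 1)."""
    return 1.0 / (1.0 + np.exp(-(x @ w + b)))

# AND with {-1,+1} inputs and targets, tanh activation
X_pm = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1]], dtype=float)
y_pm = np.array([-1, -1, -1, 1], dtype=float)
out_pm = tanh_neuron(X_pm, np.array([5.0, 5.0]), -5.0)
print(np.sign(out_pm))  # [-1. -1. -1.  1.]

# The same AND gate with {0,1} inputs and targets, logistic activation
X_01 = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y_01 = np.array([0, 0, 0, 1], dtype=float)
out_01 = logistic_neuron(X_01, np.array([10.0, 10.0]), -15.0)
print((out_01 > 0.5).astype(int))  # [0 0 0 1]
```

Only the weights, bias, and target values change between the two encodings; the network structure is identical.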

bogatron

In most cases there's no difference. Just use a logistic activation function instead of tanh. In some special settings, e.g., the Ising model, it could nontrivially change the parameter space, though.
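One way to see why there's usually no difference (my own illustration, not part of the answer) is the identity logistic(x) = (tanh(x/2) + 1) / 2: the two activations differ only by an affine rescaling, which the weights and biases of the next layer can absorb. A quick numerical check:

```python
import numpy as np

x = np.linspace(-5.0, 5.0, 11)
logistic = 1.0 / (1.0 + np.exp(-x))

# logistic(x) == (tanh(x/2) + 1) / 2, so swapping tanh for logistic only
# shifts and rescales the outputs; surrounding weights absorb the change.
print(np.allclose(logistic, (np.tanh(x / 2) + 1) / 2))  # True
```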

Memming