
I am having problems increasing the accuracy of my feed-forward neural network, coded in Python. I am not sure whether it is a genuine bug or just a flaw in my math functions, but I am getting ambiguous outputs (like 0.5) no matter how much I increase the number of iterations. My code:

from numpy import exp, array, random, dot

class NeuralNetwork():

    def __init__(self):
        random.seed(1)
        self.synaptic_weights = 2 * random.random((3, 1)) - 1     # weight matrix of shape (3, 1)

    def Sigmoid(self, x):
        return 1 / (1 + exp(-x))

    def Sigmoid_Derivative(self, x):
        return x * (1 - x)

    def train(self, Training_inputs, Training_outputs, iterations):
        output = self.think(Training_inputs)
        print("The outputs are:-", output)
        error = Training_outputs - output

        adjustment = dot(Training_inputs.T, error * self.Sigmoid_Derivative(output))
        print("The adjustments are:-", adjustment)
        self.synaptic_weights += output

    def think(self, inputs):
        inputs = array(inputs)
        return self.Sigmoid(dot(inputs, self.synaptic_weights))

# phew! the class ends..

if __name__ == "__main__":

    neural_network = NeuralNetwork()
    print("Random starting weights", neural_network.synaptic_weights)

    Training_inputs = array([[1, 1, 1],
                             [0, 0, 0],
                             [1, 0, 1]])                 # shape: 3 rows x 3 columns

    Training_outputs = array([[1, 1, 0]]).T

    neural_network.train(Training_inputs, Training_outputs, 0)

    print ("New synaptic weights after training: ")
    print (neural_network.synaptic_weights)

    # Test the neural network with a new situation.
    print ("Considering new situation [1, 0, 0] -> ?: ")
    print (neural_network.think(array([1, 0, 0])))

And this is my output:

[Running] python -u "/home/neel/Documents/VS-Code_Projects/Machine_Lrn(PY)/test.py"
Random starting weights [[-0.16595599]
 [ 0.44064899]
 [-0.99977125]]
The outputs are:- [[0.3262757 ]
 [0.5       ]
 [0.23762817]]
The adjustments are:- [[0.10504902]
 [0.14809799]
 [0.10504902]]
New synaptic weights after training: 
[[ 0.16031971]
 [ 0.94064899]
 [-0.76214308]]
Considering new situation [1, 0, 0] -> ?: 
[0.5399943]

[Done] exited with code=0 in 0.348 seconds


I have tried changing the number of iterations, but the difference is very minor. I think the problem might be in one of my math (sigmoid) functions. Other than that, the dot multiplication at line 20 may be a problem, because the adjustments look off to me.

Also, doesn't the 0.5 indicate that my network isn't learning, i.e. that it is just making a random guess?
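(For what it's worth, the 0.5 for the second training row can be reproduced directly: that row is [0, 0, 0], its dot product with any weight vector is 0, and sigmoid(0) = 0.5 exactly, so with no bias term that row can never move off 0.5. A quick sketch, reusing the seeded starting weights from the log above:)

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

# The seeded starting weights printed in the log above
weights = np.array([[-0.16595599], [0.44064899], [-0.99977125]])

# An all-zero input row has a zero dot product with ANY weights,
# and sigmoid(0) is exactly 0.5 -- no amount of training moves it.
zero_row = np.array([0, 0, 0])
print(sigmoid(zero_row.dot(weights)))  # [0.5]
```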

P.S.: I don't think my problem is a duplicate, since it deals with the accuracy of the model, while the linked question deals with unwanted outputs.

  • I'm assuming this is just an exercise to familiarize yourself with feed-forward neural networks, but I'm putting this here just in case. Check out [Tensorflow](https://www.tensorflow.org/) and [Keras](https://keras.io/) for libraries that do the heavy lifting for you and make training neural networks much easier. – Engineero Sep 25 '19 at 15:49

1 Answer


Your Sigmoid_Derivative function is wrong, something that has already been pointed out in a previous question of yours; it should be:

def Sigmoid_Derivative(self, x):
    return self.Sigmoid(x) * (1-self.Sigmoid(x))

See the Derivative of sigmoid function thread at Math.SE, as well as the discussion here.
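As a quick sanity check (an addition here, not part of the linked threads), the corrected form can be verified against a central finite difference:

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def sigmoid_derivative(x):
    # Derivative with respect to the pre-activation input x
    return sigmoid(x) * (1 - sigmoid(x))

# A central finite difference should agree closely with the analytic form
x, h = 0.7, 1e-6
numeric = (sigmoid(x + h) - sigmoid(x - h)) / (2 * h)
print(abs(sigmoid_derivative(x) - numeric))  # tiny (well below 1e-8)
```

Note that which form is right depends on what is passed in: `self.Sigmoid(x) * (1 - self.Sigmoid(x))` expects the pre-activation `x`, while `x * (1 - x)` would only be correct if `x` were already a sigmoid output.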

If correcting this still does not give the expected results, please do not alter the question above - instead, open a new one...
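For illustration only (a sketch, not the answerer's code): plugging the corrected derivative into an actual iteration loop, and applying the computed `adjustment` to the weights (the posted `train` adds `output` to the weights and never uses `adjustment`), gives a network that does learn the separable rows:

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def train(inputs, targets, iterations):
    rng = np.random.default_rng(1)          # fixed seed, as in the question
    weights = 2 * rng.random((3, 1)) - 1
    for _ in range(iterations):
        z = inputs.dot(weights)             # pre-activation
        output = sigmoid(z)
        error = targets - output
        # chain rule: grad = X.T @ (error * sigmoid'(z))
        adjustment = inputs.T.dot(error * sigmoid(z) * (1 - sigmoid(z)))
        weights += adjustment               # apply the adjustment, not the output
    return weights

X = np.array([[1, 1, 1], [0, 0, 0], [1, 0, 1]])
y = np.array([[1, 1, 0]]).T
w = train(X, y, 10000)
print(sigmoid(X.dot(w)))  # first row near 1, last row near 0, all-zero row stuck at 0.5
```

The all-zero row can never reach its target of 1 with this architecture, because there is no bias term; its output is pinned at 0.5.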

  • Thanx for the sympathies... Also I would like to update that the Sigmoid change did nothing for the accuracy. My post was regarding the accuracy of the model, but Prune has pointed it as a 'duplicate' question.... – neel g Sep 25 '19 at 16:39
  • 1
    @neelg As alredy advised pls open a new question, BUT keep in mind: 1) you really train with only *three (3)* samples? 2) the accuracy calculation (or the result) is nowhere shown in your code - please see how to create a [MCVE] (emphasis on *minimal*...) – desertnaut Sep 25 '19 at 16:45