
I am using the Encog library to solve a pattern recognition problem, following the basic example provided by Mr. Jeff Heaton. I have the pattern

1 3 5 4 3 5 4 3 1

which is my ideal pattern with output 1 (meaning it is 100% the same). Now I want to input another pattern and see how similar it is to the ideal pattern.

This is the code I use to create the network:

BasicNetwork network = new BasicNetwork();
network.AddLayer(new BasicLayer(new ActivationSigmoid(), true, NumberOfInputNeurons)); // input layer
network.AddLayer(new BasicLayer(new ActivationSigmoid(), true, 20));                   // hidden layer
network.AddLayer(new BasicLayer(new ActivationSigmoid(), true, 15));                   // hidden layer
network.AddLayer(new BasicLayer(new ActivationSigmoid(), true, 1));                    // single output neuron
network.Structure.FinalizeStructure();
network.Reset();

// XOR_INPUT / XOR_IDEAL are the array names kept from the XOR example; they hold my training data
INeuralDataSet trainingSet = new BasicNeuralDataSet(XOR_INPUT, XOR_IDEAL);

Then, I train the network:
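The loop below uses a trainer train and an epoch counter that are created before it but not shown above. A minimal sketch of those two lines, assuming Encog's resilient propagation as the trainer:

ITrain train = new ResilientPropagation(network, trainingSet); // assumed trainer; any Encog propagation trainer would do
int epoch = 1;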

do
{
    train.Iteration();
    Console.WriteLine("Epoch #" + epoch + " Error:" + train.Error);
    epoch++;
} while ((epoch <= 20000) && (train.Error > 0.001));

And finally, I print the results:

foreach (INeuralDataPair pair in trainingSet)
{
    INeuralData output = network.Compute(pair.Input);
    Console.WriteLine(pair.Input[0] + "-" + pair.Input[1] + "-" + pair.Input[2] + ....
            + ":   actual = " + output[0] + "   ideal=" + pair.Ideal[0]) ;
}

Back to my question again:

How do I enter another pattern and see if it looks like mine?

Any ideas that may lead me to a solution are welcome. Thanks!


1 Answer

I am not sure I completely follow this. Do you have more patterns than that? Or is your training set exactly one pattern, and you just want to see how similar other patterns are to it? A neural network does not really compare the degree of similarity between patterns; it is trained to produce an output vector from a training set of inputs and ideal output vectors.

If you really do want to just compare "1 3 5 4 3 5 4 3" to another similar vector, I would suggest just using Euclidean distance or a similar measure.
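For reference, a minimal C# sketch of such a comparison (the helper name is mine; no Encog required):

static double EuclideanDistance(double[] a, double[] b)
{
    double sum = 0;
    for (int i = 0; i < a.Length; i++)
    {
        double d = a[i] - b[i];   // difference per component
        sum += d * d;
    }
    return Math.Sqrt(sum);
}

// EuclideanDistance(new double[] { 1, 3, 5, 4, 3, 5, 4, 3, 1 }, otherPattern)
// returns 0 for an identical pattern; larger values mean less similar.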

If, on the other hand, you DO want to train the neural network to recognize how similar something is to that pattern, you will need to generate more training data. I would generate 1000 or so cases, compute the Euclidean distance between each random case and your vector above, and scale that to a percentage. You will also need to normalize your input vectors to the range 0 to 1 for the neural network, for best performance.
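A rough sketch of that data generation (the variable names and normalization by a known maximum y value are my assumptions; it reuses the EuclideanDistance helper above):

double[] ideal = { 1, 3, 5, 4, 3, 5, 4, 3, 1 };
double maxY = 5.0;                          // assumed maximum y value, used to scale everything into 0..1
int n = ideal.Length;
double[] idealNorm = new double[n];
for (int j = 0; j < n; j++) idealNorm[j] = ideal[j] / maxY;

var rnd = new Random();
int cases = 1000;
double[][] input = new double[cases][];
double[][] output = new double[cases][];
double maxDist = Math.Sqrt(n);              // largest possible distance between two vectors in [0,1]^n
for (int i = 0; i < cases; i++)
{
    input[i] = new double[n];
    for (int j = 0; j < n; j++) input[i][j] = rnd.NextDouble();   // random pattern, already in 0..1
    double dist = EuclideanDistance(input[i], idealNorm);
    output[i] = new[] { 1.0 - dist / maxDist };                   // 1.0 = identical, 0.0 = maximally different
}
INeuralDataSet similaritySet = new BasicNeuralDataSet(input, output);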

Edit:

Here is how I would represent this. You will have a number of input neurons equal to the maximum number of x-axis points you can have. However, you do need to normalize these values, so I would suggest establishing what the maximum Y would be and normalizing against that value. Then, for your outputs, you will have one output neuron for every possible letter you can have. Perhaps the first output neuron is A, the second B. Then use one-of-n encoding and set only ONE of the output neurons to 1 and the rest to zero:

[input pattern for A] -> [1 0 0]
[input pattern for B] -> [0 1 0]
[input pattern for C] -> [0 0 1]
[another input pattern for C] -> [0 0 1]

Use the above as your training set. Of course, if you have all 26 letters, then you have 26 outputs.
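In Encog terms, a small sketch of that training set (the pattern variables are placeholders for already-normalized input vectors, and the network would need three output neurons for this encoding):

double[][] letterInput = { patternA, patternB, patternC, anotherPatternC };  // placeholder inputs
double[][] letterIdeal =
{
    new double[] { 1, 0, 0 },   // A
    new double[] { 0, 1, 0 },   // B
    new double[] { 0, 0, 1 },   // C
    new double[] { 0, 0, 1 }    // C again
};
INeuralDataSet letterSet = new BasicNeuralDataSet(letterInput, letterIdeal);

// After training, classify an unseen pattern and read the output neurons:
INeuralData result = network.Compute(new BasicNeuralData(unknownPattern));
// result[0], result[1] and result[2] correspond to A, B and C; the largest wins.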

JeffHeaton
Thanks for your answer Mr. Jeff Heaton. Let me make my question clearer. What I have is 6 patterns (like the one I provided, but a bit longer [15-20 digits]). Those patterns represent the y-axis values from a graph. Since the x-axis is always increasing with step 1, I don't use those points. The goal (for the 1st pattern) is to see how similar the pattern is to the letter "M". So 1 3 5 4 3 5 4 3 are actually points (1,1) (3,2) (5,3) (4,4) (3,5)... and so on, which on the graph look like the letter M. I thought this problem could be solved as a pattern recognition problem using Encog, or not? – mkdavor Oct 07 '14 at 08:39
This works much, much faster than the solution I wanted to make, and it is very accurate. Thank you a lot! – mkdavor Oct 09 '14 at 10:01