public static double testElmanWithAnnealing(NeuralDataSet trainingSet, 
        NeuralDataSet validation, int maxEpoch)
{
    // create an Elman network
    ElmanPattern pattern = new ElmanPattern();
    pattern.setActivationFunction(new ActivationTANH());
    pattern.setInputNeurons(trainingSet.getInputSize());
    pattern.addHiddenLayer(8);
    pattern.setOutputNeurons(trainingSet.getIdealSize());
    BasicNetwork network = (BasicNetwork)pattern.generate();
    network.reset();

    // set up a hybrid strategy of resilient + simulated annealing
    CalculateScore score = new TrainingSetScore(trainingSet);
    final MLTrain trainAlt = new NeuralSimulatedAnnealing(
            network, score, 10, 2, 100);
    final MLTrain trainMain = 
            new ResilientPropagation(network, trainingSet);
    trainMain.addStrategy(
            new HybridStrategy(trainAlt, 0.00001, 100, 3));

    int epoch = 0;
    do {
        trainMain.iteration();
        System.out.println("Epoch #" + epoch + " Error:" + trainMain.getError());
        epoch++;
    } while(trainMain.getError() > 0.01 && epoch < maxEpoch);

    // Count how often the output's sign agrees with the ideal (+1/-1)
    int correct = 0;
    int incorrect = 0;
    for (MLDataPair pair : validation) {
        final MLData output = network.compute(pair.getInput());
        System.out.println(
                "actual=" + output.getData(0) + ",ideal=" + pair.getIdeal().getData(0));
        if (output.getData(0) * pair.getIdeal().getData(0) > 0) {
            correct++;
        } else {
            incorrect++;
        }
    }
    System.out.println("correct classifications:" + correct);
    System.out.println("incorrect classifications:" + incorrect);

    return network.calculateError(validation);
}

I have 8 floating-point inputs, normalized using a simple min/max scheme to values between -1 and 1.
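For reference, a minimal sketch of that min/max scheme (the class name and the fixed min/max values below are my own illustration, not from the question): each raw value is mapped linearly from [min, max] onto [-1, 1].

```java
// Minimal sketch of a min/max normalizer mapping [min, max] -> [-1, 1].
// (Hypothetical helper; Encog also ships its own normalization classes.)
public final class MinMaxNormalizer {
    private final double min;
    private final double max;

    public MinMaxNormalizer(double min, double max) {
        this.min = min;
        this.max = max;
    }

    // Linear map: min -> -1, max -> +1, midpoint -> 0.
    public double normalize(double x) {
        return 2.0 * (x - min) / (max - min) - 1.0;
    }
}
```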

I'm trying to classify into either a negative or a positive value (binary classification), so in the training and validation sets the ideal is either 1 or -1.

The network almost always produces the same result, or at most one or two distinct values. For example: -0.05686225929855484 around 90% of the time, and some other values occasionally.

  1. Am I using Encog wrong? Does anything in the code stand out to you as a bug?
  2. Can I do anything to punish such behaviour of the neural network?
  3. This is even worse than a random guess; surely there's a way to get better predictions. Thanks in advance.
    How do you initialize your weights? – Marcin Możejko May 03 '16 at 20:29
  • What's the class disparity (I.e., the ratio of one class to the other)? If there is a large class imbalance the network may be "learning" to always classify the majority class. If this is the case, techniques like SMOTE can be employed on the training set to overcome the issue. – DMML May 04 '16 at 01:00
  • network.reset(); initializes the weights using Nguyen-Widrow initialization. Classes are in roughly equal ratios, and so the network's output tends to centre around 0. The problem, of course, is that it outputs the same number most of the time. – Uvogin May 04 '16 at 07:11
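To make the class-balance check discussed in the comments concrete, here is a hypothetical sketch (using a plain array of ideal labels rather than Encog's data set types, to keep it self-contained) that counts +1 versus -1 ideals:

```java
// Hypothetical helper: count +1 vs -1 ideal labels to check for class imbalance.
public final class ClassBalance {
    public static int[] count(double[] ideals) {
        int positives = 0;
        int negatives = 0;
        for (double ideal : ideals) {
            if (ideal > 0) {
                positives++;
            } else {
                negatives++;
            }
        }
        return new int[] { positives, negatives };
    }
}
```

If the two counts are far apart, techniques such as oversampling the minority class (e.g. SMOTE, as suggested above) become relevant; with roughly equal counts, imbalance is unlikely to be the cause.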
