Over the last few days I have started working with the deeplearning4j library, and I have run into a problem.
My input (and test) data consist of 25 binary values per example. The training set contains 40 rows, and the network has 4 output values. My goal is to train the network so that its error is as small as possible.
I have tried different configurations (including the ones presented in the deeplearning4j examples), but I still cannot get the network to a satisfactory accuracy. What is more, the classification output looks odd: for instance, the network's output values are something like [0.31, 0.12, 0.24, 0.33].
To my mind the proper values should look like [0, 0, 0, 1], etc.
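For reference, I understand the softmax output layer gives a probability distribution over the four classes (so the four values sum to roughly 1), and that a hard prediction like [0, 0, 0, 1] comes from taking the index of the largest value. This is a minimal sketch of how I read off the prediction, assuming `network` is the trained MultiLayerNetwork and `features` is a 1 x 25 input row:

// Sketch: convert the softmax probabilities into a single predicted class index.
// Assumes `network` is the trained MultiLayerNetwork and `features` is a 1 x 25 feature row.
private static int predictClass(MultiLayerNetwork network, INDArray features) {
    INDArray probabilities = network.output(features); // e.g. [0.31, 0.12, 0.24, 0.33]
    return Nd4j.argMax(probabilities, 1).getInt(0);    // index of the largest probability
}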
My neural network configuration:
private static final int SEED = 123;
private static final int ITERATIONS = 1;
private static final int NUMBER_OF_INPUT_NODES = 25;
private static final int NUMBER_OF_OUTPUT_NODES = 4;
private static final int EPOCHS = 10;

public static MultiLayerNetwork getNeuralNetwork() {
    StatsStorage storage = configureUI();

    // Three ReLU dense layers followed by a softmax output trained with
    // multi-class cross-entropy (MCXENT).
    MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
            .seed(SEED)
            .iterations(ITERATIONS)
            .learningRate(1e-1)
            .optimizationAlgo(OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT)
            .weightInit(WeightInit.RELU)
            .updater(Updater.ADADELTA)
            .list()
            .layer(0, new DenseLayer.Builder().nIn(NUMBER_OF_INPUT_NODES).nOut(60)
                    .activation(Activation.RELU).build())
            .layer(1, new DenseLayer.Builder().nIn(60).nOut(50)
                    .activation(Activation.RELU).build())
            .layer(2, new DenseLayer.Builder().nIn(50).nOut(50)
                    .activation(Activation.RELU).build())
            .layer(3, new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                    .nIn(50).nOut(NUMBER_OF_OUTPUT_NODES)
                    .activation(Activation.SOFTMAX).build())
            .backprop(true)
            .build();

    MultiLayerNetwork network = new MultiLayerNetwork(conf);
    network.init();
    network.setListeners(new StatsListener(storage), new ScoreIterationListener(1));

    // Train for EPOCHS passes over the 40-row training set.
    DataSetIterator iterator = new ListDataSetIterator(createTrainingSet());
    for (int i = 0; i < EPOCHS; i++) {
        network.fit(iterator);
    }
    return network;
}
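createTrainingSet() is not shown above; it returns the 40 training rows as a List<DataSet>, each pairing a 1 x 25 binary feature row with a one-hot 1 x 4 label. A sketch of that shape (the values below are placeholders, not my real data):

// Hypothetical sketch of the training-set helper: 40 rows of 25 binary features,
// each paired with a one-hot label over the 4 output classes (placeholder values only).
private static List<DataSet> createTrainingSet() {
    List<DataSet> rows = new ArrayList<>();
    for (int i = 0; i < 40; i++) {
        INDArray features = Nd4j.zeros(1, 25);  // placeholder: the real rows hold the binary inputs
        INDArray label = Nd4j.zeros(1, 4);
        label.putScalar(i % 4, 1.0);            // one-hot label, e.g. [0, 0, 0, 1]
        rows.add(new DataSet(features, label));
    }
    return rows;
}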
I would be really grateful for any help. Regards,