
I'm trying to implement something like this https://www.youtube.com/watch?v=Fp9kzoAxsA4 — a GANN (Genetic Algorithm Neural Network) — using the DL4J library.

Genetic learning variables:

  • Genes: the creature's neural network weights.
  • Fitness: Total distance moved.

Neural network layers for every creature:

  • Input layer: 5 sensors, each reading 1 if there's a wall in the sensor's direction and 0 if not.
  • Output layer: a single linear output that maps to the creature's angle.
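A minimal sketch of how the sensor readings and the angle output could be wired up, in plain Java. The helper names (SensorMapping, sensorsToInput, outputToAngle) and the [-180, 180] clamp are illustrative assumptions, not taken from the repo:

```java
// Hypothetical helpers: encode the 5 wall sensors as the network's input
// vector, and map the raw linear (identity) output back to a steering angle.
public class SensorMapping {

    // Each sensor reads 1.0 if a wall is detected in its direction, else 0.0.
    public static double[] sensorsToInput(boolean[] wallDetected) {
        double[] input = new double[wallDetected.length];
        for (int i = 0; i < wallDetected.length; i++) {
            input[i] = wallDetected[i] ? 1.0 : 0.0;
        }
        return input;
    }

    // Clamp the unbounded linear output into a [-180, 180] degree turn.
    public static double outputToAngle(double rawOutput) {
        return Math.max(-180.0, Math.min(180.0, rawOutput));
    }
}
```

The resulting `double[]` would be fed to the network (e.g. via `Nd4j.create(...)`), and the single output value converted back to a heading each simulation tick.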

This is my createBrain method for the creature object:

private void createBrain() {
    Layer inputLayer = new DenseLayer.Builder()
            // 5 eye sensors
            .nIn(5)
            .nOut(5)
            // How do I initialize custom weights using creature genes (this.genes)?
            // .weightInit(WeightInit.ZERO)
            .activation(Activation.RELU)
            .build();

    Layer outputLayer = new OutputLayer.Builder()
            .nIn(5)
            .nOut(1)
            .activation(Activation.IDENTITY)
            .lossFunction(LossFunctions.LossFunction.MSE)
            .build();

    MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
            .seed(6)
            .optimizationAlgo(OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT)
            .iterations(1)
            .learningRate(0.006)
            .updater(Updater.NESTEROVS).momentum(0.9)
            .list()
            .layer(0,inputLayer)
            .layer(1, outputLayer)
            .pretrain(false).backprop(true)
            .build();

    this.brain = new MultiLayerNetwork(conf);
    this.brain.init();
}

In case it helps, I have pushed the code to this repo: https://github.com/kareem3d/GeneticNeuralNetwork

And this is the Creature class https://github.com/kareem3d/GeneticNeuralNetwork/blob/master/src/main/java/com/mycompany/gaan/Creature.java

I'm a machine learning student so if you see any obvious mistakes please let me know, thanks :)

Kareem Mohamed

2 Answers


I don't know whether you can set weights in the layer configuration (I couldn't find it in the API docs), but you can get and set the network's parameters after initializing the model.

To set them individually per layer, you can follow this example:

    Iterator<Map.Entry<String, INDArray>> paramap_iterator =
            convolutionalEncoder.paramTable().entrySet().iterator();

    while (paramap_iterator.hasNext()) {
        Map.Entry<String, INDArray> me = paramap_iterator.next();
        System.out.println(me.getKey()); // print the parameter key, e.g. "0_W"
        System.out.println(Arrays.toString(me.getValue().shape())); // print shape of INDArray
        convolutionalEncoder.setParam(me.getKey(), Nd4j.rand(me.getValue().shape())); // set some random values
    }

If you want to set all parameters of the network at once, you can use setParams() and params(), for example:

INDArray all_params = convolutionalEncoder.params();
convolutionalEncoder.setParams(Nd4j.rand(all_params.shape()));//set random values with the same shape

You can check the API docs for more information: https://deeplearning4j.org/doc/org/deeplearning4j/nn/api/Model.html#params--
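Since params() exposes the whole weight set as a single flat vector, the genetic operators can work directly on a plain double[] genome, and the child can be written back into the network with setParams(Nd4j.create(childGenes)). A sketch of such operators in plain Java (the class and method names, the uniform-crossover scheme, and the Gaussian mutation are illustrative choices, not from the question's repo):

```java
import java.util.Random;

// Sketch: genetic operators on a flat weight vector. The genome would come
// from net.params() (e.g. toDoubleVector() in recent ND4J versions) and the
// child would be written back with net.setParams(Nd4j.create(childGenes)).
public class GeneticOps {
    private static final Random RNG = new Random(42);

    // Uniform crossover: each gene is copied from one parent chosen at random.
    public static double[] crossover(double[] a, double[] b) {
        double[] child = new double[a.length];
        for (int i = 0; i < a.length; i++) {
            child[i] = RNG.nextBoolean() ? a[i] : b[i];
        }
        return child;
    }

    // Gaussian mutation: perturb each gene with probability mutationRate.
    public static double[] mutate(double[] genes, double mutationRate, double sigma) {
        double[] out = genes.clone();
        for (int i = 0; i < out.length; i++) {
            if (RNG.nextDouble() < mutationRate) {
                out[i] += RNG.nextGaussian() * sigma;
            }
        }
        return out;
    }
}
```

Because every network shares the same architecture, all genomes have the same length, so crossover and mutation never need to inspect layer shapes.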


It worked for me:

    int inputNum = 4;
    int outputNum = 3;

    MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
            .seed(123)
            .list() // layers must be added after .list(), otherwise they are dropped
            .layer(new EmbeddingLayer.Builder()
                    .nIn(inputNum) // Number of input datapoints.
                    .nOut(8) // Number of output datapoints.
                    .activation(Activation.RELU) // Activation function.
                    .weightInit(WeightInit.XAVIER) // Weight initialization.
                    .build())
            .layer(new DenseLayer.Builder()
                    .nIn(8) // Must match the previous layer's nOut.
                    .nOut(8)
                    .activation(Activation.RELU)
                    .weightInit(WeightInit.XAVIER)
                    .build())
            .layer(new OutputLayer.Builder(LossFunctions.LossFunction.NEGATIVELOGLIKELIHOOD)
                    .nIn(8)
                    .nOut(outputNum)
                    .activation(Activation.SOFTMAX)
                    .weightInit(WeightInit.XAVIER)
                    .build())
            .pretrain(false).backprop(false)
            .build();

    MultiLayerNetwork multiLayerNetwork = new MultiLayerNetwork(conf);
    multiLayerNetwork.init();

    Map<String, INDArray> paramTable = multiLayerNetwork.paramTable();
    Set<String> keys = paramTable.keySet();
    Iterator<String> it = keys.iterator();

    while (it.hasNext()) {
        String key = it.next();
        INDArray values = paramTable.get(key);
        System.out.print(key+" ");//print keys
        System.out.println(Arrays.toString(values.shape()));//print shape of INDArray
        System.out.println(values);
        multiLayerNetwork.setParam(key, Nd4j.rand(values.shape()));//set some random values
    }
Alksentrs