
The code below comes from https://deeplearning4j.org. I don't quite get the nIn and nOut params. Does the definition below create 2 layers, or 3 with one hidden layer of 1000 neurons? And what would happen if the nOut of layer 0 did not match the nIn of layer 1? Does it always have to be the same number (in this case 1000)?

.layer(0, new DenseLayer.Builder()
            .nIn(numRows * numColumns) // Number of input datapoints.
            .nOut(1000) // Number of output datapoints.
            .activation("relu") // Activation function.
            .weightInit(WeightInit.XAVIER) // Weight initialization.
            .build())
    .layer(1, new OutputLayer.Builder(LossFunction.NEGATIVELOGLIKELIHOOD)
            .nIn(1000) // Number of input datapoints (the 1000 units from layer 0).
            .nOut(outputNum) // Number of output classes.
            .activation("softmax") // Activation function.
            .weightInit(WeightInit.XAVIER) // Weight initialization.
            .build())
    .pretrain(false).backprop(true)
    .build();
Thorin

1 Answer


This is standard neural network configuration. I would suggest reading some background material if you are new to neural networks in general.

Dense layers are your standard fully connected layers; nIn and nOut define the number of inputs and outputs of each of the hidden layers.
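
For concreteness, here is the snippet from the question embedded in a full configuration. This is a minimal sketch, not the canonical example: numRows, numColumns and outputNum are assumed to be the usual MNIST values (28, 28 and 10), and the builder calls are the same older-style DL4J calls used in the question. It builds one hidden DenseLayer of 1000 units plus the OutputLayer, i.e. two weight layers; the input itself is not a separate layer object. nIn of layer 1 must equal nOut of layer 0, otherwise the weight-matrix shapes do not line up and the network cannot be initialized correctly.

    import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
    import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
    import org.deeplearning4j.nn.conf.layers.DenseLayer;
    import org.deeplearning4j.nn.conf.layers.OutputLayer;
    import org.deeplearning4j.nn.weights.WeightInit;
    import org.nd4j.linalg.lossfunctions.LossFunctions.LossFunction;

    public class MlpConfigSketch {
        public static void main(String[] args) {
            int numRows = 28, numColumns = 28; // assumed: 28x28 input, 784 values per example
            int outputNum = 10;                // assumed: 10 output classes

            MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                    .seed(123)
                    .list()
                    // Layer 0: each of the 784 inputs is connected to each of the 1000 hidden units.
                    .layer(0, new DenseLayer.Builder()
                            .nIn(numRows * numColumns)
                            .nOut(1000)
                            .activation("relu")
                            .weightInit(WeightInit.XAVIER)
                            .build())
                    // Layer 1: consumes the 1000 hidden activations, so its nIn must be 1000.
                    .layer(1, new OutputLayer.Builder(LossFunction.NEGATIVELOGLIKELIHOOD)
                            .nIn(1000)
                            .nOut(outputNum)
                            .activation("softmax")
                            .weightInit(WeightInit.XAVIER)
                            .build())
                    .pretrain(false).backprop(true)
                    .build();

            System.out.println(conf.toJson()); // inspect the resulting two-layer configuration
        }
    }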

Adam Gibson
    I am relatively new to NN's, but did read a bit about it. I thought that in a dense network all neurons are connected to all neurons in the next layer, so the number of outputs of layer 1 would be exactly the same as the number of inputs of layer 2, and therefore it would suffice just to configure the number of neurons per layer. That's why I don't understand having to define input and output for each layer. – Thorin Feb 04 '18 at 05:58
  • I am going to answer only this; after this, please read proper reference materials. Neither our javadoc nor any deep learning software package you use will teach you *everything* conceptual about DL. The number of connections from in to out is number of inputs * number of outputs. You need to know the number of inputs on the first layer. Afterwards, you can just define the number of outputs or, in your words, "neurons". Neural networks are way more complex than this though (CNNs, RNNs, other architectures, ...), and our configuration generalizes to those architectures as well (not just dense layers). – Adam Gibson Feb 04 '18 at 07:54
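
To make the connection count from the last comment concrete: a dense layer stores an nIn x nOut weight matrix plus nOut bias values. A minimal sketch in plain Java, using the sizes from the question (a 28x28 input and 10 classes are assumed):

    public class DenseParamCount {
        public static void main(String[] args) {
            int numRows = 28, numColumns = 28; // assumed input size (784 values)
            int outputNum = 10;                // assumed number of classes

            // Layer 0: every one of the 784 inputs connects to every one of the 1000 hidden units.
            int layer0 = (numRows * numColumns) * 1000 + 1000; // weights + biases = 785000
            // Layer 1: every one of the 1000 hidden units connects to every output unit.
            int layer1 = 1000 * outputNum + outputNum;         // weights + biases = 10010

            System.out.println("layer 0 parameters: " + layer0);
            System.out.println("layer 1 parameters: " + layer1);
            System.out.println("total parameters:   " + (layer0 + layer1)); // 795010
        }
    }

Because every hidden activation feeds the next layer, nIn of layer 1 is fully determined by nOut of layer 0; the builder simply asks you to state it explicitly.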