
I'm new to the Keras framework. I have read some examples of how to construct deep learning models with the Sequential and Graph classes in Keras. However, I see that, regardless of whether I use Sequential or Graph, it is assumed that each node of a layer is fully connected to all the nodes of the next layer, isn't it?

My doubt is the following: if I want to construct a deep feed-forward network that is not fully connected (for instance, the first node of the second layer is not connected to the second node of the third layer, etc.), and I even want to add skip connections between nodes that belong to non-consecutive layers, how can I implement this in Keras?

Oscar

  • You might want to consider posting this on https://datascience.stackexchange.com/ – Ben Jan 05 '18 at 10:07
  • You can also try this masking method: https://stackoverflow.com/questions/50290769/specify-connections-in-nn-in-keras/50292915#50292915 – Daniel Möller May 17 '18 at 19:32

1 Answer


You're looking for the most versatile and standard way of creating models in Keras: the functional API Model.

Skip connections

These are quite easy. Create input tensors. Pass the tensors to layers, get the output tensors. Use concatenation or other operations to join them later:

from keras.layers import Input, Dense, Concatenate
from keras.models import Model

#create an input tensor
inputTensor = Input(someInputShape)

#pass the input into the first layer
firstLayerOutput = Dense(n1)(inputTensor)

#pass the output through a second layer 
secondLayerOutput = Dense(n2)(firstLayerOutput)

#get the first output and join with the second output (the first output is skipping the second layer)
skipped = Concatenate()([firstLayerOutput,secondLayerOutput])
finalOutput = Dense(n3)(skipped)

model = Model(inputTensor,finalOutput)

Visually, this creates:

input
  |
Dense1
  |    \
Dense2 |
  |    |
   \  /
  Concat
     |
  Dense3
     |
  Output
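For concreteness, here is the same skip-connection model as a self-contained sketch. The input shape and unit counts (8, 16, 32, 4) are arbitrary placeholders, not values from the question:

```python
# Concrete sketch of the skip-connection model above.
# Shapes and unit counts (8, 16, 32, 4) are illustrative assumptions.
from keras.layers import Input, Dense, Concatenate
from keras.models import Model

inputTensor = Input(shape=(8,))

firstLayerOutput = Dense(16)(inputTensor)
secondLayerOutput = Dense(32)(firstLayerOutput)

# The first output skips the second layer and is joined back in here,
# so the concatenated tensor has 16 + 32 = 48 features
skipped = Concatenate()([firstLayerOutput, secondLayerOutput])
finalOutput = Dense(4)(skipped)

model = Model(inputTensor, finalOutput)
model.summary()
```

Calling `model.summary()` shows both `Dense` outputs feeding the `Concatenate` layer, which is exactly the skip connection drawn in the diagram.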

Custom node connections

This could be more complicated if you want to change the regular behavior of the layers. It would require custom layers, custom matrix multiplications, etc.

The suggestion is simply to work with many little layers and make as many skip connections as you want. Many parallel layers with one node each can represent a single layer with many nodes.

So you can, for instance, create lots of Dense(1) layers and treat them as nodes. Then you connect them in any way you like.
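As a rough sketch of that idea (layer names and sizes here are made up), each Dense(1) plays the role of one node, and you wire the connections explicitly. In this example the first node of the "second layer" is deliberately not connected to the second node of the "third layer", matching the question:

```python
# Sketch: Dense(1) layers as individual "nodes" with hand-wired connections.
# All names and sizes are illustrative assumptions.
from keras.layers import Input, Dense, Concatenate
from keras.models import Model

inp = Input(shape=(5,))

# "Second layer": two nodes, each a Dense(1) applied to the input
node2a = Dense(1)(inp)
node2b = Dense(1)(inp)

# "Third layer": node3a receives both nodes, but node3b only
# receives node2b, so node2a is NOT connected to it
node3a = Dense(1)(Concatenate()([node2a, node2b]))
node3b = Dense(1)(node2b)

out = Dense(1)(Concatenate()([node3a, node3b]))
model = Model(inp, out)
```

The price of this flexibility is many small layers, so it is practical only for small, hand-designed connection patterns.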

Daniel Möller
  • Thank you for the response. Yes, I think that creating layers with Dense(1) is a possible way to simulate a single hidden unit. I don't know if it is the best choice to resolve the problem, but it can certainly help. – Oscar Gabriel Reyes Pupo Jan 05 '18 at 19:07