
I keep seeing layers in Caffe that have two bottoms (sources) or two tops (destinations) in a network architecture. For example, this is SegNet's data layer, which has two tops, data and label, read from the same source file on the same line (img1.png label1.png):

layer {
  name: "data"
  type: "DenseImageData"
  top: "data"
  top: "label"
  dense_image_data_param {
    source: "train.txt"   # each line of the train file: img1.png label1.png (data and label)
    batch_size: 1
    shuffle: true
  }
}

and these layers are from Pascal-object-detection-fc:

layer {
  name: "loss"
  type: "SoftmaxWithLoss"
  bottom: "fc8_pascal"
  bottom: "label"
}
layer {
  name: "accuracy"
  type: "Accuracy"
  bottom: "fc8_pascal"
  bottom: "label"
  top: "accuracy"
  include {
    phase: TEST
  }
}

Both of these layers have two bottoms.

I would appreciate it if someone could explain the implications of having two bottoms/tops in one layer, and why not have a separate layer for each. Another thing: if a layer references itself as its top/bottom, does that mean it does not do forward/backward calculations?

Eliethesaiyan
  • what do you mean by "if a layer references itself as top/bottom"? – Shai May 10 '17 at 08:07
  • in the example I gave... the accuracy layer has top: "accuracy". It kind of confuses me: is it pointing to itself as the one processing data from its own bottoms, or just saying that it is the top of fc8_pascal? – Eliethesaiyan May 10 '17 at 08:15
  • `"bottom"` and `"top"` are names of data blobs processed by the net. You should **not** confuse their names with the layer's `"name"`. Layer's `"name"` is used to reference internal layer's parameters (if any). These parameters are **different** than processed data. – Shai May 10 '17 at 08:20

1 Answer


Think of a layer as a mathematical operation: each layer type performs a different operation. A "Convolution" layer convolves the input with the layer's internal parameters, "ReLU" performs linear rectification, etc.
Some operations do not require any inputs ("bottom"s): these are usually the input layers that bring data/labels into the net.
Other layers act on a single operand (one "bottom") and output a single result (one "top"): "Convolution", "ReLU", "Softmax", just to name a few.
Other layers may produce several outputs (many "top"s), e.g., the "Slice" layer.
And you can also find layers that take several inputs and produce a single output, e.g., the "Eltwise" layer.
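
A minimal sketch of both cases (the layer and blob names below are made up for illustration, and the parameter values are arbitrary):

# one bottom, several tops: "Slice" splits one blob into several
layer {
  name: "split_channels"
  type: "Slice"
  bottom: "mixed"            # single input blob
  top: "first_part"          # first output blob
  top: "second_part"         # second output blob
  slice_param {
    axis: 1
    slice_point: 3           # channels [0,3) go to "first_part", the rest to "second_part"
  }
}
# several bottoms, one top: "Eltwise" combines blobs element-wise
layer {
  name: "sum_branches"
  type: "Eltwise"
  bottom: "branch_a"         # first input blob
  bottom: "branch_b"         # second input blob
  top: "merged"              # single output blob
  eltwise_param {
    operation: SUM
  }
}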

Bottom line: each layer/operation requires a different number of inputs and may produce a different number of outputs. You should not confuse the input/output blobs with the layer's operation.
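
To see the difference, here is a hypothetical sketch of how blobs wire layers together (the names are illustrative, and the data layer's parameters are omitted for brevity). The two layers are connected only because the "label" blob produced by the first is consumed by the second; the layer names play no part in the wiring.

layer {
  name: "my_input"           # the layer's name, used to refer to the layer itself
  type: "Data"               # data_param omitted for brevity
  top: "data"                # blob produced by this layer
  top: "label"               # another blob produced by this layer
}
layer {
  name: "my_loss"
  type: "SoftmaxWithLoss"
  bottom: "fc8"              # blob produced by some earlier layer (not shown)
  bottom: "label"            # the same "label" blob produced by "my_input"
  top: "loss"
}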

More information about Caffe's layers can be found at caffe.help.

Shai
  • thank you very much for your clarification... if a layer references itself (name: "loss", top: "loss") does it mean that it has no other layer to forward to? – Eliethesaiyan May 10 '17 at 08:21
  • @Eliethesaiyan it only means the layer's name is the same as a blob's name. no biggie. **If** you see the same `"bottom"` and `"top"` name in the same layer, then you are looking at an ["in-place" layer](http://stackoverflow.com/q/38474899/1714410) (see the sketch after these comments). – Shai May 10 '17 at 08:24
  • ah... thank you very much... I think the in-place layers were the ones confusing me... I hadn't come across them before – Eliethesaiyan May 10 '17 at 08:27
  • ...one last question... if a layer uses a top/bottom name that is the name of another layer... is it referencing that layer's blob? – Eliethesaiyan May 10 '17 at 08:29
  • do not confuse layer names and blob names. they are two *separate* things. – Shai May 10 '17 at 08:34
  • @Shai... I understand... as far as my question is concerned... just confirming: blobs are independent of layers, so a blob such as "label" can be accessed by the loss layer even if it was defined in the input layer? – Eliethesaiyan May 10 '17 at 08:38
  • the `"labe"` blob is the `"top"` of `"input"` layer - that is `"input"` layer "generates" label information. Then `"loss"` layer uses `"label"` blob as an input it requires to compute the loss. – Shai May 10 '17 at 08:59