I am trying to train a multi-output convolutional neural network. The network takes a single input, processes it through a common stem, and then branches into separate heads, with one output per head. Currently I am training the network with one loss per head.
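For reference, a minimal sketch of the kind of architecture I mean (layer sizes, output names and losses below are just placeholders, not my actual model):

```python
import tensorflow as tf
from tensorflow.keras import layers, Model

inputs = tf.keras.Input(shape=(64, 64, 3))

# Common convolutional stem shared by all heads
x = layers.Conv2D(32, 3, activation="relu")(inputs)
x = layers.MaxPooling2D()(x)
x = layers.Conv2D(64, 3, activation="relu")(x)
x = layers.GlobalAveragePooling2D()(x)

# Head A: e.g. a classification output
head_a = layers.Dense(64, activation="relu")(x)
out_a = layers.Dense(10, activation="softmax", name="out_a")(head_a)

# Head B: e.g. a regression output
head_b = layers.Dense(64, activation="relu")(x)
out_b = layers.Dense(1, name="out_b")(head_b)

model = Model(inputs=inputs, outputs=[out_a, out_b])
```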
I'm seeing some possible competition between heads (i.e. decreasing the loss for one output increases the loss for another), so I'd like to see whether I can apply only its associated loss to each head, while still training the common stem on the combined losses, as I'm doing now for the whole network.
The losses are currently passed to model.compile as a dictionary mapping output names to losses, but apparently that results in training the whole network on the sum of all the losses.
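This is roughly how I compile it now (output names and losses are placeholders matching the sketch above):

```python
# Current setup: one loss per named output, passed as a dict to compile().
# As far as I can tell, training then uses the sum of these losses.
model.compile(
    optimizer="adam",
    loss={
        "out_a": "sparse_categorical_crossentropy",
        "out_b": "mse",
    },
)
```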
I haven't tried a custom training loop yet; I know roughly how it would work, but I'm not comfortable enough with TensorFlow to go that route if a somewhat easier solution exists.
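For completeness, this is the kind of custom loop I'm imagining if nothing easier exists (a rough sketch, assuming the stem and heads are built as separate sub-models `stem`, `head_a` and `head_b`, which is not how my current model is written):

```python
optimizer = tf.keras.optimizers.Adam()
loss_fn_a = tf.keras.losses.SparseCategoricalCrossentropy()
loss_fn_b = tf.keras.losses.MeanSquaredError()

@tf.function
def train_step(x, y_a, y_b):
    with tf.GradientTape(persistent=True) as tape:
        features = stem(x, training=True)
        pred_a = head_a(features, training=True)
        pred_b = head_b(features, training=True)
        loss_a = loss_fn_a(y_a, pred_a)
        loss_b = loss_fn_b(y_b, pred_b)
        total_loss = loss_a + loss_b

    # Each head only receives gradients from its own loss...
    grads_a = tape.gradient(loss_a, head_a.trainable_variables)
    grads_b = tape.gradient(loss_b, head_b.trainable_variables)
    # ...while the stem is trained on the combined loss.
    grads_stem = tape.gradient(total_loss, stem.trainable_variables)
    del tape  # persistent tape must be released explicitly

    optimizer.apply_gradients(zip(grads_a, head_a.trainable_variables))
    optimizer.apply_gradients(zip(grads_b, head_b.trainable_variables))
    optimizer.apply_gradients(zip(grads_stem, stem.trainable_variables))
    return loss_a, loss_b
```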