I know there's no need to put an nn.Softmax() layer at the output of a neural net when training with nn.CrossEntropyLoss as the loss function.
However, I need probabilities as outputs. Is there a way to suppress the log-softmax that nn.CrossEntropyLoss applies internally and instead use nn.Softmax() on the output layer of the network itself?
Motivation: I am using the shap package afterwards to analyze the feature influences, and I can only feed my trained model as input. The outputs then don't make any sense, because I am looking at unbounded values instead of probabilities.
Example: Instead of -69.36 as the output value for one class, I want something between 0 and 1, with the values for all classes summing to 1. Since I can't transform the outputs after the fact, the model needs to produce them in this form already during training.
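For illustration, here is a minimal sketch of my current setup (the network architecture and data here are hypothetical placeholders, not my real model):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Hypothetical small classifier; the last layer emits raw logits,
# which is what shap currently sees.
model = nn.Sequential(
    nn.Linear(4, 8),
    nn.ReLU(),
    nn.Linear(8, 3),  # unbounded logits
)

x = torch.randn(5, 4)
y = torch.randint(0, 3, (5,))

# nn.CrossEntropyLoss applies log-softmax internally, so the model
# itself never produces probabilities.
loss = nn.CrossEntropyLoss()(model(x), y)

# Wrapping the logits in softmax gives values in [0, 1] that sum to 1
# per sample -- the output format I want the trained model to produce.
probs = nn.Softmax(dim=1)(model(x))
print(probs.sum(dim=1))  # each row sums to 1
```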