
I am trying out neural compressor (Intel LPOT) to reduce the size of my CNN model implemented in PyTorch. I intend to do distillation.

Below is the code I use to distill the model.

    from neural_compressor.experimental import Distillation, common
    from neural_compressor.experimental.common.criterion import PyTorchKnowledgeDistillationLoss

    distiller = Distillation(args.config)       # distillation configuration
    distiller.student_model = model             # student: the CNN to be compressed
    distiller.teacher_model = teacher           # teacher: the larger pretrained model
    distiller.criterion = PyTorchKnowledgeDistillationLoss()
    distiller.train_func = train_func           # training loop used during distillation
    model = distiller.fit()                     # returns the distilled student model

I want to change the loss function, i.e. I need to supply a custom loss function that I have implemented in PyTorch. Currently I see that in neural compressor I can change the loss functions for the teacher and student by providing arguments to distiller.criterion, i.e. by

    distiller.criterion = PyTorchKnowledgeDistillationLoss(loss_types=['CE', 'KL']) 

I assume this works because KullbackLeiblerDivergence and cross-entropy loss are already available in neural compressor. Is there any way to provide my own custom loss function to distiller.criterion?
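
To make it concrete, the kind of loss I want to plug in looks roughly like this (a purely illustrative sketch, not my actual loss):

    import torch.nn.functional as F

    def my_custom_loss(student_logits, teacher_logits):
        # illustrative only: penalize the cosine distance between the
        # student's and the teacher's logit vectors
        return (1 - F.cosine_similarity(student_logits, teacher_logits, dim=-1)).mean()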


1 Answer


In the neural compressor source there is a class called PyTorchKnowledgeDistillationLoss which has SoftCrossEntropy and KullbackLeiblerDivergence as member functions. If you want to use your own custom loss function, add a new member function to the PyTorchKnowledgeDistillationLoss class that takes logits and targets as parameters,

e.g.

    class PyTorchKnowledgeDistillationLoss(KnowledgeDistillationLoss):
        ...

        def customLossFunction(self, logits, targets):
            # calculate the custom loss here
            return custom_loss

Then, in the init function (constructor) of PyTorchKnowledgeDistillationLoss, assign:

    self.teacher_student_loss = self.customLossFunction
    self.student_targets_loss = self.customLossFunction
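
If you prefer not to modify the installed package, the same idea can be expressed by subclassing. Below is a minimal, untested sketch; the constructor forwarding and the example loss body are assumptions, so adapt them to your version of neural compressor:

    import torch.nn.functional as F
    from neural_compressor.experimental.common.criterion import PyTorchKnowledgeDistillationLoss

    class CustomKnowledgeDistillationLoss(PyTorchKnowledgeDistillationLoss):
        def __init__(self, **kwargs):
            super().__init__(**kwargs)
            # Route the teacher/student term through the custom loss instead of
            # the built-in SoftCrossEntropy / KullbackLeiblerDivergence.
            self.teacher_student_loss = self.customLossFunction
            # The hard-label term can be swapped the same way if needed:
            # self.student_targets_loss = self.customLossFunction

        def customLossFunction(self, logits, targets):
            # Example body only: replace with your own PyTorch loss.
            return F.mse_loss(F.softmax(logits, dim=-1),
                              F.softmax(targets, dim=-1))

    distiller.criterion = CustomKnowledgeDistillationLoss()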