
I am trying to apply very aggressive pruning to a Keras model trained to recognise the GTSRB dataset. I applied the following pruning parameters:

pruning_params = {
    'pruning_schedule': tfmot.sparsity.keras.PolynomialDecay(initial_sparsity=0,
                                                             final_sparsity=0.999,
                                                             begin_step=0,
                                                             end_step=end_step,
                                                             frequency=1)
}
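For reference, here is a small sketch of how I understand the polynomial decay schedule to ramp the sparsity target over training, based on the documented formula and assuming the default exponent of 3 (this is my own standalone reimplementation for checking values, not TF-MOT code; the `end_step=1000` is just an example value):

```python
def polynomial_decay_sparsity(step, initial_sparsity=0.0, final_sparsity=0.999,
                              begin_step=0, end_step=1000, power=3):
    # Clamp the step into the pruning window.
    step = min(max(step, begin_step), end_step)
    fraction = (step - begin_step) / (end_step - begin_step)
    # Sparsity interpolates from initial to final along a cubic curve.
    return final_sparsity + (initial_sparsity - final_sparsity) * (1 - fraction) ** power

print(polynomial_decay_sparsity(0))     # 0.0  -- no pruning at the start
print(polynomial_decay_sparsity(1000))  # 0.999 -- target sparsity at end_step
```

So by `end_step` the schedule should indeed be requesting 99.9% of the weights to be zero.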

I believed this would zero out almost all of the weights and therefore reduce the accuracy of the model by a lot. Contrary to my expectations, the accuracy drop is almost insignificant when I evaluate the model on the validation dataset: it still reaches 91%.
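To make concrete what 99.9% sparsity means, here is a toy sketch of one-shot magnitude pruning on a random weight matrix (my own illustration with NumPy, not TF-MOT's actual mask-update code): it keeps only the largest-magnitude 0.1% of entries.

```python
import numpy as np

def magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude fraction `sparsity` of the weights."""
    flat = np.abs(weights).ravel()
    k = int(round(sparsity * flat.size))
    if k == 0:
        return weights.copy()
    # k-th smallest absolute value becomes the pruning threshold.
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold
    return weights * mask

rng = np.random.default_rng(0)
w = rng.normal(size=(100, 100))
pruned = magnitude_prune(w, 0.999)
print(np.mean(pruned == 0))  # ~0.999: only 10 of 10000 weights survive
```

Checking the actual zero fraction of the trained model's kernels in this way would confirm whether the schedule really took effect.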

My question is: am I applying pruning incorrectly, or might the model be so inherently redundant that most of its weights are unnecessary for a correct prediction?
