
Knowledge Distillation

Knowledge distillation is a technique for transferring knowledge from a large, complex "teacher" model into a smaller, more efficient "student" model. It improves the student's performance by training it to mimic the teacher's behavior, typically by matching the teacher's softened output probabilities (soft targets) in addition to fitting the true labels.
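As a concrete illustration, below is a minimal sketch of one distillation training step in PyTorch. All specifics here are assumptions for the sake of the example, not details from this article: the teacher and student architectures, the temperature T, the loss weight alpha, and the helper name distillation_step are all hypothetical choices.

```python
# A minimal knowledge-distillation sketch in PyTorch.
# Architectures and hyperparameters below are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical teacher (large) and student (small) classifiers.
teacher = nn.Sequential(nn.Linear(784, 1024), nn.ReLU(), nn.Linear(1024, 10))
student = nn.Sequential(nn.Linear(784, 64), nn.ReLU(), nn.Linear(64, 10))
teacher.eval()  # the teacher is frozen; only the student is trained

optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)
T = 4.0       # temperature: softens the teacher's output distribution
alpha = 0.7   # weight on the soft-target term vs. the hard-label term

def distillation_step(x, labels):
    with torch.no_grad():
        teacher_logits = teacher(x)   # teacher predictions, no gradients
    student_logits = student(x)

    # Soft-target loss: KL divergence between temperature-scaled
    # distributions, scaled by T^2 to keep gradient magnitudes comparable.
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)

    # Hard-label loss: ordinary cross-entropy against the true labels.
    hard_loss = F.cross_entropy(student_logits, labels)

    loss = alpha * soft_loss + (1 - alpha) * hard_loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Example call with random data standing in for a real batch.
x = torch.randn(32, 784)
labels = torch.randint(0, 10, (32,))
print(f"distillation loss: {distillation_step(x, labels):.4f}")
```

The T^2 factor follows the common convention of rescaling the soft-target gradients so the two loss terms stay on a comparable scale as the temperature changes.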

Coming Soon!