Shannon Strikes Again! Entropy-based Pruning in Deep Neural Networks for Transfer Learning under Extreme Memory and Computation Budgets

Abstract

Deep neural networks have become the de facto standard across many computer science domains. Nonetheless, effectively training these deep networks remains challenging and resource-intensive. This paper investigates the efficacy of pruned deep learning models in transfer learning scenarios under extremely low memory budgets, a setting tailored to TinyML. Our study reveals that the source-task model with the highest activation entropy outperforms the others on the target task. Motivated by this, we propose the entropy-based Efficient Neural Transfer with Reduced Overhead via PrunIng (ENTROPI) algorithm. Through comprehensive experiments on diverse models (ResNet18 and MobileNet-v3) and target datasets (CIFAR-100, VLCS, and PACS), we substantiate the superior generalization achieved by transfer learning from the entropy-pruned model. Quantitative entropy measures provide valuable insights into the reasons behind the observed performance improvements. The results underscore ENTROPI’s potential as an efficient solution for enhancing generalization in data-limited transfer learning tasks.
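The abstract centers on a model's activation entropy as the criterion for choosing which pruned model to transfer from. As a rough illustration of that quantity (not the paper's reference implementation), the sketch below estimates the Shannon entropy H = -Σ p log₂ p of a layer's post-ReLU activations by histogramming its responses over a batch. The layer, the bin count, and the random inputs are hypothetical choices made purely for demonstration.

```python
# Minimal sketch: estimate the Shannon entropy of a layer's activations.
# All specifics here (layer shape, bins=64, random inputs) are assumptions
# for illustration, not the ENTROPI algorithm itself.
import torch
import torch.nn as nn

def activation_entropy(activations: torch.Tensor, bins: int = 64) -> float:
    """Estimate H = -sum(p * log2 p) over a histogram of flattened activations."""
    flat = activations.detach().flatten().float()
    hist = torch.histc(flat, bins=bins,
                       min=flat.min().item(), max=flat.max().item())
    p = hist / hist.sum()
    p = p[p > 0]  # drop empty bins to avoid log(0)
    return float(-(p * torch.log2(p)).sum())

# Toy example: entropy of a small conv layer's response to a random batch.
layer = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU())
x = torch.randn(8, 3, 32, 32)  # stand-in for a batch of images
with torch.no_grad():
    h = activation_entropy(layer(x))
print(f"Estimated activation entropy: {h:.3f} bits")
```

Scoring each candidate source model (or layer) this way would let one rank them by activation entropy and, per the abstract's finding, prefer the highest-entropy one for transfer; the exact pruning procedure is detailed in the paper.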

Publication
ICCVW
Gabriele Spadaro
PhD Student
Riccardo Renzulli
Postdoc

Attilio Fiandrotti
Associate Professor
Marco Grangetto
Full Professor