Andrea Bragagnolo
Latest
Shannon Strikes Again! Entropy-based Pruning in Deep Neural Networks for Transfer Learning under Extreme Memory and Computation Budgets
Efficient Inference Of Image-Based Neural Network Models In Reconfigurable Systems With Pruning And Quantization
Loss-based sensitivity regularization: towards deep sparse neural networks
Method and apparatus for pruning neural networks
Simplify: A Python library for optimizing pruned neural networks
To update or not to update? Neurons at equilibrium in deep models
On the role of structured pruning for neural network compression
Serene: Sensitivity-based regularization of neurons for structured sparsity in neural networks
Pruning artificial neural networks: A way to find well-generalizing, high-entropy sharp minima