GABIC: GRAPH-BASED ATTENTION BLOCK FOR IMAGE COMPRESSION

Abstract

While standardized codecs like JPEG and HEVC-intra represent the industry standard in image compression, neural Learned Image Compression (LIC) codecs are a promising alternative. In particular, integrating attention mechanisms from Vision Transformers into LIC models has been shown to improve compression efficiency. However, this extra efficiency often comes at the cost of aggregating redundant features. This work proposes a Graph-based Attention Block for Image Compression (GABIC), a method to reduce feature redundancy based on a k-Nearest Neighbors enhanced attention mechanism. Our experiments show that GABIC outperforms comparable methods, particularly at high bit rates, enhancing compression performance. © 2024 IEEE
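The core idea named in the abstract, attention restricted to each query's k nearest neighbors, can be illustrated with a minimal sketch. This is a generic k-NN-masked dot-product attention, not the exact GABIC formulation: the function name, shapes, and the top-k selection rule are assumptions for illustration only.

```python
import numpy as np

def knn_attention(Q, K, V, k):
    """Dot-product attention where each query attends only to its
    k highest-scoring keys (a generic k-NN attention sketch,
    not the authors' exact method)."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)  # (n_queries, n_keys) similarities
    # Indices of the top-k scores in each query's row
    idx = np.argpartition(-scores, k - 1, axis=-1)[:, :k]
    # Additive mask: 0 for the k kept keys, -inf for the rest
    mask = np.full_like(scores, -np.inf)
    np.put_along_axis(mask, idx, 0.0, axis=-1)
    masked = scores + mask
    # Softmax over only the surviving k entries per query
    e = np.exp(masked - masked.max(axis=-1, keepdims=True))
    w = e / e.sum(axis=-1, keepdims=True)
    return w @ V  # (n_queries, d) aggregated values

# Toy usage with random features
rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))
K = rng.normal(size=(16, 8))
V = rng.normal(size=(16, 8))
out = knn_attention(Q, K, V, k=3)
```

Masking all but the k nearest keys is one straightforward way to stop a query from aggregating weakly related (redundant) features, which is the problem the abstract attributes to plain global attention.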

Publication
Proceedings - International Conference on Image Processing (ICIP)
Alberto Presta, Former member
Marco Grangetto, Full Professor
Attilio Fiandrotti, Associate Professor