Entropy Coding

Entropy coding aims to represent data in as few bits as possible by matching code lengths to the probabilities of data symbols. Current research focuses on improving efficiency for diverse data types, including unordered structures (such as graphs), point clouds, and even already-compressed file formats. Common techniques include bits-back coding, attention mechanisms within octree models, and learned entropy models integrated with neural networks (e.g., transformers and convolutional autoencoders). These advances are crucial for reducing storage needs and communication bandwidth in applications ranging from large language models and video compression to federated learning and medical imaging.
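The core idea of matching code lengths to symbol probabilities can be illustrated with Huffman coding, a classical entropy code. The sketch below (an illustrative example, not tied to any specific paper above) builds Huffman code lengths from symbol frequencies and compares each assigned length to the information content -log2(p) of the symbol:

```python
import heapq
import math
from collections import Counter

def huffman_code_lengths(freqs):
    """Return a dict mapping each symbol to its Huffman code length in bits.

    freqs: dict of symbol -> count. A counter is used as a tiebreaker so
    heap comparisons never fall through to the (unorderable) dict payload.
    """
    heap = [(w, i, {sym: 0}) for i, (sym, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    if len(heap) == 1:
        # Degenerate single-symbol alphabet: assign a 1-bit code.
        (_, _, depths), = heap
        return {sym: 1 for sym in depths}
    counter = len(heap)
    while len(heap) > 1:
        # Merge the two lowest-weight subtrees; every symbol in them
        # moves one level deeper, i.e. its code grows by one bit.
        w1, _, d1 = heapq.heappop(heap)
        w2, _, d2 = heapq.heappop(heap)
        merged = {sym: depth + 1 for sym, depth in {**d1, **d2}.items()}
        heapq.heappush(heap, (w1 + w2, counter, merged))
        counter += 1
    return heap[0][2]

text = "abracadabra"
freqs = Counter(text)
lengths = huffman_code_lengths(freqs)
total = sum(freqs.values())
for sym in sorted(freqs):
    p = freqs[sym] / total
    print(f"{sym}: length={lengths[sym]}  -log2(p)={-math.log2(p):.2f}")
```

Frequent symbols (here `a`, with probability 5/11) receive short codes while rare ones (`c`, `d`) receive long codes, so the average code length approaches the entropy of the symbol distribution.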

Papers