Lossy Compression Coding Theorems for Arbitrary Sources

Presented by: Yiannis Kontoyiannis
Monday 23rd July 2018 - 11:00 to 11:45
INI Seminar Room 1
We give a development of the theory of lossy data compression from the point of view of statistics. This is partly motivated by the enormous success of the statistical approach in lossless compression. A precise characterization of the fundamental limits of compression performance is given, for arbitrary data sources and with respect to general distortion measures. The emphasis is on non-asymptotic results and results that hold with high probability (and not just on average). The starting point for this development is the observation that there is a precise correspondence between compression algorithms and probability distributions (in analogy with the Kraft inequality in lossless compression). This leads us to formulate a version of the celebrated Minimum Description Length (MDL) principle for lossy data compression. We discuss the consequences of the lossy MDL principle, and we explain how it can lead to practical lessons for vector quantizer design.
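As background for the lossless analogy invoked above, the following sketch (not from the talk itself; the distribution `Q` is an arbitrary illustrative example) shows the standard codes-to-distributions correspondence: any probability distribution Q induces prefix-code lengths ceil(-log2 Q(x)), and the Kraft inequality certifies that such a code exists.

```python
import math

# Illustrative example: the code-length / probability correspondence
# in lossless compression, via the Kraft inequality.
Q = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}  # assumed toy distribution

# Shannon code lengths induced by Q: len(x) = ceil(-log2 Q(x)).
lengths = {x: math.ceil(-math.log2(p)) for x, p in Q.items()}

# Kraft sum: sum over x of 2^{-len(x)}. A value <= 1 guarantees that
# a prefix code with exactly these lengths exists.
kraft_sum = sum(2.0 ** -l for l in lengths.values())
assert kraft_sum <= 1.0

print(lengths)    # {'a': 1, 'b': 2, 'c': 3, 'd': 3}
print(kraft_sum)  # 1.0
```

In the lossy setting discussed in the talk, the analogous correspondence is between codes operating at a given distortion level and probability distributions on the reproduction alphabet, which is what makes an MDL-style formulation possible.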