Conditional Data Compression: The Lossless Case

B. Carpentieri (Italy)


Keywords: Data compression; conditional entropy; learning.


Nowadays, state-of-the-art lossless data compression algorithms perform very close to their theoretical limit: the entropy of the transmitting source. At the same time, the demand for efficient lossless compression is growing rapidly with the huge diffusion of digital data. One option for improving compression is to exploit our knowledge of similar messages that have already been compressed from the same source, and to design algorithms that compress and decompress efficiently given this past knowledge: the theoretical limit then becomes the conditional entropy, which allows for better compression. In this paper we discuss this possibility and show preliminary, promising experimental results on the compression of large binary files and images.
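As a minimal sketch of the general idea (not the method proposed in the paper), a standard DEFLATE compressor can be conditioned on a previously seen message by supplying it as a preset dictionary: matches against the shared past shorten the encoding of the new message. The example data below is hypothetical.

```python
import zlib

# A message previously seen from the same source (hypothetical example data).
previous = b"conditional data compression exploits knowledge of previously compressed messages from the same source"
# A new, similar message to transmit now.
current = b"conditional data compression exploits knowledge of previously transmitted messages from the same source"

# Unconditional compression: the new message encoded on its own.
unconditional = zlib.compress(current, 9)

# Conditional compression: the previous message acts as a preset dictionary,
# so back-references into the shared past reduce the output size.
comp = zlib.compressobj(level=9, zdict=previous)
conditional = comp.compress(current) + comp.flush()

# The decompressor must hold the same past knowledge (the same dictionary).
decomp = zlib.decompressobj(zdict=previous)
restored = decomp.decompress(conditional) + decomp.flush()

print(len(unconditional), len(conditional))
```

For short messages that closely resemble the shared history, the conditioned output is markedly smaller than the unconditional one, mirroring the gap between entropy and conditional entropy.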
