Lossy Compression in Near-Linear Time Via Efficient Random Codebooks and Databases

Executive Summary

The compression-complexity trade-off of lossy compression algorithms based on a random codebook or a random database is examined. Motivated, in part, by recent results of Gupta, Verdu, and Weissman (GVW) and their underlying connections with the pattern-matching scheme of Kontoyiannis' lossy Lempel-Ziv algorithm, the authors introduce a non-universal version of the lossy Lempel-Ziv method, termed LLZ. The optimality of LLZ for memoryless sources is established, and its performance is compared with that of the GVW divide-and-conquer approach. Experimental results indicate that GVW often achieves better compression than LLZ, but at the cost of much higher memory requirements. To combine the advantages of both, the authors introduce a hybrid algorithm (HYB) that uses the divide-and-conquer idea of GVW together with the single-database structure of LLZ.
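To make the random-codebook idea concrete, the following is a minimal illustrative sketch (not the authors' LLZ, GVW, or HYB code) of rate-distortion coding for a memoryless binary source under Hamming distortion: the encoder and decoder share a randomly drawn codebook of roughly 2^(nR) length-n codewords, and the encoder transmits only the index of the codeword closest to the source block. All function names and parameter choices here are hypothetical.

```python
import random


def make_codebook(n, rate, seed=0):
    """Draw about 2^(n*rate) random length-n binary codewords.

    Both encoder and decoder regenerate the same codebook from the
    shared seed, so only the chosen index needs to be transmitted.
    """
    rng = random.Random(seed)
    size = max(1, int(2 ** (n * rate)))
    return [[rng.randint(0, 1) for _ in range(n)] for _ in range(size)]


def encode(block, codebook):
    """Return the index of the codeword minimizing Hamming distortion."""
    def distortion(codeword):
        return sum(a != b for a, b in zip(block, codeword))
    return min(range(len(codebook)), key=lambda i: distortion(codebook[i]))


def decode(index, codebook):
    """Reproduce the source block as the indexed codeword."""
    return codebook[index]


if __name__ == "__main__":
    n, rate = 16, 0.5  # block length and nominal rate in bits/symbol
    rng = random.Random(1)
    source = [rng.randint(0, 1) for _ in range(n)]
    cb = make_codebook(n, rate, seed=0)
    idx = encode(source, cb)
    reproduction = decode(idx, cb)
    d = sum(a != b for a, b in zip(source, reproduction)) / n
    print(f"index={idx}, per-symbol distortion={d:.3f}")
```

The exhaustive nearest-codeword search above is exactly the complexity bottleneck the paper addresses: the codebook grows exponentially in the block length, which motivates structured alternatives such as the single-database LLZ scheme and the GVW divide-and-conquer approach.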
