
A New Converse in Rate-Distortion Theory

Executive Summary

The fundamental problem in non-asymptotic rate-distortion theory is to estimate the minimum source coding rate achievable at any given block length k under an allowed distortion d. This paper presents new finite-blocklength converse bounds applicable to lossy source coding as well as to joint source-channel coding, which are tight enough not only to prove the strong converse but also to find the rate-dispersion functions in both setups. In order to state the converses, the authors introduce the d-tilted information, a random variable whose expectation and variance (with respect to the source) are equal to the rate-distortion and rate-dispersion functions, respectively.
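For readers who want the central quantity spelled out, the following sketch uses the conventions common in the finite-blocklength literature; the notation is a restatement, not a quotation from the paper. Let \(\mathsf{d}(x,y)\) denote the distortion measure, \(R(d)\) the rate-distortion function of the source \(X\), \(Y^\star\) the reproduction random variable under the output distribution achieving \(R(d)\), and \(\lambda^\star = -R'(d)\). The d-tilted information in a source realization \(x\) is then

\[
\jmath_X(x, d) \;=\; \log \frac{1}{\mathbb{E}\!\left[\exp\!\left(\lambda^\star d - \lambda^\star \mathsf{d}(x, Y^\star)\right)\right]},
\]

where the expectation is taken over \(Y^\star\) alone, independently of \(x\). Its first two moments recover the quantities named in the summary:

\[
\mathbb{E}\!\left[\jmath_X(X, d)\right] = R(d), \qquad \operatorname{Var}\!\left[\jmath_X(X, d)\right] = V(d),
\]

where \(V(d)\) denotes the rate-dispersion function.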

  • Format: PDF
  • Size: 245.17 KB