Joint Source-Channel Coding Revisited: Random-Coding Bounds and Error Exponents
The authors study the achievable error exponents in joint source-channel coding by deriving an upper bound on the average error probability using Gallager's techniques. The bound is based on a construction in which source messages are assigned to disjoint subsets (referred to as classes), and codewords are independently generated according to a distribution that depends on the class of the source message. Particularizing the bound to discrete memoryless systems, they show that two optimally chosen classes and product distributions are necessary and sufficient to attain the sphere-packing exponent in those cases where it is tight. Finally, they prove that the same results extend to lossy joint source-channel coding for sources and distortion measures that make the source reliability function convex.
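For context, the benchmark the abstract refers to can be sketched as follows. This is a minimal statement of Csiszár's classical bounds for a discrete memoryless source $Q$ over a discrete memoryless channel $W$, assuming one channel use per source symbol; the symbols $e(R)$, $E_{\mathrm{r}}(R)$, and $E_{\mathrm{sp}}(R)$ denote the source reliability function, the channel random-coding exponent, and the sphere-packing exponent, respectively.

```latex
% Source reliability function (lossless case): divergence cost of
% source types whose entropy exceeds the rate R.
e(R) \;=\; \min_{P \,:\, H(P) \ge R} D(P \,\|\, Q)

% Csiszár's bounds on the joint source-channel error exponent E_J:
% the random-coding lower bound and the sphere-packing upper bound.
\min_{R} \bigl[\, e(R) + E_{\mathrm{r}}(R) \,\bigr]
\;\le\; E_J
\;\le\; \min_{R} \bigl[\, e(R) + E_{\mathrm{sp}}(R) \,\bigr]
```

Whenever the minimizing rate falls in the region where $E_{\mathrm{r}}(R) = E_{\mathrm{sp}}(R)$, the two bounds coincide; the abstract's claim is that this upper bound is already attainable with a random-coding construction using only two message classes with product distributions.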