A Two-Stage Algorithm to Reduce Encoding Delay of Turbo Source Coding
Lossless turbo source coding employs an iterative encoding algorithm to search for the smallest codeword length that guarantees zero distortion. Although such an encoder achieves promising compression rates, running the iterative algorithm for each individual message block imposes a large delay on the system. To reduce this delay, the authors propose a two-stage encoding algorithm for turbo source coding. They show that convergence to zero distortion within a given number of iterations can be predicted from the early behavior of the distortion function. This enables a quick yet sufficiently accurate estimate of the codeword length in the first encoding stage.
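The two-stage idea can be illustrated with a toy sketch: instead of running the full iterative decoder to convergence for every candidate codeword length, the first stage probes only a few iterations and extrapolates the distortion trend to predict which length will reach zero distortion; the second stage then verifies that prediction with full iterations. The distortion model, parameter values, and function names below are hypothetical stand-ins, not the paper's actual encoder.

```python
# Hypothetical illustration of the two-stage encoding idea: the distortion
# model below is a toy stand-in, not the actual turbo source code behavior.

MAX_ITERS = 20    # iteration budget for a full convergence check (assumed)


def distortion(length, iteration):
    """Toy distortion: decays linearly with iterations, faster for longer
    codewords, reaching exactly zero once iteration >= threshold(length)."""
    threshold = max(1, 40 - 2 * length)  # longer codeword -> fewer iterations
    return max(0.0, 1.0 - iteration / threshold)


def converges(length):
    """Full (slow) check: does distortion hit zero within the budget?"""
    return any(distortion(length, it) == 0.0
               for it in range(1, MAX_ITERS + 1))


def stage1_estimate(lengths, probe_iters=3):
    """Stage 1: run only a few iterations per candidate length and
    extrapolate the early distortion decay to predict convergence."""
    for length in lengths:
        d_first = distortion(length, 1)
        d_probe = distortion(length, probe_iters)
        slope = (d_first - d_probe) / (probe_iters - 1)  # early decay rate
        # Predicted to reach zero within the remaining iteration budget?
        # (small epsilon guards against floating-point round-off)
        if slope > 0 and d_probe <= slope * (MAX_ITERS - probe_iters) + 1e-9:
            return length
    return lengths[-1]


def two_stage_encode(lengths):
    """Stage 1 gives a quick length estimate; stage 2 verifies it with
    full iterations, refining upward only if the prediction fails."""
    estimate = stage1_estimate(lengths)
    for length in lengths:
        if length >= estimate and converges(length):
            return length
    return lengths[-1]
```

Under this toy model, the first stage spends only three iterations per candidate instead of the full budget, which is the source of the delay reduction the abstract describes; the second stage guards against a mispredicted (too short) codeword.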