Derivative of Mutual Information at Zero SNR: The Gaussian-Noise Case


Executive Summary

The asymptotics of the input-output mutual information of a class of channels with weak input have been investigated in the past. This paper studies the mutual information between a random variable and its observation in Gaussian noise at low signal-to-noise ratio (SNR). Assuming additive Gaussian noise, a general sufficient condition on the input distribution is established which guarantees that the ratio of mutual information to SNR converges to one half nat as the SNR vanishes. The result allows SNR-dependent input distributions and side information.
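For intuition, the limit can be checked in the one case with a closed form: a standard Gaussian input X over the channel Y = √SNR·X + N with N ~ N(0,1), where the mutual information is exactly (1/2)·ln(1 + SNR) nats. A minimal sketch (the function name `mi_gaussian` is illustrative, not from the paper):

```python
import math

def mi_gaussian(snr: float) -> float:
    """Mutual information (nats) of Y = sqrt(snr)*X + N, with X, N ~ N(0,1)."""
    return 0.5 * math.log(1.0 + snr)

# The ratio I(SNR)/SNR approaches 1/2 nat as SNR -> 0.
for snr in (1.0, 0.1, 0.01, 0.0001):
    print(f"SNR={snr:g}  I/SNR={mi_gaussian(snr) / snr:.6f}")
```

Since ln(1 + x)/x → 1 as x → 0, the ratio tends to 1/2, matching the general result; the paper's contribution is a sufficient condition under which the same limit holds for non-Gaussian inputs.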

  • Format: PDF
  • Size: 138.74 KB