Functional Properties of Minimum Mean-Square Error and Mutual Information
The authors show that the Minimum Mean-Square Error (MMSE) is a concave functional of the input-output joint distribution, in addition to establishing several of its regularity properties. In the case of additive Gaussian noise, the MMSE is shown to be weakly continuous in the input distribution and Lipschitz continuous with respect to the quadratic Wasserstein distance for peak-limited inputs. Regularity properties of mutual information are also obtained. Several applications to information theory and the central limit theorem are discussed.
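To make the concavity claim concrete: the MMSE of a joint law, MMSE(P_{XY}) = inf_f E[(X - f(Y))^2], is an infimum of functionals that are linear in P_{XY}, and an infimum of linear functionals is concave. A minimal numerical sketch (the discretization and function names are illustrative, not taken from the paper) verifies this on finite-alphabet joint pmfs, where the optimal estimator is the conditional mean:

```python
import numpy as np

def mmse(p, xs):
    """Exact MMSE for a finite joint pmf p[i, j] = P(X = xs[i], Y = y_j).

    The optimal estimator is the conditional mean E[X | Y], so
    MMSE(P) = sum_{x, y} P(x, y) * (x - E[X | Y = y])**2.
    """
    py = p.sum(axis=0)                    # marginal pmf of Y
    num = (xs[:, None] * p).sum(axis=0)   # sum_x x * P(x, y) for each y
    cond_mean = np.divide(num, py, out=np.zeros_like(py), where=py > 0)
    return float((((xs[:, None] - cond_mean) ** 2) * p).sum())

# Two arbitrary joint pmfs on a 3 x 4 alphabet.
rng = np.random.default_rng(0)
xs = np.array([-1.0, 0.0, 1.0])
p = rng.random((3, 4)); p /= p.sum()
q = rng.random((3, 4)); q /= q.sum()

# Concavity: MMSE(lam*P + (1-lam)*Q) >= lam*MMSE(P) + (1-lam)*MMSE(Q),
# since MMSE is a minimum over estimators of a functional linear in the law.
for lam in np.linspace(0.0, 1.0, 11):
    mix = lam * p + (1 - lam) * q
    assert mmse(mix, xs) >= lam * mmse(p, xs) + (1 - lam) * mmse(q, xs) - 1e-12
```

The inequality holds with ">=" (not "<="): mixing two joint laws can only make estimation harder than the average of the two separate estimation problems, because the mixture is estimated with a single estimator rather than one tailored to each component.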