Optimum Discrete Signaling Over Channels With Arbitrary Noise Distribution
General channels with an arbitrary noise distribution and a finite set of signaling points are considered in this paper. The authors seek the capacity-achieving input distribution. As a structural result, they first demonstrate that the mutual information is a concave function of the input distribution and a convex function of the channel transfer densities. Using Karush-Kuhn-Tucker theory, capacity-achieving distributions are then characterized by a constant Kullback-Leibler divergence between each channel transfer density and the mixture of these densities weighted by the input probabilities.
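The characterization above can be illustrated numerically in the discrete analogue of the problem. The sketch below (an assumption for illustration, not the paper's method) computes the capacity of a discrete memoryless channel with the classical Blahut-Arimoto iteration and then checks the Karush-Kuhn-Tucker condition: at the optimum, the Kullback-Leibler divergence between each used channel row and the output mixture equals the capacity.

```python
import numpy as np

def kl_rows(P, q):
    """D(P_x || q) in bits for every row P_x of the transition matrix P."""
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = np.where(P > 0, P * np.log2(P / q), 0.0)
    return terms.sum(axis=1)

def blahut_arimoto(P, tol=1e-10, max_iter=100_000):
    """Capacity (bits/use) and optimal input distribution of a DMC.

    P[x, y] is the probability of output y given input x.
    """
    p = np.full(P.shape[0], 1.0 / P.shape[0])  # start from the uniform input
    for _ in range(max_iter):
        d = kl_rows(P, p @ P)          # divergence of each row from the mixture
        if d.max() - p @ d < tol:      # KKT gap: max divergence = average at optimum
            break
        p = p * np.exp2(d)             # multiplicative Blahut-Arimoto update
        p /= p.sum()
    return p @ d, p

# Binary symmetric channel with crossover 0.1: C = 1 - H2(0.1) ≈ 0.531 bits
P = np.array([[0.9, 0.1],
              [0.1, 0.9]])
C, p_opt = blahut_arimoto(P)
d = kl_rows(P, p_opt @ P)  # each entry of d matches C, as the KKT condition predicts
```

For a symmetric channel the uniform input is optimal and both divergences coincide immediately; for an asymmetric channel (e.g. a Z-channel) the iteration tilts the input probabilities until the divergences of all used inputs equalize at the capacity value.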