Saddle Point in the Minimax Converse for Channel Coding
A minimax meta-converse has recently been proposed as a simultaneous generalization of a number of classical results and as a tool for non-asymptotic analysis. In this paper it is shown that the order of optimizing the input and output distributions can be interchanged without affecting the bound. In the course of the proof, a number of auxiliary results of independent interest are obtained. In particular, it is shown that the optimization problem is convex and can in many cases be solved by symmetry considerations. As a consequence, it is demonstrated that in the latter cases the (multi-letter) input distribution in the information-spectrum (Verdú-Han) converse bound can be taken to be a (memoryless) product of single-letter distributions.
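As a sketch of the saddle-point claim, the interchange can be stated in the standard meta-converse notation (the specific symbols below are assumptions, not taken from the abstract): writing $\beta_\alpha(P,Q)$ for the minimum type-II error of a binary hypothesis test between $P$ and $Q$ whose type-I error does not exceed $1-\alpha$, the asserted identity reads

```latex
% Sketch of the saddle-point statement (notation assumed):
% P_X ranges over input distributions, Q_Y over output distributions,
% P_{Y|X} is the channel, and \beta_{1-\epsilon} is the optimal
% binary hypothesis-testing performance at type-I error \epsilon.
\[
  \max_{P_X}\,\min_{Q_Y}\,
    \frac{1}{\beta_{1-\epsilon}\!\left(P_X P_{Y|X},\; P_X \times Q_Y\right)}
  \;=\;
  \min_{Q_Y}\,\max_{P_X}\,
    \frac{1}{\beta_{1-\epsilon}\!\left(P_X P_{Y|X},\; P_X \times Q_Y\right)} .
\]
```

Either side then upper-bounds the maximum code size $M^*(\epsilon)$; the equality of the two orders is what makes the outer optimization over output distributions tractable in symmetric cases.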