Differential privacy is a formal property that characterizes the privacy guarantees of computations over data sets. It is typically formulated in a query-response setting and achieved computationally through output perturbation. Several noise-addition methods that implement such output perturbation have been proposed in the literature. The authors focus on data-independent noise, that is, noise whose distribution is the same for every data set. Their goal is to find the optimal data-independent noise distribution for achieving differential privacy. They propose a general optimality criterion based on the concentration of the noise distribution's probability mass around zero, and they show that any noise distribution optimal under this criterion is necessarily optimal under any other sensible criterion.
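The query-response setting with data-independent output perturbation can be illustrated with the classic Laplace mechanism, which is the standard (though not necessarily the optimal) noise choice the paper's analysis generalizes. The sketch below is illustrative only; the function names `laplace_noise` and `dp_count` and the example data are assumptions, not from the paper.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    # Sample from Laplace(0, scale) via the inverse-CDF transform:
    # draw u ~ Uniform(-0.5, 0.5), return -scale * sign(u) * ln(1 - 2|u|).
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def dp_count(data, predicate, epsilon: float) -> float:
    # A counting query has sensitivity 1 (adding or removing one record
    # changes the count by at most 1), so Laplace noise with scale
    # 1/epsilon yields an epsilon-differentially-private response.
    # The noise distribution does not depend on the data set itself,
    # which is the data-independent setting studied by the authors.
    true_count = sum(1 for x in data if predicate(x))
    return true_count + laplace_noise(1.0 / epsilon)

# Hypothetical query: how many records have age >= 40?
ages = [23, 35, 47, 52, 61, 29, 44]
noisy = dp_count(ages, lambda a: a >= 40, epsilon=1.0)
print(noisy)  # true count is 4, perturbed by Laplace(0, 1) noise
```

Smaller values of `epsilon` give stronger privacy but require a larger noise scale, which is exactly the accuracy trade-off that motivates searching for noise distributions whose probability mass is maximally concentrated around zero.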