Journal of Machine Learning Research (JMLR)
Differential privacy is a rigorous, cryptographically motivated characterization of data privacy that may be applied when releasing summaries of a database. Previous work has focused mainly on methods whose output is a finite-dimensional vector or an element of some discrete set. The authors develop methods for releasing functions while preserving differential privacy. Specifically, they show that adding an appropriately scaled Gaussian process to the function of interest yields differential privacy. When the functions lie in the Reproducing Kernel Hilbert Space (RKHS) generated by the covariance kernel of the Gaussian process, the correct noise level is established by measuring the "sensitivity" of the function in the RKHS norm.
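The mechanism described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' implementation: the function is evaluated on a finite grid, a Gaussian process with an RBF covariance kernel is sampled on that grid, and the noise is scaled by the (assumed known) RKHS-norm sensitivity via a Gaussian-mechanism-style calibration. The function name, the specific kernel, and the constant `sqrt(2 * log(2/delta))` are illustrative assumptions.

```python
import numpy as np

def release_private_function(f, grid, sensitivity, epsilon, delta,
                             kernel_scale=0.1, rng=None):
    """Sketch of (epsilon, delta)-differentially private function release.

    `sensitivity` is assumed to bound the RKHS-norm distance between the
    functions produced from neighboring databases (hypothetical input here).
    """
    rng = np.random.default_rng() if rng is None else rng
    # RBF covariance kernel K(s, t) = exp(-(s - t)^2 / (2 * kernel_scale^2)),
    # evaluated on the release grid; small jitter keeps it numerically PSD.
    d = grid[:, None] - grid[None, :]
    K = np.exp(-d**2 / (2 * kernel_scale**2)) + 1e-10 * np.eye(len(grid))
    # Gaussian-mechanism-style noise calibration (illustrative constant):
    # scale proportional to sensitivity * sqrt(2 * log(2/delta)) / epsilon.
    scale = sensitivity * np.sqrt(2 * np.log(2 / delta)) / epsilon
    # Draw one Gaussian process sample path on the grid and add it to f.
    noise = scale * rng.multivariate_normal(np.zeros(len(grid)), K)
    return f(grid) + noise
```

In practice the released object is the noisy function itself; evaluating it on a grid, as done here, is just a convenient finite-dimensional view of the sample path.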