Privacy Against Statistical Inference

Executive Summary

The authors propose a general statistical inference framework to capture the privacy threat incurred by a user who releases data to a passive but curious adversary, given utility constraints. They show that applying this framework to the setting where the adversary uses the self-information cost function naturally leads to a non-asymptotic information-theoretic approach for characterizing the best achievable privacy subject to utility constraints. Based on these results, they introduce two privacy metrics: average information leakage and maximum information leakage. They prove that, under both metrics, the design problem of finding the optimal mapping from the user's data to a privacy-preserving output can be cast as a modified rate-distortion problem which, in turn, can be formulated as a convex program.
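
To make the modified rate-distortion formulation concrete, here is a minimal numerical sketch (not the authors' code) of the average-leakage variant: it minimizes the mutual information I(X;Y) between the user's data X and the released output Y over the privacy mapping p(y|x), subject to an expected-distortion utility constraint, using the cvxpy modeling library. The alphabet size, source distribution, Hamming distortion measure, and distortion budget are illustrative assumptions, not values from the paper.

```python
import numpy as np
import cvxpy as cp

# Illustrative problem data (assumed for this sketch, not taken from the paper):
# X is the user's data over a 4-letter alphabet, Y is the released output.
px = np.array([0.4, 0.3, 0.2, 0.1])   # source distribution p(x)
nX = nY = len(px)
dist = 1.0 - np.eye(nX)               # Hamming distortion d(x, y) as the utility measure
D = 0.2                               # utility constraint: E[d(X, Y)] <= D

# Decision variable: the privacy-preserving mapping p(y|x), one row per source symbol.
Pyx = cp.Variable((nX, nY), nonneg=True)

# Joint distribution q(x, y) = p(x) p(y|x) and output marginal p(y).
q = np.diag(px) @ Pyx
py = cp.sum(q, axis=0)

# Average information leakage I(X; Y) = sum_{x,y} q(x,y) log( q(x,y) / (p(x) p(y)) ).
# rel_entr(a, b) = a*log(a/b) is jointly convex, so the objective is convex in p(y|x).
leakage = cp.sum(cp.rel_entr(q, px.reshape(nX, 1) @ cp.reshape(py, (1, nY))))

constraints = [
    cp.sum(Pyx, axis=1) == 1,              # each row of p(y|x) is a probability distribution
    cp.sum(cp.multiply(q, dist)) <= D,     # expected distortion stays within the utility budget
]

prob = cp.Problem(cp.Minimize(leakage), constraints)
prob.solve()

print(f"minimum average information leakage: {prob.value:.4f} nats")
print("optimal privacy mapping p(y|x):")
print(np.round(Pyx.value, 3))
```

As the summary notes, the program is convex: mutual information is a convex function of the mapping p(y|x) for a fixed source distribution, and the utility constraint is linear, so an off-the-shelf conic solver recovers the optimal privacy-preserving mapping directly.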

  • Format: PDF
  • Size: 140.17 KB