Differential Privacy versus Quantitative Information Flow

Abstract: Differential privacy is a notion of privacy that has become very popular in the database community. Roughly, the idea is that a randomized query mechanism provides sufficient privacy protection if the ratio between the probabilities that two different entries originate a given answer is bounded by e^ε. In the fields of anonymity and information flow there is a similar concern for controlling information leakage, i.e. limiting the possibility of inferring the secret information from the observables. In recent years, researchers have proposed to quantify the leakage in terms of the information-theoretic notion of mutual information. Two main approaches fall into this category: one based on Shannon entropy, and one based on Rényi's min-entropy. The latter is connected to the so-called Bayes risk, which expresses the probability of guessing the secret. In this paper, we show how to model the query system as an information-theoretic channel, and we compare the notion of differential privacy with that of mutual information. We show that differential privacy is strictly stronger, in the sense that it implies a bound on the mutual information, but not vice versa.
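The two notions compared in the abstract can be illustrated concretely. The sketch below (not taken from the paper; the randomized-response mechanism and all names are illustrative assumptions) models a binary mechanism as a channel matrix, checks the ε-differential-privacy ratio bound, and computes the min-entropy leakage for a uniform prior:

```python
import math

# Hypothetical example: randomized response over secret entries {0, 1}.
# The mechanism reports the true value with probability e^eps / (1 + e^eps).
eps = math.log(3)
p = math.exp(eps) / (1 + math.exp(eps))  # = 0.75 for eps = ln 3

# Channel matrix C[x][y] = P(answer = y | entry = x)
C = [[p, 1 - p],
     [1 - p, p]]

# eps-differential privacy: for every answer y and every pair of entries
# x, x', the ratio C[x][y] / C[x'][y] must be at most e^eps.
max_ratio = max(C[x][y] / C[x2][y]
                for y in range(2) for x in range(2) for x2 in range(2))
assert max_ratio <= math.exp(eps) + 1e-12  # here the bound is tight: 3.0

# Min-entropy leakage under a uniform prior: log2 of the sum of the
# column maxima, i.e. how much the answer improves a one-try guess.
leakage = math.log2(sum(max(C[0][y], C[1][y]) for y in range(2)))
print(max_ratio, leakage)  # 3.0 and log2(1.5) ≈ 0.585 bits
```

For this channel the privacy bound is attained exactly, while the leakage stays well below the 1 bit of secret, consistent with the paper's thesis that differential privacy implies a bound on the information-theoretic leakage.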


Contributor: Catuscia Palamidessi
Submitted on: Monday, December 20, 2010
Last modification on: Wednesday, March 27, 2019
Long-term archiving on: Monday, March 21, 2011

  • HAL Id: hal-00548214, version 1
  • arXiv: 1012.4250



Mário Alvim, Konstantinos Chatzikokolakis, Pierpaolo Degano, Catuscia Palamidessi. Differential Privacy versus Quantitative Information Flow. 2010. ⟨hal-00548214⟩
