
Why do differential-privacy people care whether or not the noise function saturates the lower bound of Shannon entropy?

For example: the Laplace distribution, which is commonly used as the noise distribution, happens to saturate the lower bound on Shannon entropy under $\epsilon$-differential-privacy constraints. See Theorem 8 in Yu Wang, Zhenqi Huang, Sayan Mitra and Geir E. Dullerud, "Entropy-minimizing Mechanism for Differential Privacy of Discrete-time Linear Feedback Systems" for an example.

But what is the significance of this saturation?

user6818

1 Answer


In general, if you're not sure about the significance of a result in a published paper, a good first place to look is the paper's introduction and/or conclusion: that is usually where the authors explain why you should care about their results.

In this case, the following two quotes from the introduction of the paper you cited provide a pretty clear explanation for why the authors care:

we first study an $\epsilon$-differentially private noise-adding mechanism for one-shot queries that provides the best output accuracy, which is measured by the Shannon entropy. [...]

In Section III, we prove that, for a one-shot $n$-dimensional input, there is a lower bound $n + n \ln(2\epsilon)$ on the entropy of the output for an $\epsilon$-differentially private noise-adding mechanism, and the lower bound is achieved by Laplacian noise with parameter $\epsilon$.

In other words, the Shannon entropy of the noise measures the accuracy of the output after the noise is added: the lower the entropy, the less uncertainty the mechanism injects, so lower is better. Theorem 8 proves that, in that particular setting, Laplace noise achieves the lowest possible Shannon entropy among all $\epsilon$-differentially private noise-adding mechanisms; therefore Laplace noise is optimal in the sense that it gives the best possible output accuracy in that setting.
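
As a quick sanity check on why Laplace noise shows up here (a sketch only; the paper's "Laplacian noise with parameter $\epsilon$" may use a different parameterization than the scale convention below): for a sensitivity-1 query, the standard Laplace mechanism adds noise with scale $b = 1/\epsilon$, and the differential entropy of a Laplace($b$) distribution has the closed form $1 + \ln(2b)$. The snippet compares that closed form against scipy's numerical value; $\epsilon = 0.5$ is just an illustrative choice.

```python
# Sketch: differential entropy of the noise added by the standard
# eps-DP Laplace mechanism (sensitivity assumed to be 1, so scale b = 1/eps).
# Lower entropy = less injected uncertainty = better output accuracy.
import numpy as np
from scipy.stats import laplace

eps = 0.5              # privacy parameter (illustrative value, not from the paper)
b = 1.0 / eps          # Laplace scale for a sensitivity-1 query

closed_form = 1.0 + np.log(2.0 * b)   # h(Lap(b)) = 1 + ln(2b), in nats
numeric = laplace(scale=b).entropy()  # scipy's differential entropy, in nats

print(f"closed form: {closed_form:.6f} nats")
print(f"scipy      : {float(numeric):.6f} nats")
```

For an n-dimensional output with independent per-coordinate noise, the per-coordinate entropies add, which is where the n-dependent lower bound in the quoted passage comes from (its exact form depends on how the paper parameterizes the Laplace density).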

D.W.