
The title says it all. Any proof or any kind of reference addressing the following is welcome:

Can we get more randomness out of a deterministic random bit generator (DRBG) than the entropy that we feed into it?

R.Ali

1 Answer


Define the mutual information of a pair of random variables as $$I(X; Y) = H(X) - H(X\mid Y).$$ For discrete random variables we have that $H(X\mid X) = 0$, so: $$I(X; X) = H(X)$$
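
As a quick sanity check, here is a small Python sketch that computes $I(X; X)$ directly from the diagonal joint distribution of a 4-outcome distribution (the distribution is arbitrary, chosen only for illustration) and confirms it equals $H(X)$:

```python
import math

def entropy(p):
    """Shannon entropy in bits of a dict mapping outcome -> probability."""
    return -sum(q * math.log2(q) for q in p.values() if q > 0)

def mutual_information(joint):
    """I(X; Y) in bits from a dict mapping (x, y) -> probability."""
    px, py = {}, {}
    for (x, y), q in joint.items():
        px[x] = px.get(x, 0) + q
        py[y] = py.get(y, 0) + q
    return sum(q * math.log2(q / (px[x] * py[y]))
               for (x, y), q in joint.items() if q > 0)

# A hypothetical 4-outcome distribution, chosen only for illustration.
p = {0: 0.5, 1: 0.25, 2: 0.125, 3: 0.125}

# The joint distribution of (X, X): all probability mass sits on the diagonal,
# since knowing X leaves no uncertainty about X.
joint_xx = {(x, x): q for x, q in p.items()}

print(entropy(p))                    # 1.75
print(mutual_information(joint_xx))  # 1.75, i.e. I(X; X) = H(X)
```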

The Data-Processing Inequality states that for any function $f$, we have that: $$I(f(X); f(Y)) \leq I(X; Y)$$ While we won't need it here, this includes randomized functions, provided they use randomness independent of either $X$ or $Y$. Combining these two, we get that:
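
The inequality is easy to check numerically. The sketch below uses a made-up joint distribution over $\{0,1,2\}^2$ and the deterministic map $f(v) = v \bmod 2$, both chosen only as illustrative examples:

```python
import math

def mutual_information(joint):
    """I(X; Y) in bits from a dict mapping (x, y) -> probability."""
    px, py = {}, {}
    for (x, y), q in joint.items():
        px[x] = px.get(x, 0) + q
        py[y] = py.get(y, 0) + q
    return sum(q * math.log2(q / (px[x] * py[y]))
               for (x, y), q in joint.items() if q > 0)

# A hypothetical correlated joint distribution over {0,1,2}^2.
joint_xy = {(0, 0): 0.3, (0, 1): 0.1, (1, 1): 0.2, (1, 2): 0.1, (2, 2): 0.3}

# Push the joint distribution through the deterministic map f on both coordinates.
f = lambda v: v % 2
joint_ff = {}
for (x, y), q in joint_xy.items():
    key = (f(x), f(y))
    joint_ff[key] = joint_ff.get(key, 0) + q

print(mutual_information(joint_xy))  # ~0.97 bits: I(X; Y)
print(mutual_information(joint_ff))  # ~0.19 bits: I(f(X); f(Y)) <= I(X; Y)
```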

$$H(f(X)) - H(f(X)\mid f(X)) \leq H(X) - H(X\mid X)$$ Since $f$ is deterministic and the variables are discrete, both conditional entropies are $0$, giving us: $$H(f(X)) \leq H(X)$$ So the output distribution of a DRBG has at most the entropy of the distribution that its seed is drawn from.
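
To see the conclusion concretely, here is a toy sketch: the 16-bit seed space and the "DRBG" built from truncated SHA-256 are my own illustrative choices, not any standardized construction. It enumerates every seed and compares the entropy of the seed distribution to that of the output distribution:

```python
import hashlib
import math

def entropy(p):
    """Shannon entropy in bits of a dict mapping outcome -> probability."""
    return -sum(q * math.log2(q) for q in p.values() if q > 0)

# Hypothetical toy DRBG: hash a 16-bit seed, keep one output byte.
def f(seed):
    return hashlib.sha256(seed.to_bytes(2, "big")).digest()[0]

n = 2 ** 16
seed_dist = {s: 1 / n for s in range(n)}  # uniform seed, so H(X) = 16 bits

# Exhaustively compute the output distribution f(X).
out_dist = {}
for s in range(n):
    y = f(s)
    out_dist[y] = out_dist.get(y, 0) + 1 / n

print(entropy(seed_dist))  # 16.0
print(entropy(out_dist))   # ~8 bits here, and provably never above 16
```

Truncating to one byte caps the output near 8 bits in this toy example, but the point of the bound is stronger: no deterministic post-processing, however long its output, can push the entropy above the 16 bits of the seed.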

Mark Schultz-Wu