I have a set of random numbers and I want to calculate their entropy. I searched for entropy calculation formulas, but I didn't understand them. Can you elaborate a little?
1 Answer
In a cryptographic sense this is not really possible. The entropy of the numbers is determined by the way they have been chosen. From only a list of numbers, say $(1, 2, 3, 4)$, we cannot just determine the entropy.
But if we instead say that we chose the four numbers uniformly and independently from 1 to 10, we can calculate the entropy. Recall the definition of (Shannon) entropy:
$$ H(X) = -\sum_{i=1}^n {\mathrm{P}(x_i) \log_b \mathrm{P}(x_i)} $$
Here $X$ is the random variable whose outcome is the state (e.g. $(1, 2, 3, 4)$). The $x_i$ values are the possible states, e.g. $\{(1, 1, 1, 1), (1, 1, 1, 2), \ldots, (10, 10, 10, 10)\}$, and $\mathrm{P}(x_i)$ is the probability that $X = x_i$. When the probability distribution is uniform, the probabilities are all equal to $\frac{1}{n}$, where $n$ is the number of possible states (here $10^4$).
Notice that this is a calculation not over the state itself, but over the probability distribution of all the possible states. In other words: the entropy is determined not by what the numbers are, but by how they were chosen.
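As a small sketch of the calculation above (assuming the four-numbers-from-1-to-10 example, with $10^4$ equally likely states):

```python
import math

# Four numbers, each chosen uniformly and independently from 1..10,
# so there are 10**4 equally likely states.
n_states = 10 ** 4
p = 1 / n_states  # uniform distribution: every state has the same probability

# Shannon entropy in bits (base b = 2): H = -sum over all states of P(x_i) * log2(P(x_i))
entropy = -sum(p * math.log2(p) for _ in range(n_states))

# For a uniform distribution the sum collapses to log2(n_states).
assert math.isclose(entropy, math.log2(n_states))
print(entropy)  # about 13.29 bits
```

For a uniform distribution the sum always reduces to $\log_2 n$, which is why "how many equally likely choices were possible" is the usual shortcut.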
[More information on how entropy is actually calculated.]
It is possible to estimate (not calculate) the entropy of a series of data, but this is more relevant in the field of data processing than in cryptography.
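To illustrate what such an estimate looks like, here is a minimal sketch of the plug-in (empirical-frequency) estimator. Note the caveat from above still applies: it estimates entropy from observed frequencies only, and says nothing about cryptographic unpredictability.

```python
import math
from collections import Counter

def estimated_entropy(samples):
    """Plug-in estimate of Shannon entropy in bits.

    Uses observed frequencies as stand-ins for the true probabilities;
    it is biased low for small samples and cannot detect structure
    (e.g. the sequence 1, 2, 3, 4 looks maximally random to it).
    """
    counts = Counter(samples)
    total = len(samples)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

print(estimated_entropy([1, 2, 3, 4]))  # each value once: log2(4) = 2.0 bits
print(estimated_entropy([7, 7, 7, 7]))  # constant data: 0.0 bits
```

This is exactly why the answer says "estimate, not calculate": the estimator sees only the sample, not the process that produced it.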