We discuss a lot of topics and use measures of entropy to determine how difficult it is for an attacker to be successful. What does entropy mean in the context of cryptography? How is entropy calculated in the general case?
2 Answers
In information theory, entropy is the measure of uncertainty associated with a random variable. In terms of cryptography, the cipher must supply entropy and inject it into the plaintext of a message so as to neutralise the structure present in the insecure plaintext message. How it is measured depends on the cipher.
Have a look at the following article; it provides a survey of entropy measures and their applications in cryptography: "Entropy Measures and Unconditional Security in Cryptography" by Christian Cachin (1997).
To give you a short answer to your question: The most common notion of entropy is Shannon entropy. The information content $H_x$ of a value $x$ that occurs with probability $\Pr[x]$ is $$H_x = -\log_2(\Pr[x]) \text.$$ The entropy of a random source is the expected information content of the symbol it outputs, that is $$H(X) = E[H_x] = \sum_x \Pr[x]H_x = -\sum_x \Pr[x]\log_2(\Pr[x]) \text.$$
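As a quick illustration (a minimal sketch, not part of the original answer; the function name `shannon_entropy` is just an illustrative choice), the formula above translates directly into a few lines of Python:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits of a discrete distribution, given as a list of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))       # fair coin: 1.0 bit
print(shannon_entropy([0.9, 0.1]))       # biased coin: ~0.47 bits
print(shannon_entropy([1 / 256] * 256))  # uniform byte: 8.0 bits
```

Note how the uniform distribution maximises entropy for a given number of outcomes, which is why a "uniformly random" byte carries its full 8 bits of uncertainty while a biased source carries less.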
You can interpret this as the expected uncertainty about a symbol $x$ when you know only the distribution according to which $x$ is chosen. For bit strings the important message is that the entropy does not always equal the bit length of $x$. For example, if you want to seed a pseudo-random generator, it is important to choose a seed with high entropy. If you use the current time as the seed, the entropy of the seed is very low, as everybody can easily guess parts of it (year, day, perhaps even hour and minute). Ask Netscape, they should have learned this…
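To make the seeding point concrete, here is a rough sketch (the one-hour guessing window is a hypothetical assumption, not a claim about any particular attack) comparing the entropy of a time-based seed with a seed drawn from the operating system's CSPRNG:

```python
import math
import secrets

# If an attacker knows the seed is a Unix timestamp within a known one-hour
# window, there are only 3600 equally likely possibilities:
time_seed_entropy = math.log2(3600)
print(f"time-based seed: ~{time_seed_entropy:.1f} bits of entropy")  # ~11.8 bits

# A seed taken from the OS CSPRNG has (close to) its full bit length as entropy:
seed = secrets.token_bytes(16)  # 16 random bytes = 128 bits
print(f"OS-provided seed: {8 * len(seed)} bits of entropy")
```

Roughly 12 bits of uncertainty can be exhausted by brute force in a fraction of a second, which is exactly the kind of weakness the Netscape anecdote refers to.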
There are several other notions of entropy, all discussed in the survey cited in the other answer, but I think this is the most important one if you don't want to study complexity or information theory.