
This is a very basic, and probably wrong, calculation of some parameters related to my gym locker (I am no mathematician, just a programmer).

I was typing the password into my locker's keypad, which has 10 keys (0 to 9), but because some of them didn't respond, I just typed XXYY. This got me thinking about calculating some properties of this particular system.

We could represent each key press in this system as 10 flag bits:

0000000001 --> flags (flag for 0 is on)
0000000100 --> flags (flag for 2 is on)
9876543210 --> reference (which key each bit position stands for)
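As a sketch in Python (the helper name `key_flags` is mine, purely for illustration):

    def key_flags(key: int) -> int:
        """Return the 10-bit flag word with the bit for key (0-9) set."""
        return 1 << key

    print(format(key_flags(0), "010b"))  # 0000000001  (flag for 0 is on)
    print(format(key_flags(2), "010b"))  # 0000000100  (flag for 2 is on)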

The number of possible passwords would be $10^4$, and at 10 flag bits each that would be $10^4 \times 10\,\mathrm{b} = 100\,\mathrm{kb}$ in total.

If someone knew that I typed XXYY, then this would be $10^2 \times 10\,\mathrm{b} = 1\,\mathrm{kb}$.
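Counting both cases directly (a sketch; the 10 b per password is just the flag-bit size from above):

    length = 4                  # password length in digits
    n_all = 10 ** length        # any of 10 digits per position
    n_xxyy = 10 ** 2            # XXYY: only X and Y are free to vary

    print(n_all, n_all * 10)    # 10000 passwords, 100000 b = 100 kb
    print(n_xxyy, n_xxyy * 10)  # 100 passwords, 1000 b = 1 kb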

  • But how do you count this information, i.e. the fact that it was 4 free digits at first and then only 2? Do measurements of information have 'a starting point'?

So for simple systems we can easily calculate the number of possible values and the amount of information needed to break them.

My questions are:

  • Is this "reasoning", or beginner idea, correct for estimating information? If not, how would you fix it for this simple system?
  • Can we calculate the entropy from those numbers?
  • How important is it to do these calculations in binary?

1 Answer


The information (or Shannon) entropy of a system with $n$ outcomes, each occurring with probability $p_i$, is $$ H=-\sum_{i=1}^np_i\log_b(p_i) $$ where $b$ is the base of the logarithm. The choice of base is not important; it just determines the units of $H$. Commonly one takes $b=2$, which gives $H$ in bits.
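As a sketch, the formula translates directly into Python (base $2$ for bits; `shannon_entropy` is a name of my choosing):

    import math

    def shannon_entropy(probs, base=2):
        """H = -sum(p_i * log_b(p_i)); zero-probability outcomes contribute 0."""
        return -sum(p * math.log(p, base) for p in probs if p > 0)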

Looking at a system with two outcomes, say $0$ and $1$, where the probability of $1$ is $p$, gives $$ H=-p\log_2(p)-(1-p)\log_2(1-p)\,. $$ This has a maximum at $p=1/2$ and is zero for $p=0$ or $p=1\,.$ Intuitively, this means that when $p=0$ or $p=1$ your system is not random and the observation of an outcome gives you zero extra information. It gives you the most information when the system is most random: $p=1/2\,.$
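A quick numerical check of the two-outcome case (self-contained sketch; `h2` is a hypothetical helper name):

    import math

    def h2(p):
        # Binary entropy in bits; taken as 0 at p = 0 or p = 1.
        if p in (0.0, 1.0):
            return 0.0
        return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    for p in (0.0, 0.1, 0.5, 0.9, 1.0):
        print(p, h2(p))  # maximum of 1 bit at p = 0.5, zero at the endpoints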

If you assume that the $10^4$ possible passwords are all equally likely, i.e. $p_i=1/n=10^{-4}\,,$ then the entropy of that system is $$ H\approx 13.288\text{ bits}\,. $$ With $\log_{10}$ instead we get $$ H=4\text{ dits} $$ which makes a lot of sense because you have to give me four digits of information so that I can surely break into your locker.
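For a uniform distribution the sum collapses to $H=\log_b(n)\,,$ so the numbers are easy to verify (a sketch):

    import math

    n = 10 ** 4
    print(math.log2(n))   # 13.2877... bits
    print(math.log10(n))  # 4.0 dits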

Kurt G.