I find Borel's theorem, which asserts that almost all real numbers are normal, very counterintuitive. Consider the interval [0,1) and imagine a number in it written out by its infinite decimal expansion [let's assume base 10]: 0.xxxxxxx....., where each x is one digit of the expansion (a digit from 0..9).
Then it seems obvious to me that most of the numbers in the interval do not have the distribution properties of a normal number (namely, each digit 0..9 shows up 1/10 of the time, each block 00..99 shows up 1/100 of the time, etc.).
Here is a simple experiment. Take, let's say, only 10 decimal digits, 0.xxxxxxxxxx, and fill the digits with all possible values: we get the numbers 0.0000000000, 0.0000000001, 0.0000000002, etc., up to 0.9999999999. By simple counting, we see that the numbers whose digits are distributed the way a normal number's should be form a very small subset of all possible numbers (they are only the permutations of 0.0123456789).
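To make that counting concrete, here is a small Python sketch of my reasoning (the choice of 10 digits and the restriction to strings that use every digit exactly once are just my assumptions from the example above):

```python
from math import factorial

total = 10**10            # all possible 10-digit decimal strings 0.xxxxxxxxxx
balanced = factorial(10)  # strings in which every digit 0..9 appears exactly once
print(balanced, total, balanced / total)
# 3628800 10000000000 0.00036288  -- a tiny fraction of all strings
```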
This is even easier to see in base 2. Suppose we list all possible combinations of 2 binary digits, 0.xx; there are 4 possibilities: 0.00, 0.01, 0.10, 0.11.
Only half of them (0.01 and 0.10) are 'on the way to producing a normal number'.
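Here is a rough Python sketch of the same counting in base 2 (my own illustration, just brute-forcing small string lengths n; `balanced_fraction` is a name I made up):

```python
from itertools import product

def balanced_fraction(n):
    """Fraction of length-n binary strings with exactly n/2 zeros and n/2 ones."""
    total = 0
    balanced = 0
    for s in product("01", repeat=n):  # enumerate all 2**n strings
        total += 1
        if s.count("0") == n // 2:
            balanced += 1
    return balanced / total

for n in (2, 4, 10, 20):
    print(n, balanced_fraction(n))
# the fraction of 'balanced' strings shrinks: 0.5, 0.375, ~0.246, ~0.176
```

As far as I can tell, the fraction of strings with perfectly even digit counts only gets smaller as n grows, which is exactly what makes the theorem feel wrong to me.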
I know that normality is defined in terms of infinitely many digits, but I cannot see how letting the number of digits go to infinity overcomes the phenomenon described above. That is, 0.xxxxxxx..... should not have nicely distributed digits most of the time.
Could you please point out where I am wrong?