
It is said that the number of "information bits" contained in a certain piece of information can be roughly understood as the number of yes/no questions that would have to be answered in order to transmit it. But isn't this entirely dependent on the knowledge of the receiver (what questions they ask), and if so, how could one ever talk about the "objective" number of bits in a certain piece of information?

In theory, it seems to me that any amount of information could be transmitted in a single bit using the right (possibly very long) yes/no question. Am I missing something fundamental here?

Thank you in advance!

RobPratt
  • Suppose we have a five-bit number and I send you one bit. What clever question can a smarter receiver ask to avoid inquiring about the next four bits? – John Douma Mar 03 '24 at 20:57
  • @JohnDouma The clever question would then be: "Is it [the specific five-bit number]?" 1 for yes, 0 for no.

    Is this not possible? Why not?

    – Xerxes123 Mar 03 '24 at 22:06
  • Entropy is about distinguishing an unknown value from a range of possibilities (with probabilities attached). If you know the only option is one specific 5-bit number, then you don’t need to ask any questions. If all 5-bit numbers are equally likely, then you can’t do better (on average) than asking 5 yes/no questions. If your first question is “Is your number 01101?” then you’ll do worse than 5 questions on average. – Jamie Radcliffe Mar 04 '24 at 01:22
  • @Xerxes123 With only one bit of information, for which number would you ask that question? There are sixteen possibilities. – John Douma Mar 04 '24 at 06:19
  • @JamieRadcliffe Maybe I am very clever and can anticipate the number with high probability? Why is this not allowed? – Xerxes123 Mar 04 '24 at 07:22
  • @JohnDouma Must all possibilities be equally probable from my point of view? Maybe I can anticipate the number with high probability? (See also my answer to Jamie).

    I apologize if I misunderstand something important; I genuinely want to understand this.

    – Xerxes123 Mar 04 '24 at 07:24
  • @Xerxes123 In the case of a five-bit number, we are assuming that each of the $32$ possibilities is equally likely. If you change the probabilities, the entropy will change, but it will have nothing to do with the cleverness of the receiver. That's what you are missing. – John Douma Mar 04 '24 at 08:14
  • @JohnDouma Ok, but is it then correct to say that a given piece of information has no "objective" number of bits? My reasoning is yes (no objective number), since the number of bits depends on the probabilities and probability can be subjective (e.g. if I know that you for some reason almost always transmit the number 7, this number can be transmitted with 1 bit of information when my guess is right, whereas some other person may have no idea about your peculiar 7-habit and therefore require more bits)? – Xerxes123 Mar 04 '24 at 09:10
  • @Xerxes123 If one person doesn't know the probabilities then that person cannot compute the entropy. I suggest you look at the mathematical definition. The probabilities must be known. Entropy is not relative to an observer. I don't know your mathematical background but take a look at the wiki. – John Douma Mar 04 '24 at 09:35
  • @JohnDouma I think this explains it for me: the number of bits is only objective given fixed/known probabilities. In real-world settings this may not always be the case, because there are situations where probabilities are unknown and/or subjective, right? Perhaps this explanation is obvious from the mathematics; I have very limited mathematical background. (See the sketch after these comments for a concrete computation.) – Xerxes123 Mar 04 '24 at 10:47
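
To make the thread above concrete, here is a minimal Python sketch of the computation being discussed. The skewed distribution is made up for illustration, mirroring the "7-habit" example: the number $7$ is sent $90\%$ of the time, and the other $31$ five-bit values share the remaining $10\%$ equally.

```python
import math

def entropy_bits(probs):
    """Shannon entropy in bits: the best achievable *average* number
    of yes/no questions needed to identify one draw from `probs`."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Uniform prior over all 32 five-bit numbers: no question strategy
# beats 5 questions on average.
uniform = [1 / 32] * 32
print(entropy_bits(uniform))   # 5.0

# Skewed prior (made up for illustration): 7 is sent 90% of the time,
# the other 31 values share the remaining 10% equally.
skewed = [0.9 if v == 7 else 0.1 / 31 for v in range(32)]
print(entropy_bits(skewed))    # ≈ 0.96
```

The drop from $5$ bits to roughly $0.96$ bits comes from the distribution that both parties agree on, not from the cleverness of the receiver, which is exactly the point made in the comments above.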

1 Answer


But isn't this entirely dependent on the knowledge of the receiver

Of course. If the receiver already has full knowledge, then she needs to ask zero questions (not even one!). If the receiver knows that the information has one of two possible outcomes, A or B, each equally probable, then she has to ask just one question. In these two examples, the amount of information she attains after discovering the actual value is, respectively, zero bits and one bit. This is precisely the entropy.
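
In formulas, for a distribution $(p_1, \dots, p_n)$ the entropy is $H = -\sum_i p_i \log_2 p_i$, and the two cases above come out as

$$H(1) = -1 \cdot \log_2 1 = 0 \text{ bits}, \qquad H\left(\tfrac{1}{2}, \tfrac{1}{2}\right) = -\tfrac{1}{2}\log_2\tfrac{1}{2} - \tfrac{1}{2}\log_2\tfrac{1}{2} = 1 \text{ bit}.$$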

This number of bits/questions is "objective" in the sense that, if $n$ persons agree on the a priori knowledge they have about the unknown value (which is modelled by a probability distribution), then they must also agree on the optimal (on average) number of questions they must ask to discover the value.
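
To see why, note that a yes/no questioning strategy is a binary decision tree over the possible values, and a tree minimizing the average number of questions is exactly a Huffman code for the agreed-upon distribution. Here is a minimal Python sketch (the function name and the priors are mine, for illustration; the skewed prior reuses the "mostly 7" example from the comments):

```python
import heapq

def avg_questions(probs):
    """Average number of yes/no questions under an optimal strategy
    (a Huffman tree). Each merge of the two least likely subtrees
    adds one question for every value below the merged node."""
    heap = list(probs)
    heapq.heapify(heap)
    total = 0.0
    while len(heap) > 1:
        a, b = heapq.heappop(heap), heapq.heappop(heap)
        total += a + b
        heapq.heappush(heap, a + b)
    return total

# Anyone who shares the same prior computes the same optimal average.
print(avg_questions([1 / 32] * 32))           # 5.0 questions
print(avg_questions([0.9] + [0.1 / 31] * 31)) # ≈ 1.50 questions
```

The optimum for the skewed prior, about $1.5$ questions, sits above its entropy of about $0.96$ bits because every question must return a whole binary answer; the gap closes only when many values are identified jointly. Either way, the number is fixed by the shared distribution, not by who is asking.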

leonbloy