The entropy of a uniform distribution is $\ln(b-a)$. With $a=0$ and $b=1$ this reduces to zero. How come there is no uncertainty?
1 Answer
Continuous (differential) entropy doesn't have quite the same meaning as discrete entropy. For example, we could also take $a = 0$ and $b = 1/2$, giving entropy $\ln(1/2) = -\ln(2) < 0$, whereas in the discrete case entropy is always non-negative. Much of the difference comes from the fact that a probability density function (pdf) can be greater than one, provided this happens on a set of measure (size) less than 1, so that the integral is still 1.
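For a quick numerical sanity check, here is a minimal Python sketch (the function name and grid size are just illustrative assumptions): it approximates $-\int f(x)\ln f(x)\,dx$ for the uniform density and shows the entropy going negative as soon as $b - a < 1$.

```python
import numpy as np

def differential_entropy_uniform(a, b, n=10_000):
    """Numerically approximate -integral of f(x) ln f(x) dx for Uniform(a, b)."""
    x = np.linspace(a, b, n)
    f = np.full_like(x, 1.0 / (b - a))   # constant density; exceeds 1 whenever b - a < 1
    return np.trapz(-f * np.log(f), x)

print(differential_entropy_uniform(0, 1))    # ~  0.0    = ln(1)
print(differential_entropy_uniform(0, 0.5))  # ~ -0.6931 = ln(1/2) = -ln(2)
```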
Check out the WolframAlpha entry on it: Differential Entropy. Also, here is the Wikipedia entry on it: Differential Entropy.
Compare this with the discrete distribution: suppose we have $P(X = x_n) = 1/N$, where $X$ takes the values $\{ x_1, \dots, x_N \}$. This gives entropy $$H(X) = -\sum_{n=1}^N P(X=x_n) \log_2 P(X = x_n) = -\sum_{n=1}^N {1 \over N} \log_2 {1 \over N} = N \cdot {1 \over N} \log_2 N = \log_2 N.$$ Note that this is actually the maximal value of the entropy over all distributions on $N$ outcomes - this can be shown using Gibbs' inequality, or just by finding the maximum of the function $f(x) = -x \ln x$ (e.g. by differentiating and solving $f'(x) = 0$), and observing that $$\log_2 x = {\ln x \over \ln 2}.$$
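Here is a small Python sketch of the discrete case (the helper name and the particular skewed distribution are just assumptions for illustration): the uniform distribution on $N$ outcomes hits $\log_2 N$, and another distribution on the same outcomes comes in strictly below it.

```python
import numpy as np

def discrete_entropy(p):
    """Shannon entropy H(X) = -sum_n p_n log2(p_n); zero-probability outcomes contribute 0."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

N = 8
uniform = np.full(N, 1.0 / N)
skewed = np.array([0.5, 0.2, 0.1, 0.1, 0.05, 0.03, 0.01, 0.01])  # sums to 1

print(discrete_entropy(uniform))  # 3.0  = log2(8), the maximum for 8 outcomes
print(discrete_entropy(skewed))   # ~2.13, strictly less than log2(8)
```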
Hope this helps! If it does, remember to upvote! ;)
- Great, I'll upvote you once I have some reputation. – log2 Feb 19 '15 at 18:45
- Nice one, thanks. It's a good question. I hadn't thought about it before. In fact, I started writing the answer quite differently, aiming to show that you'd got the entropy wrong! I did the discrete case, then was going to say "and the continuous case follows similarly", but thought that I'd just do it anyway as it's easy. I then realised what you did! Very interesting! :) – Sam OT Feb 19 '15 at 22:21
- Awesome answer! Very informative :) I'm actually here for the discrete formula, so I'm glad you included it! – drevicko Aug 03 '22 at 04:41