
I'm reading a paper that refers to the limit of the Rényi entropy as $n$ goes to infinity. It defines it as $H_n(X) = \dfrac{1}{1-n} \log_2 \left( \sum\limits_{i=1}^{N} p_i^n \right)$. It then says that the limit as $n \to \infty$ is $-\log_2(p_1)$. I saw another article that uses the maximum of the $p_i$'s instead of $p_1$. I think this works out fairly easily if all of the $p_i$'s are equal (a uniform distribution), but I have no idea how to prove it for anything other than a uniform distribution. Can anyone show me how it's done?
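(To be concrete about the uniform case: if $p_i = 1/N$ for all $i$, then $\sum_{i=1}^{N} p_i^n = N^{1-n}$, so $$H_n(X) = \frac{1}{1-n}\log_2\left(N^{1-n}\right) = \log_2 N = -\log_2(p_1)$$ for every $n$, and the limit is immediate. It's the non-uniform case I can't handle.)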

Ran G.
Mitchell Kaplan

1 Answer


Suppose $p_1 = \max_i p_i$. We have $$ p_1^n \leq \sum_{i=1}^N p_i^n \leq N p_1^n. $$ Taking $\log_2$ and dividing by $1-n$ (which is negative for $n > 1$, so the inequalities flip), $$ \frac{n \log_2 p_1 + \log_2 N}{1-n} \leq H_n(X) \leq \frac{n \log_2 p_1}{1-n}. $$ As $n \rightarrow \infty$, $\log_2 N/(1-n) \rightarrow 0$ while $n/(1-n) \rightarrow -1$, so both bounds tend to $-\log_2 p_1$, and therefore $H_n(X) \rightarrow -\log_2 p_1$.
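
A quick numerical sanity check of this squeeze argument, as a minimal sketch (not part of the proof; the distribution `p` is just an assumed example):

```python
# Check that H_n(X) approaches -log2(max_i p_i) as n grows.
import numpy as np

# An arbitrary non-uniform distribution, assumed purely for illustration.
p = np.array([0.5, 0.3, 0.15, 0.05])

def renyi_entropy(p, n):
    """Rényi entropy of order n (n != 1), in bits, per the definition in the question."""
    return np.log2(np.sum(p ** n)) / (1 - n)

for n in [2, 5, 10, 100, 1000]:
    print(f"H_{n}(X) = {renyi_entropy(p, n):.6f}")

print(f"-log2(max p_i) = {-np.log2(p.max()):.6f}")  # the claimed limit
```

The printed values decrease toward $-\log_2(0.5) = 1$ bit as $n$ grows, matching the limit in the answer.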

Gilles 'SO- stop being evil'
Yuval Filmus