For questions related to Rényi entropy.
The Rényi entropy of order $\alpha$, where $\alpha \geq 0$ and $\alpha \neq 1$, is defined as $$\displaystyle \mathrm {H} _{\alpha }(X)={\frac {1}{1-\alpha }}\log {\Bigg (}\sum _{i=1}^{n}p_{i}^{\alpha }{\Bigg )}$$ Here, $X$ is a discrete random variable with possible outcomes in the set $\displaystyle {\mathcal {A}}=\{x_{1},x_{2},\dots ,x_{n}\}$ and corresponding probabilities $\displaystyle p_{i}\doteq \Pr(X=x_{i})$ for $i=1,\dots ,n$. The logarithm is conventionally taken to be base $2$, especially in the context of information theory where bits are used.
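The definition translates directly into code. Below is a minimal sketch of the formula above in Python (the function name `renyi_entropy` is illustrative, not a standard library API), using the base-2 logarithm convention:

```python
import math

def renyi_entropy(probs, alpha):
    """Rényi entropy of order alpha (alpha >= 0, alpha != 1), base-2 log.

    probs: sequence of probabilities p_i summing to 1.
    """
    if alpha < 0 or alpha == 1:
        raise ValueError("alpha must satisfy alpha >= 0 and alpha != 1")
    # H_alpha(X) = (1 / (1 - alpha)) * log2( sum_i p_i^alpha )
    return (1.0 / (1.0 - alpha)) * math.log2(sum(p ** alpha for p in probs))

# Sanity check: for a uniform distribution on n outcomes,
# H_alpha = log2(n) for every order alpha.
uniform = [0.25] * 4
print(renyi_entropy(uniform, 0.5))  # 2.0
print(renyi_entropy(uniform, 2.0))  # 2.0
```

For a non-uniform distribution the value depends on $\alpha$: larger orders weight the high-probability outcomes more heavily, so $\mathrm{H}_{\alpha}$ is non-increasing in $\alpha$.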