Given a discrete random variable $X$, which takes values in the alphabet $\mathcal {X}$ and is distributed according to $p:{\mathcal {X}}\to [0,1]$, the Shannon entropy is defined as: $$\mathrm {H} (X):=-\sum _{x\in {\mathcal {X}}}p(x)\log p(x)$$ As we know (see e.g. Prove the maximum value of entropy function), the maximum value of the Shannon entropy is $\log N$, where $N=\operatorname{card}(\mathcal X)$, and it is attained by the uniform distribution.
The Rényi entropy of order $\alpha$, where $\alpha \geq 0$ and $\alpha \neq 1$, is defined as $$\mathrm {H} _{\alpha }(X)={\frac {1}{1-\alpha }}\log {\Bigg (}\sum _{i=1}^{N}p_{i}^{\alpha }{\Bigg )}$$ where $p_i := p(x_i)$. My question is: is it possible to find an upper bound for the Rényi entropy as well?
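As a quick numerical sanity check of the question, here is a short sketch (my own illustration, not part of the question) computing both entropies directly from the definitions above. For the uniform distribution on $N$ points one has $\sum_i p_i^\alpha = N\cdot N^{-\alpha}$, so $\mathrm H_\alpha = \log N$ for every admissible $\alpha$, matching the Shannon maximum:

```python
import math

def shannon_entropy(p):
    # H(X) = -sum_x p(x) log p(x), natural log; skip zero-probability outcomes
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def renyi_entropy(p, alpha):
    # H_alpha(X) = (1/(1-alpha)) * log(sum_i p_i^alpha), alpha >= 0, alpha != 1
    return math.log(sum(pi ** alpha for pi in p)) / (1 - alpha)

N = 8
uniform = [1 / N] * N

# All three evaluate to log(8) ~ 2.0794 for the uniform distribution
print(shannon_entropy(uniform))
print(renyi_entropy(uniform, 2))
print(renyi_entropy(uniform, 0.5))
```

This only shows that the uniform distribution achieves $\log N$ for each $\alpha$; whether $\log N$ is the maximum over all distributions is exactly what the question asks.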