The Kullback–Leibler divergence (KL divergence) between distributions $P,Q$ on an outcome space $\mathcal X$ is defined as $$D_{\text{KL}}(P\parallel Q)=\mathbb E_{x\sim P}\left[\log\dfrac{P(x)}{Q(x)}\right]=\sum_{x\in \mathcal X} P(x)\log\left(\dfrac{P(x)}{Q(x)}\right).$$ I read somewhere that, with a little math (I assume throughout that $\mathcal X$ is the outcome space of a discrete random variable), $$D_{\text{KL}}(P\parallel Q)=0\Longrightarrow P \text{ and } Q \text{ are the exact same distribution.}$$
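For concreteness, here is a small numerical sketch of the definition (the helper name `kl_divergence` is mine, not from any library):

```python
import math

def kl_divergence(p, q):
    """D_KL(P || Q) for discrete distributions given as lists of probabilities."""
    # Terms with p_i = 0 contribute nothing, by the convention 0 * log(0/q) = 0.
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.3, 0.7]
q = [0.5, 0.5]

print(kl_divergence(p, p))  # 0.0: identical distributions
print(kl_divergence(p, q))  # strictly positive: the distributions differ
```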
I'm not sure "the exact same distribution" is the right term, but I mean this: for a discrete random variable $X$, writing $P(X=x),Q(X=x)$ for the probabilities under the distributions $P,Q$ respectively, $$P(X=x)=Q(X=x)\quad\forall x\in\mathcal X.$$ I tried to prove this claim, but I think I only managed it for rational probabilities. How can we prove it in all cases? Below I write out the case of two possible outcomes, i.e. $\#\mathcal X=2$; the method extends to any finite number of outcomes.
Therefore, let $P_1,P_2$ and $Q_1,Q_2$ be the probabilities, with $P_1+P_2=1=Q_1+Q_2$. Then $$D_{\text{KL}}(P\parallel Q)=P_1\log\dfrac{P_1}{Q_1}+P_2\log\dfrac{P_2}{Q_2},$$
$$D_{\text{KL}}(P\parallel Q)=0\Longleftrightarrow\log\left[\left(\dfrac{P_1}{Q_1}\right)^{P_1}\left(\dfrac{P_2}{Q_2}\right)^{P_2}\right]=0\Longleftrightarrow P_1^{P_1}P_2^{P_2}=Q_1^{P_1}Q_2^{P_2}.$$ If we write $P_1=\frac{r}{q}$ with integers $0<r<q$ (so $P_2=\frac{q-r}{q}$), then raising the last equality to the $q$-th power and rearranging, the right-hand side satisfies, by the AM–GM inequality,
$$\dfrac{1}{q^q}=\dfrac{Q_1^rQ_2^{q-r}}{r^r(q-r)^{q-r}}\le \left(\dfrac{\overbrace{\dfrac{Q_1}{r}+\dfrac{Q_1}{r}+\cdots+\dfrac{Q_1}{r}}^{r}+\overbrace{\dfrac{Q_2}{q-r}+\dfrac{Q_2}{q-r}+\cdots+\dfrac{Q_2}{q-r}}^{q-r}}{q}\right)^q=\left(\dfrac{Q_1+Q_2}{q}\right)^q=\dfrac{1}{q^q}$$
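A quick numerical check of this chain for a concrete rational case (say $r=1$, $q=3$, so $P_1=1/3$); the variable names mirror the symbols above:

```python
import math

r, q = 1, 3            # P_1 = r/q = 1/3, P_2 = (q - r)/q = 2/3
lhs = 1 / q**q         # 1 / q^q

def ratio(Q1):
    """Q_1^r * Q_2^(q-r) / (r^r * (q-r)^(q-r)) with Q_2 = 1 - Q_1."""
    Q2 = 1 - Q1
    return Q1**r * Q2**(q - r) / (r**r * (q - r)**(q - r))

# The AM-GM bound: ratio(Q1) <= 1/q^q for every Q1 in (0, 1) ...
assert all(ratio(Q1) <= lhs + 1e-15 for Q1 in [i / 100 for i in range(1, 100)])
# ... with equality exactly at Q1 = r/q = P1.
assert math.isclose(ratio(r / q), lhs)
```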
Hence, by the equality condition of AM–GM we obtain $Q_1/r=Q_2/(q-r)$, which together with $Q_1+Q_2=1$ gives $Q_1=r/q=P_1$ and $Q_2=P_2$. How can we show that this also holds when the probabilities are irrational? I don't know how to apply the inequality in that case.
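As a numerical sanity check (not a proof), the claim does appear to hold for an irrational probability such as $P_1=1/\sqrt 2$: scanning the two-outcome KL divergence over a fine grid of $Q_1$ values, the minimum (where the divergence vanishes) lands back at $Q_1=P_1$:

```python
import math

p1 = 1 / math.sqrt(2)          # an irrational probability
p = [p1, 1 - p1]

def kl(p, q):
    """Two-outcome KL divergence, assuming all entries are strictly positive."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

# Scan Q1 over a fine grid in (0, 1); the KL divergence is strictly
# positive everywhere except in the immediate neighbourhood of Q1 = P1.
best_q1 = min((i / 10000 for i in range(1, 10000)),
              key=lambda q1: kl(p, [q1, 1 - q1]))
print(best_q1)                 # close to 1/sqrt(2) ~ 0.7071
```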