I was thinking about the entropy of a binomial distribution. The Wikipedia page on the binomial distribution states that the entropy of $\mathrm{Binomial}(n,p)$ is asymptotically $\frac{1}{2} \log _{2}(2 \pi e n p(1-p))+O\left(\frac{1}{n}\right)$.
I noted that in the non-asymptotic regime, there exist lower and upper bounds on the binomial entropy, as can be seen here.
I was wondering: is the binomial entropy an increasing (strictly speaking, non-decreasing) function of $n$ in general? My intuitive answer is yes, but I can't quite prove it. Can someone help me prove this?
My attempt: for independent $X,Y$, expanding $H(Y, X+Y)$ by the chain rule in two ways gives $H(X+Y)=H(X)+H(Y)-H(Y \mid X+Y)$, using $H(X+Y \mid Y)=H(X \mid Y)=H(X)$. Since conditioning reduces entropy, $H(Y \mid X+Y)\le H(Y)$, hence $H(X+Y)\ge H(X)$. Now take $X=S_n\sim\mathrm{Binomial}(n,p)$ and $Y$ an independent $\mathrm{Bernoulli}(p)$, so that $H(S_{n+1})\ge H(S_n)$, and iterate over $n$. Is this right?
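As a numerical sanity check (not a proof), one can compute the binomial entropy directly from the pmf and verify that it never decreases as $n$ grows. A minimal sketch in Python, summing over the full support (the function name `binomial_entropy` is my own):

```python
import math

def binomial_entropy(n, p):
    """Shannon entropy (in bits) of Binomial(n, p), computed from the pmf."""
    h = 0.0
    for k in range(n + 1):
        pk = math.comb(n, k) * p**k * (1 - p)**(n - k)
        if pk > 0:  # skip zero-probability terms (p = 0 or 1 edge cases)
            h -= pk * math.log2(pk)
    return h

# Check non-decreasing behavior in n for a few values of p.
for p in (0.1, 0.3, 0.5):
    hs = [binomial_entropy(n, p) for n in range(1, 60)]
    assert all(h2 >= h1 for h1, h2 in zip(hs, hs[1:])), f"decrease at p={p}"
    print(f"p={p}: entropy is non-decreasing for n = 1..59")
```

The differences $H(S_{n+1})-H(S_n)$ behave roughly like $\frac{1}{2}\log_2\frac{n+1}{n}$ for moderate $n$, consistent with the asymptotic formula above, so they stay well above floating-point noise in this range.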