I found one approach, although it gives a $1/(np)$ correction term rather than $1/n$.
According to [1], for any integer-valued random variable $X$ (including a Binomial) with variance $V$, we have
\begin{align}
H(X) &\leq \frac{1}{2} \log_2 \left[ 2\pi e \left(V + \frac{1}{12}\right) \right] .
\end{align}
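As a quick numerical sanity check of this bound (a minimal sketch; SciPy is assumed, and Poisson$(\lambda = 5)$ is just an arbitrary choice of integer-valued variable):

```python
# Check Massey's bound [1] numerically: exact entropy vs. the bound.
# A sketch only; SciPy is assumed, and the Poisson example is arbitrary.
import numpy as np
from scipy.stats import poisson

lam = 5.0
k = np.arange(200)                      # truncate the support; the tail mass is negligible
pmf = poisson.pmf(k, lam)
pmf = pmf[pmf > 0]                      # drop zero-probability terms before taking logs
H_exact = -np.sum(pmf * np.log2(pmf))   # exact entropy in bits, by direct summation
V = lam                                 # Var(Poisson(lam)) = lam
bound = 0.5 * np.log2(2 * np.pi * np.e * (V + 1 / 12))
print(f"H = {H_exact:.6f} bits <= {bound:.6f} bits")
```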
This bound is already very nice and useful, and probably often quite tight for a Binomial, except perhaps in cases where $np = O(1)$ as $n \to \infty$. (The intuition is that Gaussian distributions maximize entropy for a fixed variance, and the Binomial approaches the Gaussian.) But we can also rearrange:
\begin{align}
H(X) &\leq \frac{1}{2} \log_2 \left[ 2\pi e V \left(1 + \frac{1}{12V}\right) \right] \\
&= \frac{1}{2} \log_2 \left[ 2 \pi e V \right] + \frac{1}{2} \log_2 \left[ 1 + \frac{1}{12V} \right] \\
&\leq \frac{1}{2} \log_2 \left[ 2\pi e V \right] + \frac{1}{24V\ln(2)}
\end{align}
using that $\ln(1+x) \leq x$, so $\log_2(1+x) = \ln(1+x)/\ln(2) \leq x/\ln(2)$.
In particular, for a Binomial$(n,p)$, the variance is $V = np(1-p)$, so this gives
$$ H(X) \leq \frac{1}{2} \log_2\left[2\pi e np (1-p) \right] + \frac{1}{24\ln(2) np(1-p)}. $$
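To get a feel for how much the $\ln(1+x) \leq x$ weakening costs, here is a short comparison (again just a sketch with SciPy assumed; $p = 0.3$ is an arbitrary choice) of the exact Binomial entropy, Massey's bound, and the weakened bound above as $n$ grows:

```python
# Compare the exact Binomial entropy against Massey's bound [1] and the
# weakened bound derived above. SciPy is assumed; p = 0.3 is arbitrary.
import numpy as np
from scipy.stats import binom

p = 0.3
for n in (10, 100, 1000, 10000):
    pmf = binom.pmf(np.arange(n + 1), n, p)
    pmf = pmf[pmf > 0]                    # drop zero-probability terms before taking logs
    H = -np.sum(pmf * np.log2(pmf))       # exact entropy in bits
    V = n * p * (1 - p)                   # Binomial variance
    massey = 0.5 * np.log2(2 * np.pi * np.e * (V + 1 / 12))
    weakened = 0.5 * np.log2(2 * np.pi * np.e * V) + 1 / (24 * np.log(2) * V)
    print(f"n={n:6d}  H={H:.6f}  Massey={massey:.6f}  weakened={weakened:.6f}")
```

Both bounds sit just above the exact entropy, and the gap between them (the cost of the $\ln(1+x) \leq x$ step) vanishes as the variance grows.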
Edit: The paper with the best bounds I could find is [2], but those bounds are somewhat complex. The additive $O(1/\text{variance})$ term seems unlikely to be avoidable.
[1] "On the Entropy of Integer-Valued Random Variables". Massey, 1988. http://www.isiweb.ee.ethz.ch/archive/massey_pub/pdf/BI527.pdf
[2] "Sharp Bounds on the Entropy of the Poisson Law and Related Quantities". Adell, Lekuona, Yu. http://arxiv.org/abs/1001.2897