$\newcommand\P{\mathbb{P}}\newcommand\E{\mathbb{E}}\newcommand\ol{\overline}$Write $\ol X_i = \tfrac{1}{n} X_i$ for the sample means and $\ol X_{\max} = \max_{1\le i\le k} \ol X_i$. We can compute
$$
\P(\ol X_{\max} > p + t) = \P(\ol X_i > p + t \text{ for some } i=1,\ldots,k) \leq k\P(\ol X_1 > p + t) \leq k\,e^{-2nt^2}
$$
by the union bound and Hoeffding's inequality. Denoting $(x)_+ = \max\{x,0\}$,
this implies
$$
\E \ol X_{\max} \leq p + \E(\ol X_{\max} - p)_+ = p + \int_0^\infty \P(\ol X_{\max} > p + t)\,dt \leq p + k\int_0^\infty e^{-2nt^2}\,dt = p + k\sqrt{\frac{\pi}{8n}},
$$
using the Gaussian integral $\int_0^\infty e^{-at^2}\,dt = \tfrac{1}{2}\sqrt{\pi/a}$ with $a = 2n$.
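The tail bound $\P(\ol X_{\max} > p + t) \leq k\,e^{-2nt^2}$ is easy to probe by simulation. A minimal sketch, assuming for concreteness that the $X_i$ are i.i.d. $\mathrm{Bin}(n,p)$ (the union bound needs no independence across $i$; Hoeffding only needs each $X_i$ to be a sum of $n$ independent $[0,1]$-valued terms with mean $p$):

```python
import numpy as np

# Empirical check of P(Xbar_max > p + t) <= k * exp(-2 n t^2),
# taking X_1, ..., X_k i.i.d. Binomial(n, p) for concreteness.
rng = np.random.default_rng(2)
n, p, k, t = 200, 0.3, 20, 0.1
trials = 100_000

Xbar = rng.binomial(n, p, size=(trials, k)) / n   # the sample means Xbar_i
tail = (Xbar.max(axis=1) > p + t).mean()          # empirical tail probability
bound = k * np.exp(-2 * n * t**2)

print(f"empirical tail = {tail:.4f}, union/Hoeffding bound = {bound:.4f}")
```

The empirical tail should come out well below the bound, which is loose because the union bound ignores how rarely several $\ol X_i$ are simultaneously large.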
In terms of the original variables $X_{\max} = n\ol X_{\max}$, we have
$$
\E X_{\max} \leq np + k\sqrt{\frac{n\pi}{8}} \leq np + \tfrac{2}{3}k\sqrt{n}.
$$
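The expectation bound can be sanity-checked the same way. Again a sketch under the assumption (not stated in the derivation, which is more general) that the $X_i$ are i.i.d. $\mathrm{Bin}(n,p)$:

```python
import numpy as np

# Monte Carlo check of E X_max <= n p + (2/3) k sqrt(n),
# assuming X_1, ..., X_k i.i.d. Binomial(n, p) for concreteness.
rng = np.random.default_rng(0)
n, p, k, trials = 1000, 0.3, 50, 2000

X = rng.binomial(n, p, size=(trials, k))   # each row: one draw of X_1, ..., X_k
emp_max = X.max(axis=1).mean()             # Monte Carlo estimate of E X_max
bound = n * p + (2 / 3) * k * np.sqrt(n)

print(f"estimated E X_max = {emp_max:.1f}, bound = {bound:.1f}")
```

In this regime the bound sits far above the empirical value, consistent with the looseness in $k$ discussed next.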
Note that this is much weaker in $k$ (growing linearly in $k$ rather than like $\sqrt{\log k}$) than the asymptotically correct bound
$$
\E X_{\max} \asymp np + \sqrt{2p(1-p)} \sqrt{n\log k}
$$
based on a normal approximation and the fact that for i.i.d. $Z_1,\ldots,Z_k\sim\mathcal{N}(0,1)$ one has $\E Z_{\max} \asymp \sqrt{2\log k}$; it does, however, give the right dependence on $n$.
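The gap between the two rates is easy to see numerically. The sketch below (same i.i.d. $\mathrm{Bin}(n,p)$ assumption as above) compares a Monte Carlo estimate of $\E X_{\max}$ with both expressions as $k$ grows:

```python
import numpy as np

# Compare the Hoeffding/union bound n p + (2/3) k sqrt(n) with the
# normal-approximation rate n p + sqrt(2 p (1-p) n log k),
# again taking X_1, ..., X_k i.i.d. Binomial(n, p).
rng = np.random.default_rng(1)
n, p, trials = 1000, 0.3, 2000

for k in (10, 100, 1000):
    X = rng.binomial(n, p, size=(trials, k))
    emp = X.max(axis=1).mean()
    hoeffding = n * p + (2 / 3) * k * np.sqrt(n)
    normal = n * p + np.sqrt(2 * p * (1 - p) * n * np.log(k))
    print(f"k={k:5d}  empirical ~ {emp:7.1f}  "
          f"normal rate ~ {normal:7.1f}  Hoeffding ~ {hoeffding:8.1f}")
```

The empirical expectation tracks the $\sqrt{\log k}$ rate, while the Hoeffding/union bound blows up linearly in $k$.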