Let $a_n$ be a real-valued sequence. Assume that $a_n\to 0$ as $n\to\infty$. Use this to prove that $\left(1+\frac{a_n}{n}\right)^n\to 1$ as $n\to\infty$.
This proof would be simple if it weren't for that pesky exponent. I suspect I'm supposed to use Bernoulli's inequality here, possibly along with the monotone convergence theorem, but I'm really struggling to make any progress. Any advice?
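In case it helps, here is how far I can get with Bernoulli's inequality $(1+x)^n\ge 1+nx$ (valid for $x\ge -1$). This is only a sketch, and I'm not sure the bounds are applied correctly, so please point out any gaps.

Since $a_n\to 0$, for large $n$ we have $\left|\frac{a_n}{n}\right|<1$, so Bernoulli with $x=\frac{a_n}{n}$ gives the lower bound
$$\left(1+\frac{a_n}{n}\right)^n \ge 1+a_n.$$
For an upper bound, I tried rewriting $1+\frac{a_n}{n}=\left(1-\frac{a_n/n}{1+a_n/n}\right)^{-1}$ and applying Bernoulli inside the parentheses:
$$\left(1+\frac{a_n}{n}\right)^n=\left(1-\frac{a_n/n}{1+a_n/n}\right)^{-n}\le\left(1-\frac{a_n}{1+a_n/n}\right)^{-1},$$
which should be valid once $\frac{a_n}{1+a_n/n}<1$, and that holds for large $n$ because $a_n\to 0$. Both bounds tend to $1$, so the squeeze theorem would seem to finish the proof without needing monotone convergence at all. Is this line of reasoning sound?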