Suppose $f''(x) \neq 0$ for all $x$. Then $f'$ must be injective (otherwise
the mean value theorem shows that $f''(\xi)=0$ at some point). Since $f'$ is continuous and injective, it must be strictly increasing or strictly decreasing. Replacing $f$ by $-f$ if necessary, we may assume that $f'$ is strictly increasing. In particular, $f''(x) \ge 0$ for all $x$, and since $f''$ never vanishes, $f''(x) > 0$ for all $x$,
and so $f$ is convex.
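To spell out the injectivity step used above: if $f'(x_1) = f'(x_2)$ with $x_1 < x_2$, then the mean value theorem applied to $f'$ on $[x_1,x_2]$ produces a point $\xi \in (x_1,x_2)$ with
\[
f''(\xi) \;=\; \frac{f'(x_2)-f'(x_1)}{x_2-x_1} \;=\; 0,
\]
contradicting the assumption that $f''$ never vanishes.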
Since $f$ is convex, the sublevel set $f^{-1}((-\infty,f(a)])$ is convex and contains
$[a,\infty)$, so $f(x) \le f(a)$ for all $x \ge a$. Since $f$ is not constant (otherwise $f''(x) = 0$ everywhere), $f$ has a global minimum at some point $\hat{x} \in (a,\infty)$, and $f(y) > f(\hat{x})$ for any $y > \hat{x}$: indeed $f'(\hat{x}) = 0$ at the interior minimum, and $f'$ is strictly increasing, so $f$ is strictly increasing on $(\hat{x},\infty)$. Fix one such $y$ and take $z > y$; then
$y = (1-t)\hat{x}+t z$ for some $t \in (0,1)$, and since $f$ is convex,
$f(y) \le (1-t)f(\hat{x})+t f(z)$. Solving for $t$ and rearranging gives
$f(z) \ge \left( \frac{ z-\hat{x} }{ y-\hat{x} } \right) (f(y)-f(\hat{x})) +f(\hat{x})$, and since $f(y)-f(\hat{x}) > 0$ it follows that $\lim_{z \to \infty} f(z) = \infty$, contradicting $f(z) \le f(a)$ for all $z \ge a$.
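In more detail, the "solving for $t$" step runs as follows: from $y = (1-t)\hat{x} + t z$ we get $t = \frac{y-\hat{x}}{z-\hat{x}}$, and the convexity inequality rearranges to
\begin{align*}
f(y) &\le (1-t)\,f(\hat{x}) + t\,f(z) \\
\Longrightarrow\quad f(z) &\ge f(\hat{x}) + \frac{f(y)-f(\hat{x})}{t}
\;=\; f(\hat{x}) + \left(\frac{z-\hat{x}}{y-\hat{x}}\right)\bigl(f(y)-f(\hat{x})\bigr),
\end{align*}
and since $f(y)-f(\hat{x})>0$, the right-hand side tends to $+\infty$ as $z \to \infty$.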
Hence we must have $f''(x) =0$ at some point.