This is a follow-up to "Does Newton's forward difference formula / Newton's interpolation formula yield well-defined functions (in one variable) if it is convergent?".
Suppose we are given a sequence $a_{0}, a_{1}, a_{2}, \ldots$ of real numbers and we define $f:[0, \infty)\rightarrow\mathbb{R}$ by
$$ f(k + \alpha) = \sum_{n=0}^{\infty} \frac{\alpha^{\underline{n}}}{n!} \Delta^{n} a_{k} = \sum_{n=0}^{\infty} \binom{\alpha}{n} \Delta^{n} a_{k} $$
where $\Delta a_{k} = a_{k+1} - a_{k}$, $\Delta^{2} a_{k} = a_{k+2} - 2a_{k+1} + a_{k}$, and in general $\Delta^{n} a_{k} = \sum_{j=0}^{n} (-1)^{n-j} \binom{n}{j} a_{k+j}$.
If Newton's forward difference series above converges pointwise for every $k+\alpha\ge 0$, is $f$ necessarily analytic on $[0, \infty)$? That is, does every point $x_{0}\in[0, \infty)$ have a neighborhood $U$ on which $f|_{U}$ is given by its Taylor series about $x_{0}$?
If the answer is yes, is there a proof? If the answer is no, is there a counterexample? (If counterexamples exist, I would prefer to see the most pathological one available, but any example would do. The tricky part is showing that the series actually converges for a candidate function.)
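For experimenting with candidate sequences numerically, here is a short Python sketch (the function names are my own, not from any reference). It can be sanity-checked on $a_{k} = r^{k}$, for which $\Delta^{n} a_{k} = (r-1)^{n} r^{k}$, so when $|r-1| < 1$ the series is the binomial series for $(1 + (r-1))^{\alpha}$ and converges to $r^{k+\alpha}$:

```python
def forward_diffs(a, max_n):
    """Return [a_0, Δa_0, Δ²a_0, ..., Δ^(max_n-1) a_0] by iterated differencing."""
    diffs, cur = [], list(a)
    for _ in range(max_n):
        diffs.append(cur[0])
        cur = [cur[i + 1] - cur[i] for i in range(len(cur) - 1)]
    return diffs

def newton_series(diffs, alpha):
    """Partial sum of sum_n binom(alpha, n) * Δ^n a_0."""
    total, binom = 0.0, 1.0  # binom(alpha, 0) = 1
    for n, d in enumerate(diffs):
        total += binom * d
        binom *= (alpha - n) / (n + 1)  # binom(alpha, n+1) from binom(alpha, n)
    return total

# Sanity check on a_k = r^k with |r - 1| < 1, where the series sums to r^alpha.
# Iterated differencing is numerically ill-conditioned for large n, so keep
# max_n modest in double precision.
r = 1.25
a = [r ** k for k in range(40)]
diffs = forward_diffs(a, 25)
print(newton_series(diffs, 0.5))  # ≈ r**0.5 ≈ 1.1180
```

Of course this only probes convergence at finitely many points and says nothing about analyticity; it is just a way to test a candidate before attempting a proof.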
How much is known about this question? Are there references that discuss this?