Consider a sequence of identically distributed real-valued random variables $(X_i)_{i\in\mathbb{N}}$, with $\mathbb{E}\left[X_i\right]=0$ and $\mathbb{E}\left[X_i^2\right]=1$.
Suppose that there exists a constant $c\in(0,1)$ such that $$\mathbb{E}\left[X_iX_j\right]<c^{|i-j|}$$ for all $i,j\in\mathbb{N}$ with $i\neq j$.
Is it true that $$n^{-1/2}\sum_{i=1}^nX_i\overset{d}{\rightarrow}\mathcal{N}(0,1)$$ as $n\rightarrow\infty$?
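(One quick computation, for what it's worth: the covariance bound does at least keep the variance of the normalized sum bounded uniformly in $n$. Writing $S_n=\sum_{i=1}^nX_i$,
$$\operatorname{Var}\left(n^{-1/2}S_n\right)=1+\frac{2}{n}\sum_{1\le i<j\le n}\mathbb{E}\left[X_iX_j\right]\le 1+\frac{2}{n}\sum_{k=1}^{n-1}(n-k)c^k\le 1+\frac{2c}{1-c}=\frac{1+c}{1-c},$$
though since the bound is one-sided this gives only an upper bound on the variance, and says nothing about the limiting distribution.)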
There are many central limit theorems for dependent random variables, but I have been unable to find any quite like this. I am not attached to the specific choice of covariance bound, but would love to find a result of this form that depends on nothing more than first and second moments. The motivation for the question comes in part from this nice thread on the law of large numbers:
Weak Law of Large Numbers for Dependent Random Variables with Bounded Covariance
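For concreteness, here is a minimal simulation sketch of one sequence satisfying the bound: a stationary Gaussian AR(1) process with coefficient $\rho<c$, for which $\mathbb{E}\left[X_iX_j\right]=\rho^{|i-j|}<c^{|i-j|}$. The specific values of $\rho$, $c$, $n$, and the trial count below are illustrative choices, not part of the question.

```python
import numpy as np

rng = np.random.default_rng(0)
c = 0.5          # decay constant from the question (illustrative value)
rho = 0.4        # AR(1) coefficient; rho < c gives E[X_i X_j] = rho**|i-j| < c**|i-j|
n, trials = 5_000, 1_000

# Stationary Gaussian AR(1) with unit variance:
# X_0 ~ N(0, 1),  X_i = rho * X_{i-1} + sqrt(1 - rho^2) * eps_i.
x = np.empty((trials, n))
x[:, 0] = rng.normal(size=trials)
eps = np.sqrt(1.0 - rho**2) * rng.normal(size=(trials, n - 1))
for i in range(1, n):
    x[:, i] = rho * x[:, i - 1] + eps[:, i - 1]

s = x.sum(axis=1) / np.sqrt(n)  # n^{-1/2} * S_n, one value per trial
print("empirical variance of n^{-1/2} S_n:", s.var())
print("(1 + rho) / (1 - rho):            ", (1 + rho) / (1 - rho))
```

For this particular example the empirical variance comes out near $(1+\rho)/(1-\rho)$ rather than $1$, which is part of why I am unsure whether the normalization in the displayed limit is the right one in general.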