
For a bounded sequence $(a_k)_{k=1}^\infty$ define the sequence $(A_j)_{j=1}^\infty$ by $$ A_j = \sum_{k=1}^\infty \frac{a_k}{k^2+j^4}. $$ It is elementary to check that (I cheated and used Mathematica) $$ |A_j| \leq \|a\|_\infty \sum_{k=1}^\infty \frac{1}{k^2+j^4} = \|a\|_\infty \frac{-1+j^2 \pi \coth(j^2 \pi)}{2j^4}, $$ so that $$ A_j = O\left( \frac{1}{j^2} \right). $$ This bound is surely not sharp, and with the oscillating sequence $a_k = (-1)^k$ we obtain $$ A_j \sim \frac{-1}{2j^4}, \quad\text{i.e.}\quad A_j = \Theta\left( \frac{1}{j^{4}} \right). $$ However I am not able to go beyond $1/j^4$ (with a sequence that is not identically zero).

Question: Can one find a sequence $(a_k)_{k=1}^\infty$, not identically zero, so that $$ A_j = O\left( \frac{1}{j^{4+\epsilon}} \right), $$ for some $\epsilon > 0$?

Trying with $a_k = (-1)^k k^{-n}$ I obtain that $$ A_j = \Theta\left( \frac{1}{j^{4}} \right). $$ For $a_k = (-1)^ke^{-k}$ I obtain $$ A_j = \frac{-1}{2ej^4}(\alpha_j + \beta_j) + \frac{i}{2ej^6}(\alpha_j-\beta_j) $$ where the complex numbers $\alpha_j$ and $\beta_j$ are defined by $$ \alpha_j = {}_2F_1(1,1-ij^2,2-ij^2,-1/e),\quad \beta_j = {}_2F_1(1,1+ij^2,2+ij^2,-1/e) . $$ The function ${}_2F_1$ is the hypergeometric function, and it can be shown that the sequences $(\alpha_j)$ and $(\beta_j)$ are bounded. However it is not clear to me whether $\alpha_j + \beta_j$ is zero or not.
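To make the $a_k=(-1)^k$ asymptotic concrete, here is a quick numerical sanity check (my own throwaway script; the helper `A` and the truncation length are ad hoc, not part of the question):

```python
# Truncated evaluation of A_j = sum_{k>=1} a_k / (k^2 + j^4).
def A(j, a, terms=200_000):
    return sum(a(k) / (k * k + j ** 4) for k in range(1, terms + 1))

# For the oscillating sequence a_k = (-1)^k we expect j^4 * A_j -> -1/2.
for j in (5, 10, 20):
    print(j, j ** 4 * A(j, lambda k: (-1) ** k))
```

The alternating-series remainder bounds the truncation error well below the digits printed here.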

blamethelag
  • What about if you choose $a_{2k-1}=1/(4k-1)=-a_{2k}$; then together the $(2k-1)$-th and $(2k)$-th terms add to $1/\big(((2k-1)^2+j^4)((2k)^2+j^4)\big)$, so majorizing this by $1/j^8$ when $k<j^2$ and by $1/k^4$ when $k\ge j^2$ should give you an $O(1/j^6)$ – Conrad Oct 03 '24 at 18:03
  • If the bound $\sim\frac{e^{-\alpha j^2}}{j^2}$ meets your requirement, you can consider $a_k=\cos(b_1k)-\cos(b_2k)$, $b_{1,2}<\pi$. Then $$\sum_{k=1}^\infty\frac{\cos(b_1k)-\cos(b_2k)}{k^2+j^4}=\frac{\pi\big(\cosh \big(j^2(\pi-b_1)\big)-\cosh \big(j^2(\pi-b_2)\big)\big)} {j^2\sinh(\pi j^2)}$$ – Svyatoslav Oct 03 '24 at 18:06
  • @Conrad This works well, I am able to go one step further and get $O(1/j^{10})$. It is not clear if this method yields arbitrary polynomial decay and I am investigating it. – blamethelag Oct 04 '24 at 09:31
  • @Svyatoslav I am not sure I understand how you got your formula. I beg to differ, as the LHS is $2 \pi$ periodic with respect to $b_1$ while the RHS is not. Also, what is the need to introduce two parameters? The numerator of the RHS should be equivalent to one of the two terms being summed (hence I cannot see the impact of the other). – blamethelag Oct 04 '24 at 09:33
  • There was a typo in the formula: I missed a factor of $\frac12$. My apologies. The correct formula is $$\sum_{k=1}^\infty\frac{\cos(bk)}{k^2+j^4}=\frac{\pi}{2j^2}\frac{\cosh\big(j^2(\pi-b)\big)}{\sinh(\pi j^2)}-\frac1{2j^4},\quad b\in[0;2\pi].$$ You are not allowed to take $b>2\pi$, but you can easily check the formula numerically. The formula can be straightforwardly obtained by means of complex integration. – Svyatoslav Oct 04 '24 at 14:53
  • The reason I took the difference of two sums was to cancel the $\frac1{j^4}$ term, to make the sum exponentially small as $j\to \infty$. As I do not know the goal and logic of your investigation, I tentatively proposed the sequence to make the decay faster than any power of $j$ – Svyatoslav Oct 04 '24 at 14:53
  • @Svyatoslav I am not able to check your formula using Mathematica and I am very weak in complex analysis. Could you give some hints on how to compute the series using contour integrals? – blamethelag Oct 04 '24 at 17:07
  • @blamethelag Based on the work in my answer, your example $a_k=(-1)^k e^{-k}$ is guaranteed to have $A_j$ asymptotic to $-j^{-4}/(e+1)$. Presuming our derivations are correct and that $\alpha_j,\beta_j$ are bounded as you say, then necessarily $\alpha_j+\beta_j\to 2e/(e+1)$. – Jade Vanadium Oct 06 '24 at 18:08
  • @Jade Vanadium That is why I take the sequence $a_k=\cos(b_1k)-\cos(b_2k)$, which provides the cancellation. Please read the comments above. – Svyatoslav Oct 06 '24 at 18:22
  • @Svyatoslav Apologies, you are correct. I also made time to verify your derivation, which is correct despite some typos (see my edit). Your example has $A_j$ achieving super-polynomial decay for appropriate selections of $b_1, b_2$. – Jade Vanadium Oct 06 '24 at 19:47
  • @Jade Vanadium Thank you for the corrections! Typos are my real problem. – Svyatoslav Oct 07 '24 at 01:00

2 Answers


Too long for a comment

Let's take $a>0$ and $b\in[0;2\pi]$. Then $$S(a,b)=\sum_{k=1}^\infty\frac{\cos(bk)}{k^2+a^2}=\frac12\sum_{k=-\infty,\,k\neq0}^\infty\frac{e^{ibk}}{k^2+a^2}=\frac12\left(\sum_{k=-\infty}^\infty\frac{e^{ibk}}{k^2+a^2}-\frac1{a^2}\right)=\frac12\left(S_0-\frac1{a^2}\right)$$

To evaluate $S_0$ we integrate the function $\frac{2\pi i}{e^{2\pi iz}-1}\frac{e^{ibz}}{z^2+a^2}$ along the rectangular contour $\,C:\,-N-\frac12-iN\to N+\frac12-iN\to N+\frac12+iN\to-N-\frac12+iN\to-N-\frac12-iN$. The integral along this contour tends to zero as $N\to \infty$. On the other hand, $$\oint_C\frac{2\pi i}{e^{2\pi iz}-1}\frac{e^{ibz}}{z^2+a^2}dz=2\pi i\sum \operatorname{Res}\frac{2\pi i}{e^{2\pi iz}-1}\frac{e^{ibz}}{z^2+a^2}$$

The residues at the poles of the function $\frac{2\pi i}{e^{2\pi iz}-1}$ give us $S_0$; we also have two poles at $z=\pm ia$. Hence, $$S_0=-\underset {z=\pm ia}{\operatorname{Res}}\frac{2\pi i}{e^{2\pi iz}-1}\frac{e^{ibz}}{z^2+a^2}=\frac{2\pi i}{2ia}\left(\frac{e^{-ba}}{1-e^{-2\pi a}}+\frac{e^{ba}}{e^{2\pi a}-1}\right)=\frac\pi a\frac{\cosh(ba-\pi a)}{\sinh(\pi a)}$$ $$S=\frac\pi {2a}\frac{\cosh(ba-\pi a)}{\sinh(\pi a)}-\frac1{2a^2}$$
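The closed form is easy to test numerically; the sketch below (my own check, with ad-hoc function names and an ad-hoc truncation) compares a direct partial sum against the formula for a few $(a,b)$ pairs:

```python
import math

# Direct partial sum of S(a,b) = sum_{k>=1} cos(bk) / (k^2 + a^2).
def S_direct(a, b, terms=100_000):
    return sum(math.cos(b * k) / (k * k + a * a) for k in range(1, terms + 1))

# Closed form: pi/(2a) * cosh(ba - pi*a)/sinh(pi*a) - 1/(2a^2), valid for b in [0, 2*pi].
def S_closed(a, b):
    return (math.pi / (2 * a)) * math.cosh(b * a - math.pi * a) / math.sinh(math.pi * a) \
        - 1 / (2 * a * a)

for a, b in [(1.0, 1.0), (2.0, 0.5), (3.0, 5.0)]:
    print(a, b, S_direct(a, b), S_closed(a, b))
```

For $b$ bounded away from $0$ and $2\pi$, summation by parts bounds the truncation error of `S_direct` by roughly $1/(\text{terms}^2\sin(b/2))$, far below the agreement shown.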

For the numeric check you can use WolframAlpha; for example, here and here. As for a book on complex integration, I would recommend Mathematical Methods for Physicists by George B. Arfken et al., Chapter 11 (Complex Variable Theory), Section 11.9 (Evaluation of Sums).

Good luck!

Jade Vanadium
Svyatoslav
  • Thank you for these explanations, the reference you gave is truly inspiring. However I am not sure I understand how the integral along $C$ vanishes as $N \rightarrow \infty$. In the reference they add the hypothesis that $zf(z)$ goes to $0$ as $z$ goes to $\infty$, which is not satisfied here. Maybe this is why your result is not $2 \pi$ periodic in $b$, while I cannot see where you have used $0 \leq b \leq 2 \pi$ in your computations. – blamethelag Oct 09 '24 at 15:21
  • In the lower and upper half-planes, at $b\in[0;2\pi]$, $$\left|\frac1{e^{2\pi iz}-1}\frac{e^{ibz}}{z^2+a^2}\right|<\left|\frac{\text{const}}{z^2+a^2}\right|$$ and during the integration we avoid the "dangerous" points $z=0,\pm1,\pm2,\dots$ – Svyatoslav Oct 09 '24 at 17:01

The asymptotic behavior of $A_j$ is strongly related to the Dirichlet series with coefficients $a_k$: namely, $A_j$ decays quickly whenever the Dirichlet series $\sum_k a_k k^{-s}$ has many zeros at the nonpositive even integers. Specifically, by requiring $a_k\to 0$ quickly, we can have $A_j = o(j^{-4-4N})$ simply by guaranteeing that the $a_k$ Dirichlet series has zeros whenever $s=-2n$ for $0\leq n\leq N$. This allows us to construct a huge family of $a_k$ sequences, with $A_j\to 0$ decaying very quickly, just by controlling those zeros.

The actual speed of convergence for $A_j\to 0$ seems to be otherwise independent from the asymptotic properties of the $a_k$ sequence, albeit it's easier to construct examples where we assume $a_k\to 0$ quickly. For an example where $A_j\to 0$ has super-polynomial decay, see Svyatoslav's answer.


The heart of this phenomenon comes from two basic facts. Firstly, for any integer $n\geq 0$ such that $\sum_k a_k k^{2n}$ converges absolutely, the following holds by the Dominated Convergence Theorem. $$\lim_{j\to\infty} j^4\cdot \sum_{k=1}^{\infty}\frac{a_k k^{2n}}{k^2+j^4} = \sum_{k=1}^{\infty} a_k k^{2n}\cdot \lim_{j\to\infty} \frac{j^4}{k^2+j^4} = \sum_{k=1}^{\infty} a_k k^{2n}$$
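A small numerical illustration of this limit (my own example, not from the answer: $a_k=2^{-k}$ and $n=1$, for which $\sum_k a_k k^2 = 6$ exactly):

```python
# With a_k = 2^{-k} and n = 1 we have sum_k a_k k^2 = 6, so the
# weighted sums j^4 * sum_k a_k k^2 / (k^2 + j^4) should tend to 6.
def weighted(j, terms=200):
    return j ** 4 * sum(2.0 ** -k * k * k / (k * k + j ** 4)
                        for k in range(1, terms + 1))

for j in (2, 5, 10, 20):
    print(j, weighted(j))
```

The geometric decay of $2^{-k}$ makes 200 terms more than enough; the printed values climb toward $6$ as $j$ grows.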

To actually use this to characterize $A_j$, we need to repeatedly deploy partial fraction decomposition on the expression $\frac{1}{k^2+j^4}$, starting with the fact $\frac{1}{k^2+j^4} = \frac{-k^2}{(k^2+j^4)j^4} + \frac{1}{j^4}$. Continuing with that inductively, we produce the following equality for any $N\geq 0$. $$\frac{1}{k^2+j^4} = \frac{(-k^2/j^4)^{N+1}}{k^2+j^4} + \sum_{n=0}^N (-k^2)^n j^{-4(n+1)}$$
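The identity can be confirmed in exact rational arithmetic; below is a small sketch (my own check, with hypothetical helper names) using Python's `fractions`:

```python
from fractions import Fraction

# Left side: 1 / (k^2 + j^4).
def lhs(k, j):
    return Fraction(1, k * k + j ** 4)

# Right side: (-k^2/j^4)^(N+1) / (k^2 + j^4)  +  sum_{n=0}^{N} (-k^2)^n * j^(-4(n+1)).
def rhs(k, j, N):
    remainder = Fraction(-k * k, j ** 4) ** (N + 1) / (k * k + j ** 4)
    tail = sum(Fraction((-k * k) ** n, j ** (4 * (n + 1))) for n in range(N + 1))
    return remainder + tail

# Exact equality for several k, j, N.
assert all(lhs(k, j) == rhs(k, j, N)
           for k in (1, 3, 7) for j in (2, 5) for N in range(4))
print("partial fraction identity verified")
```

Because `Fraction` arithmetic is exact, equality here is a genuine check of the algebra rather than a floating-point approximation.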

Taking the assumption that $\sum_{k=1}^{\infty} a_k k^{2N}$ converges absolutely, we therefore have the following rearrangement. $$\begin{align} A_j &= \sum_{k=1}^{\infty} a_k \frac{1}{k^2+j^4} \\ &= \sum_{k=1}^\infty a_k \cdot\left(\frac{(-k^2/j^4)^N}{k^2+j^4} - \sum_{n=0}^{N-1} (-j^4)^{-n-1} k^{2n}\right) \\ &= \left(\sum_{k=1}^\infty a_k\frac{(-k^2/j^4)^N}{k^2+j^4}\right) - \sum_{n=0}^{N-1} (-j^4)^{-n-1} \sum_{k=1}^\infty a_k k^{2n} \\ &= o(j^{-4(N+1)}) - \sum_{n=0}^N (-j^4)^{-n-1} \sum_{k=1}^\infty a_k k^{2n} \end{align}$$ In the last line, applying the first fact to the sequence $a_k k^{2N}$ shows that the remainder term equals $-(-j^4)^{-N-1}\sum_{k} a_k k^{2N} + o(j^{-4(N+1)})$, which is why the sum extends from $n=N-1$ to $n=N$.

In the special case that the $a_k$ Dirichlet series has zeros at the even integers $s=-2n$ for $0\leq n\leq N$, that entire right-hand summation vanishes and we simply have $A_j$ bounded by the little-o term $o(j^{-4(N+1)})$. In light of that, we can force $A_j$ to decay very quickly by finding a convergent Dirichlet series with many zeros at the nonpositive even integers. In fact, if we can find a Dirichlet series with zeros at every nonpositive even integer, then the resulting $A_j$ sequence is guaranteed to decay faster than any (nonzero) polynomial/rational function.
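As a concrete toy instance of this (my own example, not from the original answer): take $a_k$ supported on $\{1,2,3\}$ with values $(5,-8,3)$, so that $\sum_k a_k = \sum_k a_k k^2 = 0$ while $\sum_k a_k k^4 = 120$; the expansion above then predicts $A_j = 120\, j^{-12} + O(j^{-16})$.

```python
from fractions import Fraction

# a_k = (5, -8, 3) on k = 1, 2, 3; both low moments vanish.
a = {1: 5, 2: -8, 3: 3}
assert sum(a.values()) == 0                        # Dirichlet series zero at s = 0
assert sum(v * k * k for k, v in a.items()) == 0   # zero at s = -2
m4 = sum(v * k ** 4 for k, v in a.items())         # first nonzero moment: 120

# Exact rational arithmetic avoids the heavy cancellation in A_j.
def A(j):
    return sum(Fraction(v, k * k + j ** 4) for k, v in a.items())

# The expansion predicts j^12 * A_j -> m4 = 120.
for j in (10, 100):
    print(j, float(j ** 12 * A(j)))
```

Using `Fraction` matters here: at $j=100$ the three terms of $A_j$ cancel to about 15 significant digits, which plain floats cannot resolve.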


Constructing sequences obeying the above conditions is extremely easy, provided you only want finitely many zeros. To begin, arbitrarily select fast-decaying linearly independent sequences $C_0,C_1,\cdots,C_{N+1}$, where fast-decaying means that $C_{(\ell,k)}\cdot k^n \to 0$ as $k\to \infty$, for every $n$. A trivial example would be $C_{(\ell,k)} = \delta_{\ell+1,k}$ using the Kronecker delta (shifted so that no $C_\ell$ is the zero sequence), but almost anything works. To each $C_\ell$ we associate a vector $v_\ell$ given by $v_\ell[n] = \sum_{k=1}^{\infty} C_{(\ell,k)}k^{2n}$ for $n\in\{0,1,\cdots,N+1\}$. Each $v_\ell\in\mathbb{R}^{N+2}$, and we have $N+2$ such vectors.

We will almost always find that the $v$ vectors are linearly independent, in which case they form a basis for $\mathbb{R}^{N+2}$. In that case, there's a nontrivial linear combination $u = \sum_{\ell=0}^{N+1} x_\ell v_\ell$ where $u[n]=0$ for $n\neq N+1$ and $u[N+1]=1$. In the rare case that they are dependent, we instead find a nontrivial linear combination where $u=\vec{0}$. In any case, we find a nontrivial linear combination where at least $u[n]=0$ for all $n\neq N+1$. Pulling this back to the $C$ sequences, we get a nontrivial linear combination $a = \sum_{\ell=0}^{N+1} x_\ell C_\ell$, and since the various $C$ sequences were assumed linearly independent, this $a$ is not the zero sequence. We then observe the following fact, for each $n\in\{0,\cdots,N\}$. $$\begin{align} \sum_{k=1}^{\infty} a_k k^{2n} &= \sum_{k=1}^{\infty} \left(\sum_{\ell=0}^{N+1} x_\ell C_{(\ell,k)}\right) k^{2n} \\ &= \sum_{\ell=0}^{N+1} x_\ell \sum_{k=1}^{\infty} C_{(\ell,k)} k^{2n} \\ &= \sum_{\ell=0}^{N+1} x_\ell v_\ell[n] \\ &= u[n] \\ &= 0 \end{align}$$

The constructed $a$ sequence is fast-decaying, by virtue of being a linear combination of the fast-decaying $C_\ell$ sequences, and is therefore subject to the work in the first half of my answer. It follows that the corresponding $A_j = o(j^{-4-4N})$, so we can make $A_j$ decay to $0$ arbitrarily quickly within the polynomial-like hierarchy. The above linear algebra doesn't extend to giving $A_j$ super-polynomial decay; that would likely require more difficult analysis. However, the sheer abundance of solutions suggests that many such examples probably exist.

Jade Vanadium