The asymptotic behavior of $A_j$ is strongly related to the Dirichlet series with coefficients $a_k$: roughly, $A_j$ decays quickly whenever the Dirichlet series $\sum_k a_k k^{-s}$ has zeros at the nonpositive even integers. Specifically, provided $a_k\to 0$ quickly enough, we can guarantee $A_j = o(j^{-4-4N})$ simply by arranging for the Dirichlet series to vanish at $s=-2n$ for every $0\leq n\leq N$. This lets us construct a huge family of sequences $a_k$ whose associated $A_j$ decays very quickly, just by controlling those zeros.
The actual speed of convergence of $A_j\to 0$ appears to be otherwise independent of the asymptotic behavior of the sequence $a_k$, though it is easier to construct examples when we assume $a_k\to 0$ quickly. For an example where $A_j$ has super-polynomial decay, see Svyatoslav's answer.
The heart of this phenomenon comes from two basic facts. First, for any integer $n\geq 0$ such that $\sum_k a_k k^{2n}$ converges absolutely, the Dominated Convergence Theorem gives the following.
$$\lim_{j\to\infty} j^4\cdot \sum_{k=1}^{\infty}\frac{a_k k^{2n}}{k^2+j^4} = \sum_{k=1}^{\infty} a_k k^{2n}\cdot \lim_{j\to\infty} \frac{j^4}{k^2+j^4} = \sum_{k=1}^{\infty} a_k k^{2n}$$
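As a quick numerical sanity check of this limit (a throwaway sketch, not part of the argument), take the arbitrary fast-decaying choice $a_k = 2^{-k}$ with $n=1$, for which $\sum_k a_k k^2 = 6$ exactly:

```python
# Sanity check: j^4 * sum_k a_k k^{2n}/(k^2 + j^4) -> sum_k a_k k^{2n},
# using the arbitrary fast-decaying choice a_k = 2^{-k} and n = 1.
n = 1
K = 200  # truncation point: 2^{-k} is negligible far before this

def a(k):
    return 2.0 ** -k

target = sum(a(k) * k ** (2 * n) for k in range(1, K + 1))
print(target)  # ~6.0, since sum_k k^2 / 2^k = 6

for j in (10, 100, 1000):
    approx = j ** 4 * sum(a(k) * k ** (2 * n) / (k ** 2 + j ** 4)
                          for k in range(1, K + 1))
    print(j, approx)
```

The printed values approach the target as $j$ grows, as the Dominated Convergence argument predicts.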
To actually use this to characterize $A_j$, we repeatedly apply partial fraction decomposition to the expression $\frac{1}{k^2+j^4}$, starting from the identity $\frac{1}{k^2+j^4} = \frac{-k^2}{(k^2+j^4)j^4} + \frac{1}{j^4}$. Iterating this, we obtain the following equality for every $N\geq 0$.
$$\frac{1}{k^2+j^4} = \frac{(-k^2/j^4)^{N+1}}{k^2+j^4} + \sum_{n=0}^N (-k^2)^n j^{-4(n+1)}$$
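The identity can be verified exactly for small parameters with rational arithmetic (again a throwaway check, not part of the proof):

```python
# Exact check of the telescoped partial-fraction identity
#   1/(k^2+j^4) = (-k^2/j^4)^{N+1}/(k^2+j^4) + sum_{n=0}^{N} (-k^2)^n j^{-4(n+1)}
# for a few small (k, j, N), using exact rational arithmetic.
from fractions import Fraction

def lhs(k, j):
    return Fraction(1, k ** 2 + j ** 4)

def rhs(k, j, N):
    remainder = Fraction(-k ** 2, j ** 4) ** (N + 1) / (k ** 2 + j ** 4)
    series = sum(Fraction((-k ** 2) ** n, j ** (4 * (n + 1))) for n in range(N + 1))
    return remainder + series

checks = all(lhs(k, j) == rhs(k, j, N)
             for k in range(1, 6) for j in range(1, 6) for N in range(0, 5))
print(checks)  # True
```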
Assuming that $\sum_{k=1}^{\infty} a_k k^{2N}$ converges absolutely, we therefore have the following rearrangement (using the identity above with $N-1$ in place of $N$).
$$\begin{align}
A_j &= \sum_{k=1}^{\infty} a_k \frac{1}{k^2+j^4} \\
&= \sum_{k=1}^\infty a_k \cdot\left(\frac{(-k^2/j^4)^N}{k^2+j^4} - \sum_{n=0}^{N-1} (-j^4)^{-n-1} k^{2n}\right) \\
&= \left(\sum_{k=1}^\infty a_k\frac{(-k^2/j^4)^N}{k^2+j^4}\right) - \sum_{n=0}^{N-1} (-j^4)^{-n-1} \sum_{k=1}^\infty a_k k^{2n} \\
&= (-1)^N j^{-4(N+1)}\left(j^4 \sum_{k=1}^\infty \frac{a_k k^{2N}}{k^2+j^4}\right) - \sum_{n=0}^{N-1} (-j^4)^{-n-1} \sum_{k=1}^\infty a_k k^{2n} \\
&= o(j^{-4(N+1)}) - \sum_{n=0}^N (-j^4)^{-n-1} \sum_{k=1}^\infty a_k k^{2n}
\end{align}$$
Here the final step uses the limit fact from before: the parenthesized quantity equals $\sum_{k} a_k k^{2N} + o(1)$, and the resulting term $(-1)^N j^{-4(N+1)} \sum_k a_k k^{2N} = -(-j^4)^{-N-1}\sum_k a_k k^{2N}$ is precisely the $n=N$ term, which has been absorbed into the sum.
In the special case that the $a_k$ Dirichlet series has zeros at the even integers $s=-2n$ for $0\leq n\leq N$, the entire right-hand sum vanishes and $A_j$ is simply the little-o term $o(j^{-4(N+1)})$. In light of that, we can force $A_j$ to decay very quickly by finding a convergent Dirichlet series with many zeros at the nonpositive even integers. In fact, if we can find a Dirichlet series with zeros at every nonpositive even integer, then the resulting sequence $A_j$ is guaranteed to decay faster than any (nonzero) polynomial/rational function.
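As a minimal concrete illustration (my own hypothetical example, with $N=0$): take $a_1=1$, $a_2=-1$, and $a_k=0$ otherwise. The Dirichlet series is $1-2^{-s}$, which vanishes at $s=0$, so the expansion predicts $A_j = o(j^{-4})$. Indeed, $A_j = \frac{1}{1+j^4}-\frac{1}{4+j^4} = \frac{3}{(1+j^4)(4+j^4)} \sim 3j^{-8}$, and the constant $3 = -\sum_k a_k k^2$ is exactly the $n=1$ term of the expansion:

```python
# For a_1 = 1, a_2 = -1 (Dirichlet series 1 - 2^{-s}, with a zero at s = 0),
# check that j^8 * A_j -> 3, so in particular A_j = o(j^{-4}).
from fractions import Fraction

def A(j):
    # A_j for the sequence a_1 = 1, a_2 = -1, a_k = 0 otherwise
    return Fraction(1, 1 + j ** 4) - Fraction(1, 4 + j ** 4)

for j in (10, 100, 1000):
    print(j, float(A(j) * j ** 8))  # tends to 3
```

(Exact rational arithmetic is used here only to dodge the catastrophic cancellation in the difference of two nearly equal floats.)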
Constructing functions obeying the above conditions is extremely easy, provided you only want finitely many zeros. To begin, select any fast-decaying, linearly independent sequences $C_0,C_1,\cdots,C_{N+1}$, where fast-decaying means that $C_{(\ell,k)}\cdot k^n \to 0$ as $k\to \infty$ for every $n$. A trivial example is $C_{(\ell,k)} = \delta_{\ell,k}$ using the Kronecker delta, but almost anything works. From each $C_\ell$ we construct a vector $v_\ell$ given by $v_\ell[n] = \sum_{k=1}^{\infty} C_{(\ell,k)}k^{2n}$ for $n\in\{0,1,\cdots,N+1\}$. Each $v_\ell\in\mathbb{R}^{N+2}$, and there are $N+2$ such vectors.
We will almost always find that the $v_\ell$ are linearly independent, in which case they form a basis for $\mathbb{R}^{N+2}$, and there is a nontrivial linear combination $u = \sum_{\ell=0}^{N+1} x_\ell v_\ell$ with $u[n]=0$ for $n\neq N+1$ and $u[N+1]=1$. In the rare case that they are dependent, we instead take a nontrivial linear combination with $u=\vec{0}$. Either way, we obtain a nontrivial linear combination with at least $u[n]=0$ for all $n\neq N+1$. Pulling this back to the $C$ sequences gives the combination $a = \sum_{\ell=0}^{N+1} x_\ell C_\ell$, and since the $C_\ell$ were assumed linearly independent, $a$ is not the zero sequence. We then observe the following, for each $n\in\{0,\cdots,N\}$.
$$\begin{align}
\sum_{k=1}^{\infty} a_k k^{2n} &= \sum_{k=1}^{\infty} \left(\sum_{\ell=0}^{N+1} x_\ell C_{(\ell,k)}\right) k^{2n} \\
&= \sum_{\ell=0}^{N+1} x_\ell \sum_{k=1}^{\infty} C_{(\ell,k)} k^{2n} \\
&= \sum_{\ell=0}^{N+1} x_\ell v_\ell[n] \\
&= u[n] \\
&= 0
\end{align}$$
The constructed sequence $a$ is fast-decaying, by virtue of being a linear combination of the fast-decaying $C_\ell$, and is therefore subject to the analysis in the first half of my answer. Combined with the computation above, it follows that the corresponding $A_j = o(j^{-4-4N})$, so we can have $A_j$ decaying to $0$ arbitrarily quickly within the polynomial-like hierarchy. This linear algebra doesn't extend to making $A_j$ decay at a super-polynomial rate; establishing such examples would likely need much more delicate analysis. However, the sheer abundance of solutions suggests that many such cases probably do exist.
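To make the construction concrete, here is a short sketch with the assumed choice $C_{(\ell,k)}=\delta_{\ell+1,k}$ (the Kronecker deltas shifted so the support starts at $k=1$). In that case the linear system is a Vandermonde system in the nodes $t_\ell=(\ell+1)^2$, and the solution for $u$ with $u[n]=0$ for $n\le N$ and $u[N+1]=1$ is the classical divided-difference weights $x_\ell = 1/\prod_{m\neq\ell}(t_\ell-t_m)$:

```python
# Build a finitely supported sequence a_k with sum_k a_k k^{2n} = 0 for all
# 0 <= n <= N (and = 1 for n = N+1), via C_{(l,k)} = delta_{l+1,k}.
# The Vandermonde system in nodes t_l = (l+1)^2 is solved in closed form by
# the divided-difference weights x_l = 1 / prod_{m != l} (t_l - t_m).
from fractions import Fraction
from math import prod

N = 3
t = [(l + 1) ** 2 for l in range(N + 2)]           # nodes k^2 for k = 1..N+2
a = [Fraction(1, prod(t[l] - t[m] for m in range(N + 2) if m != l))
     for l in range(N + 2)]                        # a_k = x_{k-1}

# Moment conditions: sum_k a_k k^{2n} = 0 for n <= N, and = 1 for n = N+1.
moments = [sum(a[k - 1] * Fraction(k) ** (2 * n) for k in range(1, N + 3))
           for n in range(N + 2)]
print([int(m) for m in moments])  # [0, 0, 0, 0, 1]

# Decay check: j^{4(N+2)} * A_j approaches (-1)^{N+1} = 1, so in particular
# A_j = o(j^{-4(N+1)}), as claimed.
def A(j):
    return sum(a[k - 1] / (k ** 2 + Fraction(j) ** 4) for k in range(1, N + 3))

for j in (5, 10, 20):
    print(j, float(A(j) * j ** (4 * (N + 2))))
```

The delta choice is only for illustration; any other fast-decaying, linearly independent family $C_\ell$ would require actually solving the linear system, but the conclusion is the same.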