
I would love some feedback on this proof. I've spent a long time working on it, and at this point my mind is too jumbled to tell whether it can be made more efficient. I feel confident in it, but I will note the areas where I felt less sure. Any feedback would be appreciated!

Suppose $X$ is a Banach space, and $(x_{mn})$ is a doubly indexed sequence in $X$ such that $$\sum_{m=1}^{\infty}\sum_{n=1}^{\infty}||x_{mn}||<\infty.$$ Prove that $$\sum_{m=1}^{\infty}\Bigg(\sum_{n=1}^{\infty}x_{mn}\Bigg)=\sum_{n=1}^{\infty}\Bigg(\sum_{m=1}^{\infty}x_{mn}\Bigg).$$

We first note that $$\sum_{m=1}^{\infty}\sum_{n=1}^{\infty}||x_{mn}||=\sum_{m=1}^{\infty}\Bigg(\sum_{n=1}^{\infty}||x_{mn}||\Bigg)=\sum_{n=1}^{\infty}\Bigg(\sum_{m=1}^{\infty}||x_{mn}||\Bigg),$$ which implies that \begin{align} \sum_{m=1}^M\Bigg(\sum_{n=1}^{\infty}||x_{mn}||\Bigg)&<\infty,\hspace{5 mm}\forall M\in\mathbb{N}\hspace{5 mm}(1)\\ \sum_{n=1}^N\Bigg(\sum_{m=1}^{\infty}||x_{mn}||\Bigg)&<\infty,\hspace{5 mm}\forall N\in\mathbb{N}\hspace{5 mm}(2) \end{align}

Then, for any $\epsilon>0$ we may find a $K$ so that \begin{align*} \bigg|\sum_{m=1}^{m_1}\bigg(\sum_{n=1}^{\infty}||x_{mn}||\bigg)-\sum_{m=1}^{m_2}\bigg(\sum_{n=1}^{\infty}||x_{mn}||\bigg)\bigg|&<\frac{\epsilon}{2}\\ \bigg|\sum_{n=1}^{n_1}\bigg(\sum_{m=1}^{\infty}||x_{mn}||\bigg)-\sum_{n=1}^{n_2}\bigg(\sum_{m=1}^{\infty}||x_{mn}||\bigg)\bigg|&<\frac{\epsilon}{2} \end{align*} whenever $m_1,m_2,n_1,n_2>K$.

We may use this to show that the partial sums $\sum_{m=1}^M\sum_{n=1}^Nx_{mn}$ form a Cauchy (double) sequence. Without loss of generality there are two cases to consider: either $m_1>m_2$ and $n_1>n_2$, or $m_1>m_2$ and $n_2>n_1$. In the first case we see that \begin{align*} \bigg|\bigg|\sum_{m=1}^{m_1}\sum_{n=1}^{n_1}x_{mn}-\sum_{m=1}^{m_2}\sum_{n=1}^{n_2}x_{mn}\bigg|\bigg|&=\bigg|\bigg|\sum_{m=m_2+1}^{m_1}\sum_{n=1}^{n_1}x_{mn}+\sum_{m=1}^{m_2}\sum_{n=n_2+1}^{n_1}x_{mn}\bigg|\bigg|\\ &\leq \sum_{m=m_2+1}^{m_1}\sum_{n=1}^{n_1}||x_{mn}||+\sum_{m=1}^{m_2}\sum_{n=n_2+1}^{n_1}||x_{mn}||\\ &\leq\sum_{m=m_2+1}^{m_1}\sum_{n=1}^{\infty}||x_{mn}||+\sum_{n=n_2+1}^{n_1}\sum_{m=1}^{\infty}||x_{mn}||\\ &<\epsilon, \end{align*} while the second case gives us \begin{align*} \bigg|\bigg|\sum_{m=1}^{m_1}\sum_{n=1}^{n_1}x_{mn}-\sum_{m=1}^{m_2}\sum_{n=1}^{n_2}x_{mn}\bigg|\bigg|&=\bigg|\bigg|\sum_{m=m_2+1}^{m_1}\sum_{n=1}^{n_1}x_{mn}-\sum_{m=1}^{m_2}\sum_{n=n_1+1}^{n_2}x_{mn}\bigg|\bigg|\\ &\leq \sum_{m=m_2+1}^{m_1}\sum_{n=1}^{n_1}||x_{mn}||+\sum_{n=n_1+1}^{n_2}\sum_{m=1}^{m_2}||x_{mn}||\\ &\leq \sum_{m=m_2+1}^{m_1}\sum_{n=1}^{\infty}||x_{mn}||+\sum_{n=n_1+1}^{n_2}\sum_{m=1}^{\infty}||x_{mn}||\\ &<\epsilon. \end{align*}

Since these inequalities hold with $m_1,m_2,n_1,n_2$ chosen independently, we are able to take the limits independently in order to arrive at the desired equality. This is made rigorous by the following argument (this is the area where I am least certain; I believe it is valid but am not fully confident). Given $\epsilon>0$ we find $K_1$ so that $$\bigg|\bigg|\sum_{m=1}^{m_1}\sum_{n=1}^{n_1}x_{mn}-\sum_{m=1}^{m_2}\sum_{n=1}^{n_2}x_{mn}\bigg|\bigg|<\frac{\epsilon}{2}$$ whenever $m_1,m_2,n_1,n_2>K_1$. We then find $K_2$ such that $$\bigg|\bigg|\sum_{m=1}^{m_2}\sum_{n=1}^{\infty}x_{mn}-\sum_{m=1}^{m_2}\sum_{n=1}^{n_2}x_{mn}\bigg|\bigg|<\frac{\epsilon}{2}$$ whenever $n_2>K_2$, which is guaranteed by $(1)$. Then, choosing $K>\max(K_1,K_2)$, it follows that \begin{align*} \bigg|\bigg|\sum_{m=1}^{m_1}\sum_{n=1}^{n_1}x_{mn}-\sum_{m=1}^{m_2}\sum_{n=1}^{\infty}x_{mn}\bigg|\bigg|&\leq \bigg|\bigg|\sum_{m=1}^{m_1}\sum_{n=1}^{n_1}x_{mn}-\sum_{m=1}^{m_2}\sum_{n=1}^{n_2}x_{mn}\bigg|\bigg|\\ &\hspace{5 mm}+\bigg|\bigg|\sum_{m=1}^{m_2}\sum_{n=1}^{\infty}x_{mn}-\sum_{m=1}^{m_2}\sum_{n=1}^{n_2}x_{mn}\bigg|\bigg|\\ &<\epsilon \end{align*} whenever $m_1,m_2,n_1>K$. We may repeat this process to arrive at $$\bigg|\bigg|\sum_{m=1}^{\infty}\bigg(\sum_{n=1}^{\infty}x_{mn}\bigg)-\sum_{n=1}^{\infty}\bigg(\sum_{m=1}^{\infty}x_{mn}\bigg)\bigg|\bigg|<\epsilon$$ for all $\epsilon>0$, which gives the desired equality.

c_gnar

1 Answer


To be clear at the outset, the convergence of a double series $\sum_{m,n} x_{mn} = \sum_m \sum_n x_{mn}$ means there exists $S$ such that for any $\epsilon > 0$, there exists $N(\epsilon) \in\mathbb{N}$ such that $\|\sum_{m=1}^M\sum_{n=1}^N x_{mn} - S\| < \epsilon$ for all $M, N > N(\epsilon)$. The convergence of a double series does not necessarily follow from the convergence of an iterated series $\sum_{m=1}^\infty\left(\sum_{n=1}^\infty x_{mn}\right)$, and vice versa.

With this definition, it is true that convergence of the double series $\sum_{m=1}^\infty\sum_{n=1}^\infty \|x_{mn}\|$ implies convergence of the double series $\sum_{m=1}^\infty\sum_{n=1}^\infty x_{mn}$ and equality of the iterated series

$$\sum_{m=1}^\infty\sum_{n=1}^\infty x_{mn} = \sum_{m=1}^\infty\left(\sum_{n=1}^\infty x_{mn}\right) = \sum_{n=1}^\infty\left(\sum_{m=1}^\infty x_{mn}\right)$$

This can be proved in a series of steps, some of which (but not all) you did correctly and/or clearly.

(1) As you stated but did not prove,

$$\sum_{m=1}^\infty\sum_{n=1}^\infty \|x_{mn}\| = \sum_{m=1}^\infty\left(\sum_{n=1}^\infty \|x_{mn}\|\right) = \sum_{n=1}^\infty\left(\sum_{m=1}^\infty \|x_{mn}\|\right),$$

holds when the double series on the LHS converges. This follows from the nonnegativity of $\|x_{mn}\|$ and a straightforward monotone convergence argument.
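
One way to fill in that argument (a sketch): write $\sigma_{MN} = \sum_{m=1}^M\sum_{n=1}^N \|x_{mn}\|$ for the rectangular partial sums and $S^*$ for the value of the double series. Since the terms are nonnegative, $\sigma_{MN}$ is nondecreasing in each index and $S^* = \sup_{M,N}\sigma_{MN}$, so for each fixed $M$

$$\sum_{m=1}^M\left(\sum_{n=1}^\infty \|x_{mn}\|\right) = \lim_{N\to\infty}\sigma_{MN} \leqslant S^*,$$

and letting $M \to \infty$ gives $\sum_{m=1}^\infty\left(\sum_{n=1}^\infty \|x_{mn}\|\right) \leqslant S^*$. Conversely, every $\sigma_{MN}$ is dominated by this iterated sum, so $S^* = \sup_{M,N}\sigma_{MN} \leqslant \sum_{m=1}^\infty\left(\sum_{n=1}^\infty \|x_{mn}\|\right)$, giving equality. Exchanging the roles of $m$ and $n$ handles the other iterated series.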

(2) It then follows, as you correctly showed, that the double series $\sum_{m,n} x_{mn}$ converges by the Cauchy criterion and the completeness of $X$, with

$$\sum_{m=1}^\infty\sum_{n=1}^\infty x_{mn} = S$$
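
Concretely (restating the estimate from the question in terms of the rectangular partial sums $S_{MN} := \sum_{m=1}^M\sum_{n=1}^N x_{mn}$), for all $M_1, M_2, N_1, N_2 > K$ one has

$$\|S_{M_1N_1} - S_{M_2N_2}\| \leqslant \sum_{m=K+1}^{\infty}\sum_{n=1}^{\infty}\|x_{mn}\| + \sum_{n=K+1}^{\infty}\sum_{m=1}^{\infty}\|x_{mn}\|,$$

and both tails tend to $0$ as $K \to \infty$ because the iterated series in (1) converge; completeness of $X$ then yields the limit $S$.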

(3) The series $\sum_{n=1}^\infty x_{mn}$ and $\sum_{m=1}^\infty x_{mn}$ converge for each $m$ and for each $n$, respectively. This follows again by a Cauchy argument using, for example,

$$\left\|\sum_{n= n_1+1}^{n_2} x_{mn} \right\| \leqslant \sum_{n= n_1+1}^{n_2} \|x_{mn}\|$$
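
To spell out why the right-hand side is small: by (1), for each fixed $m$ we have $\sum_{n=1}^{\infty}\|x_{mn}\| \leqslant \sum_{m=1}^\infty\left(\sum_{n=1}^{\infty}\|x_{mn}\|\right) < \infty$, so for $n_2 \geqslant n_1$

$$\sum_{n= n_1+1}^{n_2} \|x_{mn}\| \leqslant \sum_{n= n_1+1}^{\infty} \|x_{mn}\| \underset{n_1 \to \infty}\longrightarrow 0,$$

and completeness of $X$ gives convergence of $\sum_{n=1}^\infty x_{mn}$; the argument for $\sum_{m=1}^\infty x_{mn}$ with $n$ fixed is symmetric.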

We can finish (easily) as follows. From the convergence established in (2), for any $\epsilon > 0$, there exists $K(\epsilon) \in \mathbb{N}$ such that for all $M,N > K(\epsilon)$,

$$\|S_{MN} - S\| := \left\|\sum_{m=1}^M\sum_{n=1}^N x_{mn} -S\right\| < \epsilon $$

From (3), the limit $T_M$ exists for every $M$ (in particular for all $M > K(\epsilon)$), where

$$T_M = \lim_{N \to \infty}S_{MN} = \sum_{m=1}^M\left(\sum_{n=1}^\infty x_{mn}\right)$$

Finally, we have for all $M > K(\epsilon)$,

$$\left\|\sum_{m=1}^M\left(\sum_{n=1}^\infty x_{mn}\right) -S\right\|= \|\lim_{N \to \infty} S_{MN} - S \| = \lim_{N \to \infty} \|S_{MN} -S\| \leqslant \epsilon,$$

which implies that

$$\sum_{m=1}^\infty\left(\sum_{n=1}^\infty x_{mn}\right) =S = \sum_{m=1}^\infty\sum_{n=1}^\infty x_{mn} $$

The same result can be proved for the other iterated series in a similar way.


The key step in the last part of the proof is the following general result. For any convergent sequence $a_n$ in a Banach space such that $\|a_n - C\| < \epsilon$ for all sufficiently large $n$, it follows that $\|\lim_{n \to \infty}a_n - C\|= \lim_{n \to \infty}\|a_n - C\|\leqslant \epsilon$.

For the proof, let $\lim_{n \to \infty} a_n = L$. By the reverse triangle inequality,

$$|\, \|a_n - C\| - \|L - C\|\, |\leqslant \|(a_n - C)- (L-C) \| = \|a_n - L\| \underset{n \to \infty}\longrightarrow 0$$

Hence, $\lim_{n \to \infty} \|a_n -C\| = \|L-C\| = \|\lim_{n\to \infty}a_n -C\|$.

Given that $\|a_n - C\| < \epsilon$ for all $n > N_1$, assume that $\|L-C\| > \epsilon$. Take $\eta = (\|L-C\|+\epsilon)/2$. Since $\epsilon < \eta < \|L-C\|$, there exists $N_2$ such that for all $n > N_2$ we have

$$\|a_n - C\| > \eta > \epsilon,$$

which is a contradiction for $n > \max (N_1,N_2)$.

Therefore, $\|\lim_{n \to \infty} a_n - C\|= \lim_{n \to \infty}\|a_n - C\| \leqslant \epsilon$.

RRL
  • Thanks RRL, this is helpful. Your comment about K_2 makes sense. I felt funny about it and now I know why. A few comments/questions. 1) I don't see how the convergence of a double series does not imply the convergence of the iterated series. 2) I'm not actually seeing how to use the triangle inequality to show that last bit that you mention. Do you mind giving me a jump start there? 3) can you recommend a resource to understand some of these nuances of double/iterated series? – c_gnar Mar 28 '21 at 22:16
  • Would it be: $||a_n||<\epsilon+C$, and use $\lim_{n\to\infty}||a_n||=||\lim_{n\to\infty}a_n||$? – c_gnar Mar 28 '21 at 22:34
  • @c_gnar: You're welcome. (2) I finished off the proof for you. (1) Double sequences/series may converge without convergence of the iterated sequences/series. As an example consider the double sequence $x_{mn} = (-1)^{m+n} \left(\frac{1}{m} + \frac{1}{n} \right)$. Here the double limit is $0$, but $x_{mn}$ oscillates between $\pm \frac{(-1)^m}{m}$ as $n \to \infty$ and $\lim_{n \to \infty} x_{mn}$ DNE. – RRL Mar 28 '21 at 23:53
  • The convergence of a double series with nonnegative terms does imply the convergence and equality of the iterated series -- and, in fact, we used that above. It is in the case of conditional convergence, when the terms oscillate in sign, that this is not guaranteed. – RRL Mar 28 '21 at 23:57
  • (3) The best resources for the topic are good analysis books written long ago where the authors bothered to write about this interesting topic. Elements of Real Analysis (2nd. edition) by Bartle, The Theory of Functions by Titchmarsh, An Introduction to the Theory of Infinite Series by Bromwich, etc. – RRL Mar 29 '21 at 00:02
  • Awesome thanks! It seems sometimes one has to dig through history to get some basic questions answered! Ok, so I see where you are coming from (I believe). I have two questions. 1) Is the following correct: if $\sum x_{mn}$ converges and one of the iterated sums converges, then they are equal? I ask because, while we use absolute convergence to show that the iterated sums converge, we don't seem to require absolute convergence in showing they converge to the double sum specifically. 2) Can I use the same limit as you, but for the Cauchy definition that I had before? – c_gnar Mar 29 '21 at 20:14
  • The proof that you graciously provided me (the one showing $||L-C||<\epsilon$) seems to work equally well for $||S_{m_1n_1}-S_{m_2n_2}||$, where $S_{mn}$ is the partial sum, just taking the limits with respect to $n_i$ independently. – c_gnar Mar 29 '21 at 20:15
  • RRL, first off, I thoroughly enjoyed reading Hyslop's section on double series. It offered some major clarifications. Thanks for the references. Also, in case you are interested, I wanted to mention to you that I believe my $K_2$ argument is valid after all (albeit painful, I agree). The reason is, while I use $m_2$ to find $K_2$, only $n_2>K_2$. $m_1,m_2,n_1$ need only be greater than $K_1$, and $n_2$ is absent from the final inequality. Unfortunately I don't have the space in comments to provide a complete argument. – c_gnar Mar 30 '21 at 16:01
  • @c_gnar: OK, I'll look at it again. I can see roughly why your argument is correct. As for your follow-up question, think of the double series as a double sequence of partial sums $S_{mn}$. If the double sequence converges and there exists $\lim_{n} S_{mn} $ for all $m$, then $\lim_{m,n} S_{mn} = \lim_m \lim_n S_{mn}$. – RRL Mar 30 '21 at 16:25
  • @RRL I have a question: Is there a counterexample where the iterated series $\sum_{n=1}^\infty \sum_{k=1}^\infty x_{n,k}$ with $x_{n,k} \geq 0$ converges, but the swapped iterated sum $\sum_{k=1}^\infty \sum_{n=1}^\infty x_{n,k}$ does not? – azimut Jan 17 '24 at 20:28
  • @azimut: If all the terms $x_{nk}$ are nonnegative, then the double series and both iterated series either all converge (and are equal) or all diverge. If you allow the terms to change sign, then you can find examples where the iterated series do not converge to the same value or where one converges and the other does not. – RRL Jan 17 '24 at 23:03
  • @RRL ah thanks. good to know. – azimut Jan 17 '24 at 23:15