Question:

Let $|a|<1$ and let $\left(x_k\right)_{k\ge 1}$ be a sequence that converges to zero. With $y_0$ arbitrary, define a sequence $(y_k)_{k\ge 0}$ by the recurrence

$y_k=x_k+ay_{k-1}$.

Determine whether $y_k\to 0$.

My attempt: It can be shown that $y_k=x_k+ax_{k-1}+a^2x_{k-2}+\dotsb+a^{k-1}x_1+a^ky_0$ for each $k\ge 1$. The first and last terms of the RHS go to zero, but so far I have been unable to show whether the middle sum $ax_{k-1}+a^2x_{k-2}+\dotsb+a^{k-1}x_1$ converges to zero. Help is appreciated.
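Before attacking the bound, a quick numerical experiment (my own illustration, with assumed sample values $a=-0.9$, $x_k=1/k$, $y_0=5$, none of which come from the problem) suggests the answer is yes:

```python
# Numerical sanity check (not a proof): iterate y_k = x_k + a*y_{k-1}
# for a sample |a| < 1 and a null sequence x_k, and watch |y_k| shrink.
a = -0.9               # assumed sample value with |a| < 1
y = 5.0                # assumed arbitrary y_0
for k in range(1, 20001):
    x_k = 1.0 / k      # assumed sample null sequence
    y = x_k + a * y
print(abs(y))          # close to zero after many iterations
```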

user149418
  • Given $\varepsilon > 0$, there is an $M$ such that $\lvert x_m\rvert \leqslant \frac{1}{2}(1 - \lvert a\rvert)\varepsilon$ for $m \geqslant M$. Does that give you an idea? – Daniel Fischer Nov 11 '19 at 22:26
  • @DanielFischer: I cannot settle the terms $x_m$ for $1\le m<M$. – user149418 Nov 11 '19 at 23:12
  • I have a tentative answer below, using also the fact that $x_k$ is bounded, in order to deal with the terms you are mentioning. – dfnu Nov 12 '19 at 00:00

1 Answer


That's what I found so far. Do you think it may work?

As you wrote $$y_n = \sum_{k=0}^na^{n-k}x_k,$$ where $x_0=y_0$.

The sequence $(x_n)$ is bounded, so $|x_n| < M$ for all $n$ and some positive $M$.

Fix some $\varepsilon >0$. Since $x_n \to 0$, there is an $N$ such that for $k>N$ we have $$|x_k| < \varepsilon(1-|a|).$$

Then choose $n$ large enough (say $n>N_1$) so that $$|a^n|< \frac{\varepsilon |a^N|}{M(N+1)}.$$

We can write, for $n>\max(N,N_1)$,

\begin{eqnarray} |y_n| &=& \left|\sum_{k=0}^{N}a^{n-k}x_k + \sum_{k=N+1}^na^{n-k}x_k\right|\\ &\le& M\sum_{k=0}^{N}\left|a\right|^{n-k} + \varepsilon (1-|a|) \sum_{k=N+1}^n\left| a\right|^{n-k}\\ &\le& M(N+1)\left|a\right|^{n-N}+\varepsilon\\ &<& 2\varepsilon. \end{eqnarray}
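Not part of the proof, but the constants above can be instantiated numerically (with assumed sample data $a=1/2$, $x_k=1/(k+1)$, $M=1$, $y_0=x_0$, chosen only for illustration) to check the final bound $|y_n|<2\varepsilon$:

```python
# Instantiate the proof's constants for sample data (an illustration,
# not part of the argument): a = 0.5, x_k = 1/(k+1), so M = 1 works.
import math

a, eps, M = 0.5, 0.01, 1.0
x = lambda k: 1.0 / (k + 1)          # sample null sequence

# N: index past which |x_k| < eps * (1 - |a|)
N = 0
while x(N + 1) >= eps * (1 - abs(a)):
    N += 1

# N1: index past which M * (N+1) * |a|**(n-N) < eps
N1 = N + math.ceil(math.log(eps / (M * (N + 1))) / math.log(abs(a)))

# Iterate the recurrence and record the worst |y_n| for n > max(N, N1)
y, worst = x(0), 0.0
for k in range(1, 1000):
    y = x(k) + a * y
    if k > max(N, N1):
        worst = max(worst, abs(y))
print(worst < 2 * eps)   # the bound from the chain of inequalities
```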

dfnu
  • In the third-to-last line, you need $M\sum_{k = 0}^N \lvert a\rvert^{n-k}$ instead of your $M\Bigl\lvert \sum_{k = 0}^N a^{n-k}\Bigr\rvert$. Apart from that small mistake, this is the way. – Daniel Fischer Nov 12 '19 at 12:03
  • Is it correct that the second expression you wrote (that is, the one in my answer) is less than the first one by the triangle inequality?

    Also, a question arises (to me at least, who knows little). Suppose you have $$y_n = \sum_{k=0}^n a_{n-k}x_k,$$ where $(x_n)$ is a null sequence and $$\sum_n a_n$$ converges absolutely. Then $(y_n)$ is a null sequence, which can be proved using the same approach as above. But what happens if $$\sum_n a_n$$ converges non-absolutely? The second term in the inequality above can be bounded again (Cauchy criterion). But what about the first one?

    – dfnu Nov 12 '19 at 12:18
  • Yes, the second is less than or equal to the first by the triangle inequality. If $a \geqslant 0$ they are equal, but for $a < 0$ (or non-real $a$, everything also works for complex sequences) the inequality is strict. If $\sum a_n$ converges only conditionally, then we can't say much. $(y_n)$ may be a null sequence (for example when $\sum x_n$ converges absolutely), but it may also diverge (consider $a_n = x_n = (-1)^n/\log n$ [for $n \geqslant 2$]). And also the second term cannot be bounded, because unless the sequences are specified you must take the absolute value of every single term. – Daniel Fischer Nov 12 '19 at 12:39
  • For every given $(a_n)$ and $N$ you can find $(x_n)$ such that $$\sum_{k = 0}^N a_{N-k}x_k = \sum_{k = 0}^N \lvert a_{N-k}x_k\rvert.$$ (So you also need $$\varepsilon(1 - \lvert a\rvert) \sum_{k = N+1}^n \lvert a\rvert^{n-k}$$ in the third-to-last line, I overlooked that initially.) – Daniel Fischer Nov 12 '19 at 12:39
  • @DanielFischer, Ok, I'll just add an other line with the inequality, so that the consequentiality is clearer. I'll also analyze your examples. Thanks! – dfnu Nov 12 '19 at 12:42
  • The now fourth-to-last line needs to be removed, that expression can be smaller than $\lvert y_n\rvert$. Consider the case where $a < 0$ and $x_n$ has alternating sign. Then $$\lvert y_n\rvert = \sum_{k = 0}^n \lvert x_k\rvert \cdot \lvert a\rvert^{n-k}.$$ – Daniel Fischer Nov 12 '19 at 12:55
  • I request one clarification. I don't understand why the inequality involving $|a^n|$ holds for $n>N$, the same $N$ for which the inequality for $|x_k|$ is satisfied. I think this may not be true. To overcome this, suppose that for $n>N_1$, $|a^n|<\frac{\epsilon |a^N|}{M(N+1)}$. Then to use both inequalities we should choose $n> \max(N,N_1)=N_2$ (say). Then for $n>N_2$, how does the inequality for $|y_n|$ given in the second-to-last line (first part) need to be adjusted? – user149418 Nov 15 '19 at 20:17
  • @user149418, yes, of course. I wrote "choose $n$ large enough..." and I did not mean simply $n>N$, which only guarantees the bound on $|x_k|$, as you point out. – dfnu Nov 15 '19 at 20:20
  • please have a look at my comment. – user149418 Nov 15 '19 at 20:26
  • I was replying, but I am not sure I understand the second part of your comment correctly – dfnu Nov 15 '19 at 20:27
  • Once you set $N$, you then need $n$ greater than some other $N_2$, as you say, to have $$|a^n|< \frac{\varepsilon |a^N|}{M(N+1)}.$$ So now you have that, for $n>N_2$, you can bound the first term in the second-to-last line as shown. There is no need for any adjustment, I believe. But let me know if you need me to clarify more in the answer, of course! – dfnu Nov 15 '19 at 20:33
  • @user149418 I added the clarification you suggested, hope its clearer now. – dfnu Nov 15 '19 at 20:47
  • By the way, @user149418, I ended up writing the proof for the general case where you replace $a^n$ with any absolutely convergent series. A few notes I took here https://www.dfnu.xyz/en/exercises-and-dialogues/non-sequitur/ I then found out it is a special case of Mertens' theorem on the Cauchy product (see here https://en.wikipedia.org/wiki/Cauchy_product) – dfnu Nov 15 '19 at 20:51
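To make the divergent case from the comments concrete, here is a quick numerical look (my own illustration, not from the thread) at Daniel Fischer's counterexample $a_n = x_n = (-1)^n/\log n$ for $n\ge 2$: every product $a_{n-k}x_k$ carries the same sign $(-1)^n$, so the convolution $y_n=\sum_k a_{n-k}x_k$ grows rather than vanishing.

```python
# Numerical look at the conditionally convergent counterexample:
# a_n = x_n = (-1)^n / log(n) for n >= 2.  The convolution
# y_n = sum_{k=2}^{n-2} a_{n-k} x_k satisfies |y_n| -> infinity.
import math

def term(n):                 # a_n = x_n = (-1)^n / log(n)
    return (-1) ** n / math.log(n)

def y(n):                    # convolution, keeping both indices >= 2
    return sum(term(n - k) * term(k) for k in range(2, n - 1))

print(abs(y(100)), abs(y(1000)))   # |y_n| keeps growing with n
```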