
Let us consider two sequences $(a_n)$ and $(b_n)$ of nonnegative real numbers such that: $$\sum_{n=0}^{+\infty} a_n=1$$ $$\sum_{n=0}^{+\infty} n\,a_n=+\infty$$ $$\sum_{n=0}^{+\infty} b_n<+\infty$$ and define a sequence $(u_n)$ by the recurrence formula: $$\forall n\in\mathbb{N},\,u_n=b_n+\sum_{k=0}^na_{n-k}\,u_k$$

The sequence $(u_n)$ is well defined and has nonnegative terms.
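For concreteness, here is a small numerical sketch (not part of the original question) that computes $u_n$ from the recurrence for one admissible choice of the data: $a_n = (6/\pi^2)/n^2$ for $n \geq 1$, so that $\sum a_n = 1$ while $\sum n\,a_n = +\infty$, and $b_n = 2^{-(n+1)}$.

```python
import math

N = 200  # truncation level, chosen arbitrarily for illustration

a = [0.0] + [(6 / math.pi**2) / n**2 for n in range(1, N + 1)]
b = [0.5 ** (n + 1) for n in range(N + 1)]

u = []
for n in range(N + 1):
    # solve u_n (1 - a_0) = b_n + sum_{k=0}^{n-1} a_{n-k} u_k,
    # which also works when a_0 > 0
    u.append((b[n] + sum(a[n - k] * u[k] for k in range(n))) / (1 - a[0]))

print(u[0], u[1], u[10])
```

Each $u_n$ is clearly nonnegative, and for this example the terms stay below the bound $u_0 + \frac{1}{1-a_0}\sum_{k\ge1} b_k$ derived further down.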

I know how to prove that:

  • $(u_n)$ is bounded (see below)
  • if $(u_n)$ converges, then $\displaystyle{\lim_{n\to+\infty}u_n=0}$

My first question: I am looking for an explicit example where the sequence $(u_n)$ diverges.

My second question: how can one prove that if $a_1>0$, then $\displaystyle{\lim_{n\to\infty}u_n=0}$?


Proof that $(u_n)$ is bounded:

For all $n\in\mathbb{N}$, let $M_n=\max\{u_k;0\le k\le n\}$.

It is necessary that $a_0<1$, since otherwise we would have $a_n=0$ for all $n>0$ and the series $\sum_{n\ge0}n\,a_n$ would converge.

If $n\ge1$, then since $\sum_{k=1}^n a_k\le1-a_0$: $$u_n=\frac{1}{1-a_0}\left(b_n+\sum_{k=0}^{n-1}a_{n-k}u_k\right)\le\frac{1}{1-a_0}\left(b_n+M_{n-1}\sum_{k=1}^n a_k\right)\le\frac{1}{1-a_0}\left(b_n+(1-a_0)M_{n-1}\right)$$

Hence :

$$u_n\le\frac{b_n}{1-a_0}+M_{n-1}$$

Since $b_n\ge0$, the inequality $M_{n-1}\le\frac{b_n}{1-a_0}+M_{n-1}$ also holds, so we conclude that:

$$M_n\le\frac{b_n}{1-a_0}+M_{n-1}$$

Summing these inequalities from $1$ to $n$ (the $M_k$ terms telescope), we get:

$$M_n\le M_0+\frac{1}{1-a_0}\sum_{k=1}^nb_k\le u_0+\frac{1}{1-a_0}\sum_{k=1}^{+\infty}b_k$$
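As a sanity check (my own, with an arbitrarily chosen example where $a_0 > 0$), the final bound can be verified numerically:

```python
import math

N = 300
a0 = 0.3
c = (1 - a0) * 6 / math.pi**2
a = [a0] + [c / n**2 for n in range(1, N + 1)]   # sum_{n>=1} a_n <= 1 - a_0
b = [1.0 / (n + 1) ** 3 for n in range(N + 1)]   # summable

u = []
for n in range(N + 1):
    u.append((b[n] + sum(a[n - k] * u[k] for k in range(n))) / (1 - a0))

M = max(u)
bound = u[0] + sum(b[1:]) / (1 - a0)
print(M, bound)  # M should not exceed bound
```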

Adren

2 Answers


This is not a complete answer, but it shows that the sequence $b$ is irrelevant. Perhaps somebody can continue from here...

When $u, v \in \mathbb{R}^{\mathbb{N}}$, their convolution product $u \ast v \in \mathbb{R}^{\mathbb{N}}$ is defined by:
$$ \forall n\in \mathbb{N},\quad (u\ast v)_n=\sum_{k=0}^{n} u_k v_{n-k} $$ It is easy to show that the operation $\ast$ is commutative and distributive with respect to addition in $\mathbb{R}^{\mathbb{N}}$.

We also see that the sequence $e$ defined by $e_n = \delta_{0, n}$ satisfies:
$$ \forall u\in \mathbb{R}^{\mathbb{N}},\quad e\ast u=u\ast e=u $$ Equipped with the two operations $+$ and $\ast$, $\mathbb{R}^{\mathbb{N}}$ forms a commutative ring.
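These ring axioms are easy to test on truncated (finite-prefix) sequences; the helper `conv` below is my own naming, and integer data keeps the equality checks exact:

```python
def conv(u, v):
    """Truncated convolution product of two sequence prefixes."""
    n = min(len(u), len(v))
    return [sum(u[k] * v[j - k] for k in range(j + 1)) for j in range(n)]

e = [1] + [0] * 9                  # identity sequence e_n = delta_{0,n}
u = list(range(10))
v = [n * n + 1 for n in range(10)]

print(conv(u, v) == conv(v, u))    # commutativity on the prefix
print(conv(e, u) == u)             # e is the unit
```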

The relation:
$$ \forall n\in \mathbb{N},\quad u_n=b_n+\sum_{k=0}^{n} a_k u_{n-k} $$ can be rewritten as $u = b + a \ast u$, or equivalently,
$$ (e - a) \ast u = b. $$ If the sequence $e - a$ is invertible, then we obtain:
$$ u = (e - a)^{-1} \ast b. $$ But this is indeed the case: since $(e-a)_0 = 1 - a_0 \neq 0$, the relations below determine a unique sequence $a'$ coefficient by coefficient, and the OP's boundedness argument applied with $b = e$ shows that $a'$ is bounded:
$$ a' = e + a \ast a', \quad \text{or equivalently,} \quad (e - a) \ast a' = e. $$ It remains to show that under the assumption $a_1 > 0$, the sequence $a' = (e - a)^{-1}$ converges to $0$; then the sequence $u = a' \ast b$ will also converge to $0$.
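A possible numerical illustration (my sketch, using the heavy-tailed example $a_n = (6/\pi^2)/n^2$ for $n \geq 1$, $a_0 = 0$): the coefficients of $a'$ can be generated directly from $a'_n = e_n + \sum_{k=1}^n a_k\,a'_{n-k}$, and they stay in $[0,1]$, consistent with the boundedness claim.

```python
import math

N = 1000
a = [0.0] + [(6 / math.pi**2) / n**2 for n in range(1, N + 1)]

ap = [1.0]  # a'_0 = 1, since (e - a)_0 a'_0 = 1 and a_0 = 0
for n in range(1, N + 1):
    # a'_n = e_n + (a * a')_n = sum_{k=1}^{n} a_k a'_{n-k}  for n >= 1
    ap.append(sum(a[k] * ap[n - k] for k in range(1, n + 1)))

print(ap[1], ap[10], ap[1000])
```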

Marcin

Before delving into the answer, we reformulate the problem in terms of generating functions. Let $A(x)$, $B(x)$, and $U(x)$ be the generating functions of $(a_n)$, $(b_n)$, and $(u_n)$, respectively:

$$ A(x) = \sum_{n=0}^{\infty} a_n x^n, \qquad B(x) = \sum_{n=0}^{\infty} b_n x^n, \qquad U(x) = \sum_{n=0}^{\infty} u_n x^n. $$

Then the assumptions are equivalent to:

$$ A(1) = 1, \qquad A'(1) = \infty, \qquad B(1) < \infty. \tag{1}\label{e:cond} $$

Also, the recurrence relation is equivalent to:

$$ U(x) = \frac{B(x)}{1 - A(x)} = B(x) \sum_{n=0}^{\infty} A(x)^n. \tag{2}\label{e:gen} $$
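This identity can be checked numerically at a point $0 < x < 1$ with truncated series (a sketch of mine; the specific $(a_n)$ and $(b_n)$ are arbitrary admissible choices):

```python
import math

N = 400
a = [0.0] + [(6 / math.pi**2) / n**2 for n in range(1, N + 1)]
b = [0.5 ** (n + 1) for n in range(N + 1)]

u = []
for n in range(N + 1):  # a_0 = 0, so the recurrence gives u_n directly
    u.append(b[n] + sum(a[n - k] * u[k] for k in range(n)))

x = 0.5
A = sum(a[n] * x**n for n in range(N + 1))
B = sum(b[n] * x**n for n in range(N + 1))
U = sum(u[n] * x**n for n in range(N + 1))
print(U * (1 - A), B)  # the two values should agree up to truncation error
```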

Step 1. Simplifications

  1. As OP has already observed, $\eqref{e:cond}$ implies that we must have $a_0 < 1$.

  2. If $b_n = 0$ identically, then $u_n = 0$ for all $n \geq 0$ and there is nothing interesting here. So, it suffices to consider the case $B(1) > 0$, and we do so.

  3. Rearranging the recurrence relation, we get $ u_n = \frac{b_n}{1 - a_0} + \sum_{k=0}^{n-1} \frac{a_{n-k}}{1 - a_0} u_k $. This shows that we may assume $a_0 = 0$.

  4. Dividing both sides of the recurrence relation by $B(1)$, we may assume that $B(1) = 1$.

Summarizing, we may assume without loss of generality that

Assumption. $a_0 = 0$ and $\sum_{n=0}^{\infty} b_n = 1$.

Step 2. Probabilistic Interpretation

Let $\xi, X_1, X_2, \ldots$ be independent random variables such that

$$ \mathbb{P}(\xi = i) = b_i \qquad\text{and} \qquad \mathbb{P}(X_k = i) = a_i. $$

Then $A(x) = \mathbb{E}[x^{X_k}]$ and $B(x) = \mathbb{E}[x^{\xi}]$, and so the relation $\eqref{e:gen}$ can be rephrased as:

$$ U(x) = \mathbb{E}\left[ \sum_{k=0}^{\infty} x^{\xi + X_1 + \cdots + X_k} \right]. $$

Since $a_0 = 0$, we know that $S_k = \xi + X_1 + \cdots + X_k$ is strictly increasing. Consequently, for each $n$, there is at most one $k$ for which $S_k = n$ holds. This leads to the following probabilistic interpretation of $u_n$:

$$ u_n = \mathbb{P}(S_k = n \text{ for some } k \geq 0). $$
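This interpretation is easy to test by Monte Carlo (my sketch; the example below has finite support, which violates $\sum n\,a_n = \infty$, but the identity $u_n = \mathbb{P}(S_k = n \text{ for some } k)$ only needs $a_0 = 0$):

```python
import random

random.seed(0)
a = [0.0, 0.5, 0.5]   # X in {1, 2} with probability 1/2 each
b = [0.3, 0.7]        # xi in {0, 1}
N = 6

u = []
for n in range(N + 1):
    bn = b[n] if n < len(b) else 0.0
    u.append(bn + sum((a[n - k] if n - k < len(a) else 0.0) * u[k]
                      for k in range(n)))

target = 5
trials = 100_000
hits = 0
for _ in range(trials):
    s = random.choices(range(len(b)), weights=b)[0]      # draw xi
    while s < target:
        s += random.choices([1, 2], weights=[0.5, 0.5])[0]
    hits += s == target                                   # S_k hit target?
print(u[target], hits / trials)
```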

Then by Blackwell's renewal theorem, it follows that:

Theorem. $u_n$ converges to $0$ as $ n \to \infty$.

In other words, there is no counter-example to OP's question! Here, we outline a self-contained proof of the above theorem.

Proof. We first assume $\xi = 0$, or equivalently, $B(x) = 1$. Let $g$ be the span of the distribution $(a_n)$, meaning that $g$ is the GCD of all $n$'s for which $a_n \neq 0$. Also, for each $n \geq 0$, define $N_n$ by

$$ N_n = \min \{ k \geq 0 : S_k \geq gn \}, $$

and then define the process $(Y_n)_{n\geq 0}$ by

$$ Y_n = S_{N_n} - gn. $$

Claim. $(Y_n)$ is an irreducible aperiodic Markov chain on $g \mathbb{Z}_{\geq 0}$.

Proof of Claim.

  • If $Y_n \geq g$, then $S_{N_n} \geq g(n+1)$ and hence $N_{n+1} = N_n$. This gives $Y_{n+1} = Y_n - g$.

  • Now suppose $Y_n = 0$. Conditioned on this event, we have $S_{N_n} = gn < g(n+1)$, and hence $N_{n+1} = N_n + 1$. Moreover, for each $k \geq 0$, the event $\{N_n = k\}$ and $X_{k+1}$ are conditionally independent given $Y_n = 0$. Consequently, conditioned on $Y_n = 0$, the next value $Y_{n+1} = X_{N_n + 1} - g$ is distributed as $X_1 - g$.

  • Since the support of $X_1$ is unbounded (otherwise $\sum_n n\,a_n$ would converge), the chain can jump from $0$ to arbitrarily large sites and then descends through every smaller site of $g\mathbb{Z}_{\geq 0}$; hence $(Y_n)$ is irreducible. Moreover, since $g$ is the GCD of the support of $X_1$, the return times to $0$ have GCD $1$, so $(Y_n)$ is aperiodic.

Now the key observation is the following equality:

$$ u_{gn} = \mathbb{P}(Y_n = 0). $$

A standard result on irreducible aperiodic Markov chains shows that $\mathbb{P}(Y_n = 0) \to \frac{g}{\mathbb{E}[X_1]} = 0$ as $n \to \infty$, since the mean return time of $(Y_n)$ to $0$ is $\mathbb{E}[X_1]/g = \infty$. Since $u_n = 0$ for $g \nmid n$, it therefore follows that $u_n \to 0$ as $n \to \infty$.
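The identity $u_{gn} = \mathbb{P}(Y_n = 0)$ can also be checked by simulating the chain directly (my sketch, for a toy case with span $g = 1$; the finite support here is only for the check, not an assumption of the theorem):

```python
import random

random.seed(1)
a1 = 0.5              # P(X = 1); P(X = 2) = 1 - a1
N = 6

# renewal sequence with b = e (i.e. xi = 0): u_n = a_1 u_{n-1} + a_2 u_{n-2}
u = [1.0]
for n in range(1, N + 1):
    u.append(a1 * u[n - 1] + ((1 - a1) * u[n - 2] if n >= 2 else 0.0))

# chain Y: Y_0 = 0; while positive it steps down by g = 1, and from 0 it
# jumps to X - 1
trials = 100_000
hits = 0
for _ in range(trials):
    y = 0
    for _ in range(N):
        y = y - 1 if y > 0 else (0 if random.random() < a1 else 1)
    hits += y == 0
print(u[N], hits / trials)  # estimate of P(Y_N = 0) vs u_N
```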

For the general distribution $\xi$, the relation

$$ u_n = \sum_{k=0}^{n} b_k \mathbb{P}(X_1 + \cdots + X_j = n-k \text{ for some } j \geq 0) $$

can be used to show that we still have $u_n \to 0$ as $n \to \infty$.

Sangchul Lee