Before delving into the answer, we reformulate the problem in terms of generating functions. Define $A(x)$, $B(x)$, and $U(x)$ as the generating functions of $(a_n)$, $(b_n)$, and $(u_n)$, respectively:
$$ A(x) = \sum_{n=0}^{\infty} a_n x^n, \qquad
B(x) = \sum_{n=0}^{\infty} b_n x^n, \qquad
U(x) = \sum_{n=0}^{\infty} u_n x^n. $$
Then the assumptions are equivalent to:
$$ A(1) = 1, \qquad A'(1) = \infty, \qquad B(1) < \infty. \tag{1}\label{e:cond} $$
Also, the recurrence relation is equivalent to:
$$ U(x) = \frac{B(x)}{1 - A(x)} = B(x) \sum_{n=0}^{\infty} A(x)^n. \tag{2}\label{e:gen} $$
Step 1. Simplifications
As OP has already observed, $\eqref{e:cond}$ implies that we must have $a_0 < 1$.
If $b_n = 0$ identically, then $u_n = 0$ for all $n \geq 0$ and there is nothing interesting here. So, it suffices to consider the case $B(1) > 0$, and we do so.
Rearranging the recurrence relation, we get $ u_n = \frac{b_n}{1 - a_0} + \sum_{k=0}^{n-1} \frac{a_{n-k}}{1 - a_0} u_k $. The rescaled weights $\tilde{a}_n = \frac{a_n}{1-a_0}$ for $n \geq 1$ (with $\tilde{a}_0 = 0$) and $\tilde{b}_n = \frac{b_n}{1-a_0}$ still satisfy $\eqref{e:cond}$, since $\sum_{n \geq 1} \tilde{a}_n = \frac{1-a_0}{1-a_0} = 1$. This shows that we may assume $a_0 = 0$.
Dividing both sides of the recurrence relation by $B(1)$, we may assume that $B(1) = 1$.
Summarizing, we may assume without loss of generality that
Assumption. $a_0 = 0$ and $\sum_{n=0}^{\infty} b_n = 1$.
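(For the curious: the reduced recurrence $u_n = b_n + \sum_{k=0}^{n-1} a_{n-k} u_k$ is easy to iterate numerically. The weights below are my own illustrative choice, not from the question.)

```python
# Numerical sketch (illustrative weights of my own choosing, not from the
# question): iterate u_n = b_n + sum_{k<n} a_{n-k} u_k under the Assumption
# a_0 = 0 and sum b_n = 1.  Here a_n = 1/(n(n+1)) for n >= 1, so that
# sum_n a_n = 1 while sum_n n*a_n = sum_n 1/(n+1) = infinity, and b_0 = 1.

N = 1201
a = [0.0] + [1.0 / (n * (n + 1)) for n in range(1, N)]
b = [1.0] + [0.0] * (N - 1)

u = []
for n in range(N):
    u.append(b[n] + sum(a[n - k] * u[k] for k in range(n)))

print(u[0], u[1], u[100], u[1200])  # u_n decays (slowly) toward 0
```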
Step 2. Probabilistic Interpretation
Let $\xi, X_1, X_2, \ldots$ be independent random variables such that
$$ \mathbb{P}(\xi = i) = b_i \qquad\text{and} \qquad \mathbb{P}(X_k = i) = a_i. $$
Then $A(x) = \mathbb{E}[x^{X_k}]$ and $B(x) = \mathbb{E}[x^{\xi}]$, and so, the relation $\eqref{e:gen}$ can be rephrased as:
$$ U(x) = \mathbb{E}\left[ \sum_{k=0}^{\infty} x^{\xi + X_1 + \cdots + X_k} \right]. $$
Since $a_0 = 0$, we know that the sequence $S_k = \xi + X_1 + \cdots + X_k$ is strictly increasing in $k$. Consequently, for each $n$, there is at most one $k$ for which $S_k = n$ holds. This leads to the following probabilistic interpretation of $u_n$:
$$ u_n = \mathbb{P}(S_k = n \text{ for some } k \geq 0). $$
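This interpretation is easy to sanity-check by simulation. The distribution below is a toy example of my own choosing (finite mean is fine here, since the identity only needs $a_0 = 0$):

```python
import random

# Monte Carlo sanity check of u_n = P(S_k = n for some k), with a toy
# distribution of my own choosing (the identity only needs a_0 = 0, so a
# finite-mean example is fine): xi = 0, P(X = 1) = P(X = 2) = 1/2.

random.seed(0)
TARGET = 10

# Exact u_n via the recurrence u_n = b_n + sum_{k<n} a_{n-k} u_k (b_0 = 1).
a = {1: 0.5, 2: 0.5}
u = [1.0]
for n in range(1, TARGET + 1):
    u.append(sum(a.get(n - k, 0.0) * u[k] for k in range(n)))

# Estimate P(the walk S_k ever equals TARGET): since steps are positive,
# the walk hits TARGET iff it does not jump over it.
TRIALS = 20000
hits = 0
for _ in range(TRIALS):
    s = 0
    while s < TARGET:
        s += random.choice((1, 2))
    hits += (s == TARGET)

print(u[TARGET], hits / TRIALS)  # the two values should agree closely
```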
Then by Blackwell's renewal theorem, it follows that:
Theorem. $u_n$ converges to $0$ as $ n \to \infty$.
In other words, there is no counter-example to OP's question! Here, we outline a self-contained proof of the above theorem.
Proof. We first assume $\xi = 0$, or equivalently, $B(x) = 1$. Let $g$ be the span of the distribution $(a_n)$, meaning that $g$ is the GCD of all $n$'s for which $a_n \neq 0$. Also, for each $n \geq 0$, define $N_n$ by
$$ N_n = \min \{ k \geq 0 : S_k \geq gn \}, $$
and then define the process $(Y_n)_{n\geq 0}$ by
$$ Y_n = S_{N_n} - gn. $$
Claim. $(Y_n)$ is an irreducible aperiodic Markov chain on $g \mathbb{Z}_{\geq 0}$.
Proof of Claim.
If $Y_n > 0$, then $S_{N_n} > gn$; since both $S_{N_n}$ and $gn$ are multiples of $g$, this forces $S_{N_n} \geq g(n+1)$ and hence $N_{n+1} = N_n$. This gives $Y_{n+1} = Y_n - g$.
Now suppose $Y_n = 0$. Conditioned on this event, we have $S_{N_n} = gn < g(n+1)$, and hence $N_{n+1} = N_n + 1$. Moreover, for each $k \geq 0$, the event $\{N_n = k\}$ and $X_{k+1}$ are conditionally independent given $Y_n = 0$. Consequently, given $Y_n = 0$, the next value $Y_{n+1} = X_{N_n + 1} - g$ is distributed as $X_1 - g$, independently of the past.
Since $\mathbb{E}[X_1] = A'(1) = \infty$, the support of $X_1$ is unbounded, so from $0$ the chain can jump above any given site of $g\mathbb{Z}_{\geq 0}$ and then descend through it in steps of $g$. Hence $(Y_n)$ is irreducible. Moreover, by our choice of $g$, $(Y_n)$ is aperiodic.
Now the key observation is the following equality:
$$ u_{gn} = \mathbb{P}(Y_n = 0). $$
A standard result for irreducible, aperiodic Markov chains shows that $\mathbb{P}(Y_n = 0) \to \frac{1}{\mathbb{E}[T_0]}$ as $n \to \infty$, where $T_0$ is the return time of $(Y_n)$ to $0$. Starting from $0$, the chain returns to $0$ in exactly $X/g$ steps for a fresh copy $X$ of $X_1$, so $\mathbb{E}[T_0] = \mathbb{E}[X_1]/g = \infty$ and the limit is $0$. Since $u_n = 0$ whenever $g \nmid n$, it therefore follows that $u_n \to 0$ as $n \to \infty$.
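The identity $u_{gn} = \mathbb{P}(Y_n = 0)$ can also be checked numerically. The example below is my own toy choice, not from the question: a span-$2$ distribution with infinite mean, sampled exactly via $\lfloor 1/U \rfloor$ for $U$ uniform on $(0,1]$, which satisfies $\mathbb{P}(\lfloor 1/U \rfloor = k) = \frac{1}{k(k+1)}$.

```python
import random

# Illustrative check of u_{gn} = P(Y_n = 0) with span g = 2 (my own toy
# distribution, not from the answer): xi = 0 and P(X = 2k) = 1/(k(k+1))
# for k >= 1, so E[X] = infinity.  If U is uniform on (0, 1], then
# P(floor(1/U) = k) = 1/(k(k+1)), which gives an exact sampler for X/2.

random.seed(1)

# u_0, ..., u_800 via the renewal recurrence u_n = [n == 0] + sum a_{n-j} u_j.
N = 801
a = [0.0] * N
for k in range(1, N // 2 + 1):
    a[2 * k] = 1.0 / (k * (k + 1))

u = [1.0]
for n in range(1, N):
    u.append(sum(a[n - j] * u[j] for j in range(n)))

# Simulate the chain: Y_{n+1} = Y_n - g if Y_n > 0, else X - g, with Y_0 = 0.
STEPS, TRIALS = 400, 4000
hits = 0
for _ in range(TRIALS):
    y = 0
    for _ in range(STEPS):
        y = y - 2 if y > 0 else 2 * int(1.0 / (1.0 - random.random())) - 2
    hits += (y == 0)

print(u[2 * STEPS], hits / TRIALS)  # these two numbers should be close
```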
For the general distribution of $\xi$, the relation
$$ u_n = \sum_{k=0}^{n} b_k \, \mathbb{P}(X_1 + \cdots + X_j = n-k \text{ for some } j \geq 0) $$
together with dominated convergence (each summand is at most $b_k$, and each hitting probability tends to $0$ by the case $\xi = 0$) shows that we still have $u_n \to 0$ as $n \to \infty$.
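As a last sanity check, the displayed identity is just the coefficient-wise form of $U = B \cdot \frac{1}{1-A}$, and it can be verified numerically (weights again my own illustration, not from the question):

```python
# Numerical check of the displayed identity (weights of my own choosing):
# with v_m = P(X_1 + ... + X_j = m for some j >= 0), computed by the pure
# renewal recurrence v_m = [m == 0] + sum_{k<m} a_{m-k} v_k, the identity
# reads u_n = sum_{k<=n} b_k v_{n-k}, i.e. U = B * 1/(1 - A) coefficient-wise.

N = 200
a = [0.0] + [1.0 / (n * (n + 1)) for n in range(1, N)]  # a_0 = 0, infinite mean
b = [0.5 ** (n + 1) for n in range(N)]                  # geometric, sums to ~1

v = [1.0]
for m in range(1, N):
    v.append(sum(a[m - k] * v[k] for k in range(m)))

u = []
for n in range(N):
    u.append(b[n] + sum(a[n - k] * u[k] for k in range(n)))

mix = [sum(b[k] * v[n - k] for k in range(n + 1)) for n in range(N)]
print(max(abs(u[n] - mix[n]) for n in range(N)))  # agreement up to round-off
```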