
Let $T$ be a bounded linear operator on a Banach space. We have the following formula for the resolvent:

$$\frac{1}{\lambda I-T}=\frac{1}{\lambda}\,\frac{1}{I-\frac{T}{\lambda}}=\frac{1}{\lambda}\left(I+\frac{T}{\lambda}+\left(\frac{T}{\lambda}\right)^2+\cdots\right).$$

By Gelfand's formula, this series converges absolutely for $|\lambda|>r(T)$, where $r(T)$ is the spectral radius $\lim_{n\to\infty}\|T^n\|^{1/n}$.
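
As a purely numerical illustration (not part of the original question), here is a minimal sketch, assuming NumPy and using a random $5\times 5$ matrix as a stand-in for $T$: it estimates $r(T)$ via Gelfand's formula and checks that the truncated Neumann series reproduces $(\lambda I - T)^{-1}$ at a point with $|\lambda| > r(T)$.

```python
import numpy as np

rng = np.random.default_rng(0)
T = rng.standard_normal((5, 5))
T = T / np.linalg.norm(T, 2)              # normalize so that ||T|| = 1 and powers stay bounded

# Gelfand's formula: r(T) = lim_n ||T^n||^(1/n); a moderately large n gives a usable estimate
n = 60
r_gelfand = np.linalg.norm(np.linalg.matrix_power(T, n), 2) ** (1.0 / n)
r_exact = np.max(np.abs(np.linalg.eigvals(T)))   # for a matrix, r(T) = max |eigenvalue|
print(r_gelfand, r_exact)                        # the two values are close

# Truncated Neumann series (1/lam) * sum_k (T/lam)^k at a point outside the spectral radius
lam = 1.5 * r_exact
I = np.eye(5)
S, term = np.zeros_like(T), I.copy()
for _ in range(500):
    S += term
    term = term @ (T / lam)
S /= lam
print(np.linalg.norm(S - np.linalg.inv(lam * I - T)))   # essentially zero
```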

I am interested in applying this formula at $|\lambda|=r(T)$. As motivation, let's take $T$ to be the right-shift operator $T: \ell^2 \rightarrow \ell^2$. Then $r(T)=1$. Applying the above power series, we find $$\frac{1}{I-T}=I+T+T^2+\cdots.$$ This is interesting because, while the series on the right-hand side does not converge in the operator norm, the terms $T^n$ converge to zero in the weak operator topology. Furthermore, I think I showed that if, for a given vector $v$, there exists a $w$ satisfying $(1-T)w=v$, then $$v+Tv+T^2v+\cdots =w \qquad \left(3\right)$$ $$(1-T)\left(v+Tv+T^2v+\cdots \right)=v. \qquad \left(4\right)$$
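
As a concrete check of these claims (a sketch only, assuming NumPy; the truncation length `N`, the helper `shift`, and the number of terms are illustrative choices, and the computation is exact as long as nothing is shifted past index `N`): with $w = e_1$ and $v = (I-T)w = e_1 - e_2$, the partial sums equal $e_1 - e_{n+2}$, so every fixed coordinate converges to the corresponding coordinate of $w$, while the norm distance to $w$ stays $1$.

```python
import numpy as np

N = 100                               # truncation length

def shift(x):                         # right shift: (x_1, x_2, ...) -> (0, x_1, x_2, ...)
    y = np.zeros_like(x)
    y[1:] = x[:-1]
    return y

w = np.zeros(N); w[0] = 1.0           # w = e_1 (array index 0)
v = w - shift(w)                      # v = (I - T)w = e_1 - e_2

s, term = np.zeros(N), v.copy()
for _ in range(40):                   # partial sums v + Tv + ... + T^n v
    s += term
    term = shift(term)

print(s[:5])                          # [1, 0, 0, 0, 0]: fixed coordinates agree with w
print(np.linalg.norm(s - w))          # 1.0: convergence to w is only weak, not in norm
```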

When do these three properties (weak operator convergence $T^n\rightarrow 0$, and properties (3) and (4)) generalize to other operators $T$ at $|\lambda|=r(T)$?

Damalone
  • This is certainly not going to work if $T$ actually has $1$ as an eigenvalue. For example, if $T = I$. In case $T$ does not have $1$ as an eigenvalue, $(I - T)^{-1}$ exists as a (closed, densely defined) unbounded operator. At least when $T$ is normal, $(I - T)^{-1} = I + T + T^2 + \cdots$ can be made sense of and is true on a core of the domain. – David Gao Jun 02 '24 at 19:13
  • Thanks. However, I note that if $T=I$, the weakly convergent part does not work, but (4) holds vacuously since the only possibility is $v=0$. – Damalone Jun 02 '24 at 19:24
  • Yeah. Now that I think about it, I’m not even sure if assuming $T$ is normal and $T$ does not have $1$ as an eigenvalue is enough. The proof I had in mind only works if $\sigma(T) \cap \mathbb{T} = \{1\}$. I haven’t been able to find a counterexample for when $T^n \to 0$ in weak operator topology though. – David Gao Jun 02 '24 at 19:45
  • Oh wait, it does work. If $v = (I - T)w$, then $(I + T + \cdots + T^n)v = (I + T + \cdots + T^n)(I - T)w = (I - T^{n+1})w \to w$ weakly since $T^n \to 0$ in WOT. So (3) holds and (4) is just a corollary. – David Gao Jun 02 '24 at 19:50
  • How do we prove that the operators $T^n\rightarrow 0$ weakly in this situation? Also, my thought is that (4) might hold even in the eigenvalue situation. – Damalone Jun 02 '24 at 21:44
  • In what situation do you want to prove $T^n \to 0$? I thought you wanted that as part of your conditions? ($T^n$ can never weakly converge to $0$ if $1$ is an eigenvalue of $T$, so you can’t even make sense of $(I + T + T^2 + \cdots)v$ in that case.) – David Gao Jun 02 '24 at 22:06
  • Sorry, I should not have put parentheses around the operator sum. Please take a look at the revised question and tell me if it makes sense! Thank you! – Damalone Jun 03 '24 at 03:53
  • I mean, that’s not really the issue here. Consider the following example which demonstrates that even (4) may not work when $T$ has $1$ as an eigenvalue: $T = \begin{pmatrix}1 & -1\\ 0 & 1\end{pmatrix}$. If $w = (0, 1)$, then $v = (1 - T)w = (1, 0)$. But then $Tv = v$, so $v + Tv + \cdots + T^nv = (n+1, 0)$ simply does not converge, even weakly. – David Gao Jun 03 '24 at 03:59
  • So something like $T^n \to 0$ in WOT seems necessary. I don’t know if there are any general conditions to ensure that is the case though. A sufficient condition is that $T$ is normal, $r(T) = \|T\| \leq 1$, and the spectral projection of $T$ associated to $\sigma(T) \cap \mathbb{T}$ is $0$. But that is a bit overly restrictive. – David Gao Jun 03 '24 at 04:10
  • I don't think $T^n\rightarrow 0$ is necessary. If we change the matrix to $T=\left(\begin{array}{cc}1&-1\\ 0&0.9\end{array}\right)$, then I think (4) will work. – Damalone Jun 03 '24 at 04:36
  • I suppose you’re right. (3) is equivalent to $T^n \to 0$ in WOT, but (4) can sometimes work without that. In any case though, the point still stands: you cannot expect (4) to always work. At the very least, (4) requires $T^nv \to 0$ weakly for all $v \in \text{ran}(I - T)$. Not sure if that is sufficient. – David Gao Jun 03 '24 at 04:50
  • It is not sufficient. For example, let $T = e_{11} + S$ where $S$ is the unilateral shift on $\ell^2$ and $e_{11}$ is the projection onto the span of $e_1$. One may verify that $r(T) = 1$. The range of $I-T = (I - e_{11}) - S$ is contained in $\{e_1\}^\perp$. As $T$ acts as the unilateral shift on $\{e_1\}^\perp$, $T^nv \to 0$ weakly for all $v \in \text{ran}(I-T)$. But if $w = -e_1$, then $v = (I-T)w = e_2$, and $v+Tv+\cdots$ does not converge, even weakly. So I suppose if you want an equivalent condition for (4), it’s just that $v+Tv+\cdots$ converges weakly for all $v \in \text{ran}(I-T)$. – David Gao Jun 03 '24 at 05:52
  • So, to summarize: the equivalent condition for (3) is $T^n \to 0$ in WOT, and the equivalent condition for (4) is that $v + Tv + T^2v + \cdots$ converges weakly for all $v \in \text{ran}(I - T)$, while $T^nv \to 0$ weakly for all $v \in \text{ran}(I - T)$ is a necessary but not sufficient condition. (3) implies (4), but the converse is not true, as your example of $T=\begin{pmatrix}1&-1\\0&0.9\end{pmatrix}$ demonstrates. And there are cases where (4) is not even true in general, as my example of $T=\begin{pmatrix}1&-1\\0&1\end{pmatrix}$ demonstrates. (Both $2\times 2$ examples, and the perturbed shift above, are checked numerically in the sketches after these comments.) – David Gao Jun 03 '24 at 05:54
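
The two $2\times 2$ examples from the comments are easy to verify numerically. A minimal sketch (assuming NumPy, with $w = (0,1)$ and a helper `partial_sums` introduced here just for illustration): for $T=\begin{pmatrix}1&-1\\0&1\end{pmatrix}$ the partial sums of $v + Tv + \cdots$ grow without bound, so even (4) fails, while for $T=\begin{pmatrix}1&-1\\0&0.9\end{pmatrix}$ they converge to roughly $(10,1)$, which $I-T$ maps back to $v$ even though it is not $w$; so (4) holds while (3) fails.

```python
import numpy as np

def partial_sums(T, v, n_terms):
    """Return v + Tv + ... + T^(n_terms - 1) v."""
    s, term = np.zeros_like(v), v.copy()
    for _ in range(n_terms):
        s += term
        term = T @ term
    return s

w = np.array([0.0, 1.0])

# 1 is an eigenvalue and T^n v stays equal to v: even (4) fails
T_bad = np.array([[1.0, -1.0], [0.0, 1.0]])
v_bad = (np.eye(2) - T_bad) @ w                 # v = (1, 0), fixed by T_bad
print(partial_sums(T_bad, v_bad, 1000))         # [1000, 0]: the sums diverge

# (4) holds but (3) fails: the sums converge, but to a different preimage of v than w
T_ok = np.array([[1.0, -1.0], [0.0, 0.9]])
v_ok = (np.eye(2) - T_ok) @ w                   # v = (1, 0.1)
s = partial_sums(T_ok, v_ok, 400)
print(s)                                        # approx [10, 1], not w = (0, 1)
print((np.eye(2) - T_ok) @ s)                   # approx [1, 0.1] = v, i.e. (4) holds
```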
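
The perturbed-shift counterexample $T = e_{11} + S$ can be checked on finitely supported vectors in the same spirit (again a sketch, assuming NumPy; array index $0$ plays the role of $e_1$, and the truncation length is an illustrative choice): with $v = (I-T)(-e_1) = e_2$ one has $T^n v = e_{n+2}$, so each fixed coordinate of $T^n v$ tends to $0$, yet the partial sums $v + Tv + \cdots + T^n v$ have norm $\sqrt{n+1}$ and cannot converge, even weakly.

```python
import numpy as np

N = 200                                  # truncation; exact while indices stay below N

def T(x):                                # T = e_{11} + S: T e_1 = e_1 + e_2, T e_k = e_{k+1} for k >= 2
    y = np.zeros_like(x)
    y[1:] = x[:-1]                       # the unilateral shift S
    y[0] = x[0]                          # plus the rank-one projection e_{11}
    return y

w = np.zeros(N); w[0] = -1.0             # w = -e_1
v = w - T(w)                             # v = (I - T)w = e_2

s, term = np.zeros(N), v.copy()
for _ in range(100):                     # partial sums v + Tv + ... + T^99 v
    s += term
    term = T(term)

print(np.argmax(term), term[:5])         # T^100 v = e_102 (array index 101); fixed coordinates are 0
print(np.linalg.norm(s))                 # 10.0 = sqrt(100): the partial sums blow up in norm
```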

0 Answers