
I have an $n\times n$ matrix $A$. I know that every element of the matrix lies in the range $(0, 1)$. I want to prove that $A^k\rightarrow 0$ as $k \rightarrow \infty$, where $0$ denotes the $n\times n$ zero matrix. I am working on a larger problem and wanted to use this property, which intuitively looks correct, but I have no idea how to prove it.

Thank you.

giliev

1 Answer


It's not actually true, unfortunately! Consider the following $2 \times 2$ matrix and $2$ dimensional vector:

$$ A = \left( \begin{array}{cc} 2/3 & 2/3 \\ 2/3 & 2/3 \end{array} \right), \quad v = \left( \begin{array}{c} 1 \\ 1 \end{array} \right). $$

Note that $Av = (4/3)v$, so $A$ has an eigenvalue $\lambda = 4/3 > 1$. Moreover, $\det(A) = 0$, so the other eigenvalue is $0$; hence in its eigenbasis $A$ is diagonal with entries $0$ and $4/3$. Take this to the power of $k$ and let $k \to \infty$: since $4/3 > 1$, the powers do not converge -- let alone converge to $0$.
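The divergence is easy to check numerically. The sketch below (using NumPy) builds the $2 \times 2$ matrix above, confirms its eigenvalues, and prints the $(1,1)$ entry of $A^k$ growing without bound:

```python
import numpy as np

# The counterexample: every entry lies in (0, 1), yet A^k blows up.
A = np.full((2, 2), 2/3)

# Eigenvalues should be 0 and 4/3, matching the eigenbasis argument.
print(sorted(np.linalg.eigvals(A).real))  # ~ [0.0, 1.333...]

# Entries of A^k grow like (2/3) * (4/3)^(k-1) instead of decaying.
for k in (1, 2, 3, 10):
    print(k, np.linalg.matrix_power(A, k)[0, 0])
```

By $k = 10$ the entries already exceed $1$, so $A^k$ certainly does not tend to the zero matrix.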

[As a passing remark, the other eigenvector is $u = (1, -1)$. This is irrelevant to the proof; we only need its existence.]

Sam OT
  • Just after posting my answer, I saw your comment! Otherwise I wouldn't have bothered writing it, haha! =P -- Still, always good to check that I can still prove such things cleanly. :) – Sam OT Jan 28 '16 at 20:34
  • Nice example, though. All entries of $A^2$ are $8/9$, of $A^3$ are $32/27$ etc. – Dietrich Burde Jan 28 '16 at 20:39
  • Yep. :) -- For general $n \times n$ it's actually "easier" (whatever that really means in this sense!). Just take $n$ elements all in $(0,1)$ that sum to more than $1$. – Sam OT Jan 28 '16 at 20:42
  • I'm not sure what the general form for $A^k$ in my expression is (easy in the eigenbasis, of course -- could just convert that, but I'm not that invested!). It appears that all the entries are $2^{2k-1}/3^k$... but that's just a guess from the first three. Actually, yes, it is clearly that by induction: each power multiplies the entries by $4/3$, and they start off at $2/3$, giving $(2/3)(4/3)^{k-1} = 2^{2k-1}/3^k$. (That's not $2/3$ factorial, obviously!) – Sam OT Jan 28 '16 at 20:43
  • What if we know that every sum of elements per row and every sum of elements per column is smaller than one? Will that guarantee convergence of the matrix? – giliev Jan 29 '16 at 20:50
  • I'll have a think about that. I don't see why that would necessarily bound the eigenvalues... – Sam OT Jan 29 '16 at 21:03
  • I saw somewhere that $\lambda_{\max} \le$ the maximum row sum. I have no idea why this inequality is true. – giliev Jan 29 '16 at 23:33
  • In fact, yes, it is true: it follows from Gershgorin's circle theorem, a very important result for bounding eigenvalues that I have used a lot when considering whether numerical solutions to PDEs converge. See this SE answer: http://math.stackexchange.com/a/202937/132487 – Sam OT Jan 30 '16 at 17:37
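The follow-up question in the comments (row sums all below $1$) can also be sanity-checked numerically. The sketch below builds a hypothetical random positive matrix rescaled so every row sums to $0.9$, verifies the Gershgorin-style bound $\rho(A) \le \max_i \sum_j |a_{ij}|$, and confirms that the powers then decay to the zero matrix:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example: a random positive matrix rescaled so that
# every row sums to 0.9 < 1, as in the comment thread's question.
n = 4
A = rng.random((n, n))
A = 0.9 * A / A.sum(axis=1, keepdims=True)

# Gershgorin bound: every eigenvalue satisfies |lambda| <= max row sum.
spectral_radius = max(abs(np.linalg.eigvals(A)))
max_row_sum = A.sum(axis=1).max()
print(spectral_radius <= max_row_sum + 1e-12)  # True

# Since the spectral radius is below 1, A^k tends to the zero matrix.
print(np.abs(np.linalg.matrix_power(A, 200)).max())  # very close to 0
```

This is only a numerical illustration for one random matrix, not a proof; the proof is exactly the Gershgorin argument linked above, which bounds every eigenvalue by the largest absolute row sum.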