
The inverse of a non-singular lower triangular matrix is lower triangular.

Construct a proof of this fact as follows. Suppose that $L$ is a non-singular lower triangular matrix. If $b \in \mathbb{R}^n$ is such that $b_i = 0$ for $i = 1, \ldots, k$ (where $k \leq n$), and $y$ solves $Ly = b$, then $y_i = 0$ for $i = 1, \ldots, k$.

Hint: partition $L$ by the first $k$ rows and columns.

Can someone tell me what exactly we are showing here and why it will prove that the inverse of any non-singular lower triangular matrix is lower triangular?

sonicboom

5 Answers


Let's write $$L^{-1}=[y_1\:\cdots\:y_n],$$ where each $y_k$ is an $n\times 1$ matrix.

Now, by definition, $$LL^{-1}=I=[e_1\:\cdots\:e_n],$$ where $e_k$ is the $n\times 1$ matrix with a $1$ in the $k$th row and $0$s everywhere else. Observe, though, that $$LL^{-1}=L[y_1\:\cdots\:y_n]=[Ly_1\:\cdots\: Ly_n],$$ so $$Ly_k=e_k\qquad(1\leq k\leq n)$$

By the proposition, since $e_k$ has only $0$s above the $k$th row and $L$ is lower triangular and $Ly_k=e_k$, then $y_k$ has only $0$s above the $k$th row. This is true for all $1\leq k\leq n$, so since $$L^{-1}=[y_1\:\cdots\:y_n],$$ then $L^{-1}$ is lower triangular, too.
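As a quick numerical sanity check of this column-by-column argument (a sketch, not part of the proof: NumPy is assumed, the matrix entries are arbitrary example data, and `forward_substitute` is a hypothetical helper, not a library routine):

```python
import numpy as np

# Arbitrary nonsingular lower triangular example.
L = np.array([[2.0, 0.0, 0.0],
              [1.0, 3.0, 0.0],
              [4.0, 5.0, 6.0]])
n = L.shape[0]

def forward_substitute(L, b):
    """Solve Ly = b for lower triangular L by forward substitution."""
    y = np.zeros_like(b, dtype=float)
    for i in range(len(b)):
        y[i] = (b[i] - L[i, :i] @ y[:i]) / L[i, i]
    return y

# Column k of L^{-1} solves L y_k = e_k.
Linv = np.column_stack([forward_substitute(L, np.eye(n)[:, k])
                        for k in range(n)])

# Each y_k is zero above row k, so L^{-1} comes out lower triangular.
assert np.allclose(np.triu(Linv, 1), 0.0)
assert np.allclose(L @ Linv, np.eye(n))
```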

$$********$$

Here's an alternative (but related) approach.

Observe that a lower triangular matrix is nonsingular if and only if all of its diagonal entries are nonzero. Let's proceed by induction on $n$. The base case ($n=1$) is simple, as all scalars are trivially "lower triangular".

Now, let's suppose that all nonsingular $n\times n$ lower triangular matrices have lower triangular inverses, and let $A$ be any nonsingular $(n+1)\times(n+1)$ lower triangular matrix. In block form, then, we have $$A=\left[\begin{array}{c|c}L & 0_n\\\hline x^T & \alpha\end{array}\right],$$ where $L$ is a nonsingular $n\times n$ lower triangular matrix, $0_n$ is the $n\times 1$ matrix of $0$s, $x$ is some $n\times 1$ matrix, and $\alpha$ is some nonzero scalar. (Can you see why this is true?)

Now, in compatible block form, we have $$A^{-1}=\left[\begin{array}{c|c}M & b\\\hline y^T & \beta\end{array}\right],$$ where $M$ is an $n\times n$ matrix, $b,y$ are $n\times 1$ matrices, and $\beta$ is some scalar. Letting $I_n$ and $I_{n+1}$ denote the $n\times n$ and $(n+1)\times(n+1)$ identity matrices, respectively, we have $$I_{n+1}=\left[\begin{array}{c|c}I_n & 0_n\\\hline 0_n^T & 1\end{array}\right].$$ Hence, $$\left[\begin{array}{c|c}I_n & 0_n\\\hline 0_n^T & 1\end{array}\right]=I_{n+1}=A^{-1}A=\left[\begin{array}{c|c}ML+bx^T & M0_n+b\alpha\\\hline y^TL+\beta x^T & y^T0_n+\beta\alpha\end{array}\right]=\left[\begin{array}{c|c}ML+bx^T & \alpha b\\\hline y^TL+\beta x^T & \beta\alpha\end{array}\right].$$

Since $\alpha$ is a nonzero scalar and $\alpha b=0_n$, we must have $b=0_n$. Thus, $$A^{-1}=\left[\begin{array}{c|c}M & 0_n\\\hline y^T & \beta\end{array}\right],$$ and $$\left[\begin{array}{c|c}I_n & 0_n\\\hline 0_n^T & 1\end{array}\right]=\left[\begin{array}{c|c}ML & 0_n\\\hline y^TL+\beta x^T & \beta\alpha\end{array}\right].$$ Since $ML=I_n$, we have $M=L^{-1}$, and by the inductive hypothesis, $M$ is lower triangular. Therefore, $$A^{-1}=\left[\begin{array}{c|c}M & 0_n\\\hline y^T & \beta\end{array}\right]$$ is lower triangular, too, as desired.
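The block structure in the induction step can also be checked numerically; the following NumPy sketch (arbitrary random example data, seed fixed for reproducibility) partitions a $4\times 4$ case as above and verifies that the top-right block $b$ of $A^{-1}$ vanishes and that $M = L^{-1}$:

```python
import numpy as np

# Arbitrary 4x4 nonsingular lower triangular example: entries in [1, 2),
# so every diagonal entry is nonzero.
A = np.tril(np.random.default_rng(0).uniform(1, 2, (4, 4)))
Ainv = np.linalg.inv(A)

n = 3                  # split off the last row/column, as in the induction step
L = A[:n, :n]          # nonsingular n x n lower triangular block
b = Ainv[:n, n]        # top-right block of A^{-1} -- should be all zeros
M = Ainv[:n, :n]       # should equal L^{-1}

assert np.allclose(b, 0.0)
assert np.allclose(M @ L, np.eye(n))
```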

Cameron Buie
  • Cheers but I am aware of that case already, it is the specific problem in the question that I have an issue with, i.e. why the proposition implies that any lower triangular matrix will be lower triangular. – sonicboom Nov 27 '12 at 20:48
  • I'm not sure you're saying what you intend to say. "...[A]ny lower triangular matrix will be lower triangular"? Of course it will. Did you mean "...implies that the inverse of any nonsingular lower triangular matrix will be lower triangular"? – Cameron Buie Nov 27 '12 at 21:00
  • Ok, I have edited the original question to fix that mistake. – sonicboom Nov 27 '12 at 23:47
  • Does my edited answer do the trick for you? – Cameron Buie Nov 27 '12 at 23:49
  • Thanks a lot mate, I understand the process now. – sonicboom Nov 28 '12 at 13:47
  • Nevermind, have to check something. – sonicboom Nov 28 '12 at 14:11
  • Working through it here now and it says: "If $b \in \mathbb{R}^n$ is such that $b_i = 0$ for $i = 1, \ldots, k \leq n$..." which means $b$ can't be used as the columns of an identity matrix, since it says $b_i$ is zero in the $i$th row. Is there an error in the question; should it be $b_i = 0$ for $i = 1, \ldots, (k-1) \leq n$? – sonicboom Nov 28 '12 at 14:19
  • There's nothing really wrong with the proof, but it considers an extra case that isn't really necessary for us--namely, the $k=n$ case. In that case, $b$ is the zero vector, and since $L$ is nonsingular, then $Ly=b$ implies $y=0$. None of the columns of the identity matrix are the zero vector, so we didn't need that case. Ignoring that case, then saying "$b_i=0$ for $i=1,...,k<n$" is equivalent to saying "$b$ is all $0$s above the $(k+1)$th row." $e_{k+1}$ satisfies this condition for each $1\leq k<n$. We don't need to worry about $0$ entries in $e_1$. – Cameron Buie Nov 28 '12 at 15:20

Suppose you have an invertible lower-triangular matrix $L$. To find its inverse, you must solve the matrix equation $LX = I$, where $I$ denotes the $n$-by-$n$ identity matrix.

Based on how matrix multiplication works, the $i^{\text{th}}$ column of $LX$ is equal to $L$ times the $i^{\text{th}}$ column of $X$. In order for $LX = I$, it must be that the first $i-1$ entries in the $i^{\text{th}}$ column of $LX$ are all zero. The hint is that you can prove that this implies that the first $i-1$ entries in the $i^{\text{th}}$ column of $X$ must all be zero. To do this, you can explicitly write out your calculation, using your assumption that $L$ is lower-triangular. You'll get a fairly easy system of linear equations to analyze.
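To make the "fairly easy system" concrete, here is a small worked instance (my own example: $n = 3$, second column, so $b = e_2$). The equation $Ly = e_2$ reads $$\ell_{11}y_1 = 0,\qquad \ell_{21}y_1 + \ell_{22}y_2 = 1,\qquad \ell_{31}y_1 + \ell_{32}y_2 + \ell_{33}y_3 = 0.$$ The first equation forces $y_1 = 0$ (since $\ell_{11} \neq 0$), and the remaining two then determine $y_2$ and $y_3$ by forward substitution.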

Michael Joyce
  • I understand that and can prove that case. It is the specific problem in the question that I have an issue with, i.e. why the proposition implies that any lower triangular matrix will be lower triangular – sonicboom Nov 27 '12 at 20:48
  • Take $y$ to be a column of the matrix $X$ and $b$ to be the corresponding column of the identity matrix. – Michael Joyce Nov 27 '12 at 22:28
  • Ok, cheers, I'll have to think about it for a bit. – sonicboom Nov 27 '12 at 23:51
  • It may help to point out that when you are considering the $j^{\text{th}}$ column, you should take $k = j-1$. – Michael Joyce Nov 27 '12 at 23:53

In simple form, we can write $A = D(I+L)$, where $A$ is a lower triangular matrix (with nonzero diagonal), $D$ is its diagonal part, $I$ is the identity matrix, and $L = D^{-1}(A - D)$ is strictly lower triangular (all zeros on the diagonal). Then $A^{-1} = (I+L)^{-1}D^{-1}$, and $D^{-1}$ is simply the diagonal matrix of reciprocal diagonal entries. Since $L$ is strictly lower triangular, it is nilpotent: $L^n = 0$. Hence $$(I+L)^{-1} = I - L + L^2 - L^3 + \cdots + (-1)^{n-1}L^{n-1},$$ which is itself lower triangular, and therefore $A^{-1}$ is lower triangular.
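A numerical sketch of this decomposition (NumPy assumed, arbitrary example entries; note the series terminates at $L^{n-1}$ since $L^n = 0$):

```python
import numpy as np

# Arbitrary lower triangular example with nonzero diagonal.
A = np.array([[2.0, 0.0, 0.0],
              [3.0, 4.0, 0.0],
              [5.0, 6.0, 7.0]])
n = A.shape[0]

D = np.diag(np.diag(A))          # diagonal part of A
L = np.linalg.inv(D) @ (A - D)   # strictly lower triangular, so L^n = 0
assert np.allclose(np.linalg.matrix_power(L, n), 0.0)

# Finite Neumann series: (I + L)^{-1} = I - L + L^2 - ... +- L^{n-1}
S = sum((-1.0) ** k * np.linalg.matrix_power(L, k) for k in range(n))
Ainv = S @ np.linalg.inv(D)

assert np.allclose(A @ Ainv, np.eye(n))          # it really is the inverse
assert np.allclose(np.triu(Ainv, 1), 0.0)        # and it is lower triangular
```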


I was thinking about this same question, and here is an informal explanation:

Let $A$ be an invertible matrix and $L$ a nonsingular lower triangular matrix, with $$LA = B.$$

We know that the first row of $B$ is a multiple of the first row of $A$, and the second row of $B$ is a linear combination of the first two rows of $A$, ..., the $k$th row of $B$ is a linear combination of the first $k$ rows of $A$,...

It follows that for any $k$, the first $k$ rows of $A$ and the first $k$ rows of $B$ span the same subspace. Therefore the $k$th row of $A$ is in the subspace spanned by the first $k$ rows of $B$. Furthermore, the $k$th row of $A$ cannot be in the subspace spanned by the first $k-1$ rows of $B$: otherwise it would lie in the subspace spanned by the first $k-1$ rows of $A$, contradicting the assumption that $A$ is invertible (its rows are linearly independent). So, for every $k$, the $k$th row of $A$ is a linear combination of the first $k$ rows of $B$ with a nonzero coefficient on the $k$th row; writing these combinations as the matrix equation $$L^{-1}B = A,$$ the coefficient matrix $L^{-1}$ must be lower triangular.


Let $\mathbb{K} \in \{\mathbb{R},\mathbb{C}\}$ be either the field of real numbers or the field of complex numbers.

Definition (Lower triangular matrix). A lower triangular matrix is a matrix $\mathbf{L} \in \mathbb{K}^{n \times n}$ such that \begin{equation} \left[\mathbf{L}\right]_{ij} = \ell_{ij} = 0 \qquad (\,i < j\,). \end{equation}
Proposition. The determinant of a lower triangular matrix is the product of its diagonal entries.
proof. If $\mathbf{L} \in \mathbb{K}^{n \times n}$ is lower triangular, then $\mathrm{det}(\mathbf{L}) = \ell_{11} \det(\mathbf{L}_{2:n,2:n}) = \ell_{11}\ell_{22} \det(\mathbf{L}_{3:n,3:n}) = \prod_{i=1}^n \ell_{ii}$ by induction. $\square$
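A quick numerical check of this proposition (NumPy assumed, arbitrary example entries):

```python
import numpy as np

# Arbitrary lower triangular example: det should be 2 * 3 * 6 = 36.
L = np.array([[2.0, 0.0, 0.0],
              [1.0, 3.0, 0.0],
              [4.0, 5.0, 6.0]])
assert np.isclose(np.linalg.det(L), np.prod(np.diag(L)))
assert np.isclose(np.prod(np.diag(L)), 36.0)
```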

Proposition. A lower triangular matrix is invertible if and only if all its diagonal entries are non-zero.
proof. (Follows directly from the previous proposition.) $\square$

Proposition. If $\mathbf{L} \in \mathbb{K}^{n \times n}$ is an invertible lower triangular matrix, then its inverse $\mathbf{M} := \mathbf{L}^{-1}$ is lower triangular as well.
proof. We wish to show that $\mathbf{M} := \mathbf{L}^{-1}$ is lower triangular, or equivalently, that for each row $i=1,2,\ldots,n$ of $\mathbf{M}$, we have $m_{ij} = 0$ for $i < j$. We proceed by induction on the row index. Notice that \begin{equation} \delta_{ij} = \left[\mathbf{I}\right]_{ij} = \left[\mathbf{L}\mathbf{L}^{-1}\right]_{ij} = \left[\mathbf{L}\mathbf{M}\right]_{ij} = \sum_{k=1}^n \ell_{ik} m_{kj} = \sum_{k=1}^i \ell_{ik} m_{kj}, \end{equation} using the Kronecker delta $\delta_{ij}$ and the fact that $\mathbf{L}$ is lower triangular, so $\ell_{ik} = 0$ for $i < k$. For the base case $i = 1$, we have that for each $1 < j$, \begin{equation} 0 = \delta_{1j} = \sum_{k=1}^1 \ell_{1k} m_{kj} = \ell_{11} m_{1j} \qquad (\,1 < j\,). \end{equation} Hence, because $\mathbf{L}$ is invertible and, by the proposition above, $\ell_{11} \neq 0$, we have \begin{equation} m_{1j} = 0 \qquad (\,1 < j\,). \end{equation} Now, let $p = 2,3,\ldots,n$ and assume, as (strong) induction hypothesis, that $m_{kj} = 0$ whenever $k < p$ and $k < j$. Then, for the row $i = p$ and each $p < j$ (so that $k \leq p-1$ implies $k < j$), \begin{equation} 0 = \delta_{pj} = \sum_{k=1}^p \ell_{pk} m_{kj} = \sum_{k=1}^{p-1} \ell_{pk} \underbrace{m_{kj}}_{=0} + \ell_{pp} m_{pj} = \ell_{pp} m_{pj} \qquad (\,p < j\,). \end{equation} And again, because the diagonal entry $\ell_{pp} \neq 0$, we have \begin{equation} m_{pj} = 0 \qquad (\,p < j\,). \end{equation} Concluding, we have proven by induction that, for each row $i = 1,2,\ldots,n$, the entry $m_{ij} = 0$ for $i < j$, or equivalently, that $\mathbf{M} := \mathbf{L}^{-1}$ is lower triangular. $\square$

Remark. (Upper triangular matrix) Similarly, the inverse of an upper triangular matrix is upper triangular.
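(The remark follows by transposes: if $\mathbf{U}$ is upper triangular, then $\mathbf{U}^T$ is lower triangular and $(\mathbf{U}^{-1})^T = (\mathbf{U}^T)^{-1}$. A small NumPy check, with arbitrary example data:)

```python
import numpy as np

# Arbitrary invertible upper triangular example.
U = np.array([[2.0, 1.0, 4.0],
              [0.0, 3.0, 5.0],
              [0.0, 0.0, 6.0]])
Uinv = np.linalg.inv(U)

# The inverse has zeros below the diagonal, i.e. it is upper triangular.
assert np.allclose(np.tril(Uinv, -1), 0.0)
assert np.allclose(U @ Uinv, np.eye(3))
```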