1

Given an invertible matrix $A$ with $\det A > 0$, the adjugate $\operatorname{adj} A$ is the matrix such that:

$$ (\operatorname{adj} A) A = A (\operatorname{adj} A) = (\det A) I \tag{1}$$

The construction of $\operatorname{adj} A$ is that we take the cofactor matrix and then its transpose. How would one show, intuitively, that this construction is the one that leads to property (1)?
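As a concrete sanity check of property (1), here is a small pure-Python sketch (the helper names `det`, `minor`, `adj`, and `matmul` and the sample matrix are my own, not from the question) that builds the adjugate as the transpose of the cofactor matrix and verifies $(\operatorname{adj} A)A = (\det A)I$:

```python
from itertools import permutations

def det(M):
    """Determinant via the Leibniz formula (fine for small matrices)."""
    n = len(M)
    total = 0
    for perm in permutations(range(n)):
        # sign of the permutation = parity of its inversion count
        sign = 1
        for a in range(n):
            for b in range(a + 1, n):
                if perm[a] > perm[b]:
                    sign = -sign
        prod = 1
        for row in range(n):
            prod *= M[row][perm[row]]
        total += sign * prod
    return total

def minor(M, i, j):
    """Submatrix M_ij: delete row i and column j."""
    return [row[:j] + row[j + 1:] for k, row in enumerate(M) if k != i]

def adj(M):
    """Adjugate: transpose of the cofactor matrix, adj(M)_ij = (-1)^(i+j) det(M_ji)."""
    n = len(M)
    return [[(-1) ** (i + j) * det(minor(M, j, i)) for j in range(n)]
            for i in range(n)]

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

A = [[2, 1, 0], [1, 3, 1], [0, 1, 2]]
print(det(A))             # → 8
print(matmul(adj(A), A))  # → 8 times the identity matrix
```

Both products $(\operatorname{adj} A)A$ and $A(\operatorname{adj} A)$ come out as $(\det A)I$, as (1) requires.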

KBS
  • 7,903
  • (1) may not identify a single matrix if $\det A=0$, so you have to weigh in on the meaning of "nice". –  Oct 01 '21 at 07:20
  • If you know exterior algebra, it is immediate what adjugate matrix is doing. See this answer for details. – user10354138 Oct 01 '21 at 08:23
  • I suppose traditionally one crunched out sets of simultaneous equations in $3$ variables, then was shown Cramer's Rule, and only long afterwards learned about matrices, inverses, adjoints - but one did recognise the adjoint at that stage. – ancient mathematician Oct 01 '21 at 08:24
  • I think the answer depends on how you define/think about the determinant, intuitively. If you have a very mechanical view of the determinant as 'whatever number the Laplace expansion process outputs', then the relation (1) is obvious, as other answers mention. If you have a different, perhaps more geometric, way of thinking about determinants, then it should be possible to manufacture an answer that fits that view of what a determinant is, but you have to tell us your idea of determinant first. – Vincent May 19 '22 at 08:01
  • I wrote an answer to this question @Vincent – Clemens Bartholdy May 19 '22 at 08:03
  • ah I see, that is nice. But it only works in the 3 times 3 case, right? – Vincent May 19 '22 at 08:05
  • You can use exterior algebra to generalize the idea: you're just wedging vectors and reading off their components. The only difference is that instead of areas you'll be talking about generalized $n$-volumes @Vincent – Clemens Bartholdy May 19 '22 at 08:08
  • 1
    yes that is nice. It then becomes similar to the answer to the question linked by
    @user10354138 above, only in more geometric terms.
    – Vincent May 19 '22 at 08:14

3 Answers

2

The adjugate matrix is usually defined via the cofactor matrix, which is unique and defined for every square matrix $A$ (even if $\det A = 0$). More precisely, denote by $M_{ij}$ the submatrix of $A$ obtained by removing the $i$th row and $j$th column of $A$. Then the $ij$th entry of the adjugate matrix is defined to be $$ [\mathrm{adj}(A)]_{ij} = (-1)^{i+j} \det M_{ji}. $$

With this definition, we can directly prove that (1) is fulfilled. Looking at the diagonal entries of $A~\mathrm{adj}(A)$, we obtain $$ [A~\mathrm{adj}(A)]_{ii} = \sum_{j} [A]_{ij}[\mathrm{adj}(A)]_{ji} = \sum_{j} (-1)^{i+j} [A]_{ij} \det M_{ij} = \det(A), $$ where in the last step we identified the sum as the Laplace expansion of the determinant, see e.g. https://en.wikipedia.org/wiki/Determinant#Laplace_expansion.

Similarly, the non-diagonal elements are $$ [A~\mathrm{adj}(A)]_{ij} = \sum_{k} [A]_{ik}[\mathrm{adj}(A)]_{kj} = \sum_{k} (-1)^{k+j} [A]_{ik} \det M_{jk} = \det(A_{j\to i}), $$ where $A_{j\to i}$ is the matrix obtained from $A$ by replacing the $j$th row of $A$ with the $i$th row. This matrix contains the $i$th row twice and is therefore singular. Hence $\det(A_{j\to i}) = 0$, and the non-diagonal elements of $A~\mathrm{adj}(A)$ vanish. This proves that $$ A~\mathrm{adj}(A) = (\det A)\, I. $$ The proof for $\mathrm{adj}(A)A$ works analogously.
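The two cases above can be checked numerically: expanding row $i$'s entries against row $j$'s cofactors gives $\det(A)$ when $i = j$ (the Laplace expansion) and $0$ when $i \neq j$ (a determinant with a repeated row). A minimal sketch for a $3\times3$ example, with ad-hoc helper names (`det2`, `minor`) of my own:

```python
def det2(M):
    # determinant of a 2x2 matrix
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

def minor(M, i, j):
    # submatrix M_ij: delete row i and column j
    return [row[:j] + row[j + 1:] for k, row in enumerate(M) if k != i]

A = [[2, 1, 0], [1, 3, 1], [0, 1, 2]]

# "alien" cofactor expansion: entries of row i against cofactors of row j
results = {}
for i in range(3):
    for j in range(3):
        results[(i, j)] = sum((-1) ** (j + k) * A[i][k] * det2(minor(A, j, k))
                              for k in range(3))

# i == j reproduces det(A); i != j is the determinant of a matrix
# with a repeated row, hence 0
print(results)
```

For this matrix every diagonal pair $(i,i)$ gives $\det(A) = 8$ and every off-diagonal pair gives $0$, matching the argument above.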

Andreas Lenz
  • 1,616
1

For a $3 \times 3$ matrix $A$ with columns $p$, $q$, $r$, the adjugate matrix can be constructed the following way:

$$ \operatorname{adj}\left( \begin{bmatrix} p & q & r \end{bmatrix} \right) = \begin{bmatrix} (q \times r)^T \\ (r \times p)^T \\ (p \times q)^T \end{bmatrix} $$

where $\times$ is the cross product operator; the rows of $\operatorname{adj} A$ are cross products of pairs of columns of $A$. It is then clear that

$$ \begin{bmatrix} (q \times r)^T \\ (r \times p)^T \\ (p \times q)^T \end{bmatrix} \begin{bmatrix} p & q & r \end{bmatrix} = (\det A)\, I, $$

since each diagonal entry is a scalar triple product equal to $\det A$, while each off-diagonal entry is a cross product dotted with one of its own factors and therefore zero.

If one doesn't see the above immediately, expand everything out in components and check it. This construction makes everything clear, and in my personal opinion is also easier to compute.
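The cross-product picture is easy to verify on a concrete example. The sketch below (with a hand-rolled `cross` helper and a sample matrix of my own choosing) builds the adjugate rows as cross products of the columns and multiplies:

```python
def cross(u, v):
    # cross product of two 3-vectors
    return [u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0]]

# columns of A
p, q, r = [2, 1, 0], [1, 3, 1], [0, 1, 2]

# rows of adj(A): q x r, r x p, p x q
adjA = [cross(q, r), cross(r, p), cross(p, q)]

# rebuild A from its columns and form adj(A) * A
A = [[p[i], q[i], r[i]] for i in range(3)]
prod = [[sum(adjA[i][k] * A[k][j] for k in range(3)) for j in range(3)]
        for i in range(3)]
print(prod)  # → (det A) * I
```

Each diagonal entry is the scalar triple product $\det A$, and each off-diagonal entry is a cross product dotted against one of its own factors, hence zero.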

extracted from here

0

One can understand this connection in simple terms as follows:

Suppose $A=[a_{ij}]_{n \times n}$ and let its cofactor matrix be $C = [c_{ij}]_{n \times n}$. Then we know that $$\sum_{j=1}^{n} a_{pj}\, c_{qj}=\det(A)\, \delta_{pq},$$ where $\delta_{pq}=1$ if $p=q$, and $0$ otherwise. Next, entrywise, $$[A \operatorname{adj}(A)]_{ij} = [A\, C^T]_{ij} = \sum_{k=1}^{n} a_{ik}\,[C^T]_{kj} = \sum_{k=1}^{n} a_{ik}\, c_{jk} = \det(A)\, \delta_{ij},$$ so $A \operatorname{adj}(A)= \det(A)\, I$.
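The $\delta$-relation above can be checked numerically by forming the cofactor matrix $C$ and computing $A\,C^T$ directly; a short sketch on a $3\times3$ example (helper names `det2` and `minor` are mine):

```python
def det2(M):
    # determinant of a 2x2 matrix
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

def minor(M, i, j):
    # delete row i and column j
    return [row[:j] + row[j + 1:] for k, row in enumerate(M) if k != i]

A = [[2, 1, 0], [1, 3, 1], [0, 1, 2]]
n = 3

# cofactor matrix: c_ij = (-1)^(i+j) det(M_ij)
C = [[(-1) ** (i + j) * det2(minor(A, i, j)) for j in range(n)]
     for i in range(n)]

# [A C^T]_ij = sum_k a_ik * c_jk, which should be det(A) * delta_ij
ACt = [[sum(A[i][k] * C[j][k] for k in range(n)) for j in range(n)]
       for i in range(n)]
print(ACt)  # → det(A) * I
```

Row $i$ of $A$ dotted with row $j$ of $C$ gives $\det(A)$ exactly when $i=j$ and $0$ otherwise, which is the $\delta_{pq}$ relation in matrix form.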