41

I'd like to know how can be shown that $\det(A \otimes B) = \det(A)^m \det(B)^n$ when $A$ and $B$ are square matrices of size $n$ and $m$ respectively and $\otimes$ represents the Kronecker product of $A$ and $B$.

I've seen some proofs using eigenvalues of $A$ and $B$, but since not every matrix has eigenvalues (at least not in all rings), I'd like a more intrinsic proof, maybe using the fact that $M_{(nm)^2} \cong M_{n^2} \otimes M_{m^2}$.

It all comes down to showing that $\det(A \otimes I) = \det(A)^m$ and using some smart property (e.g. $(A\otimes B)(C\otimes D) = (AC) \otimes (BD)$), but I wasn't able to do even that.
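The identity in question is easy to sanity-check numerically. Below is a minimal sketch using NumPy's `np.kron`; the matrices and sizes are arbitrary examples, not anything from the question.

```python
# Numerical sanity check of det(A ⊗ B) = det(A)^m · det(B)^n.
# A and B are random example matrices of sizes n and m.
import numpy as np

rng = np.random.default_rng(0)
n, m = 3, 4
A = rng.standard_normal((n, n))
B = rng.standard_normal((m, m))

lhs = np.linalg.det(np.kron(A, B))
rhs = np.linalg.det(A) ** m * np.linalg.det(B) ** n
assert np.isclose(lhs, rhs)
```

Of course this only checks the identity over $\mathbb{R}$ for particular matrices; the question asks for a proof valid over an arbitrary commutative ring.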

Jonas Gomes
  • 3,244
  • Total antisymmetry and linearity of $\det$ will do the trick. Alternatively it can also be done using the $\varepsilon$ tensor. – Rogelio Molina Jun 08 '15 at 04:39
  • 1
    Rogelio, actually I've tried using antisymmetry of det, but I was not able to use linearity. Could you elaborate? – Jonas Gomes Jun 08 '15 at 04:40
  • \begin{equation} \begin{split} \det[A\otimes B] &= \det[(A\otimes I)(I\otimes B)] \\ &= \det(A\otimes I)\det(I\otimes B) \\ &= \det(A)^m\det(B)^n \end{split} \end{equation} – Car Loz Nov 29 '23 at 03:21

4 Answers

41

Here's another approach. Consider $A \otimes {\bf 1}_m$; we will show that this matrix can always be brought to the block form

$$\left( \begin{array}{cccc} A & 0 & \cdots & 0 \\ 0 & A & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & A \\ \end{array}\right) $$ To this end, consider the matrix $A$ with components $(a_{ij})$ in some basis, say $\{u_i\}$, of a vector space $V$ of dimension $n$ over a ring $R$. Consider also the identity ${\bf 1}_m$ on the vector space $W$ of dimension $m$ over $R$. We will use the basis $\{ u_i \otimes e_a \}$ for the space $V \otimes W$, where $i,j=1,\cdots,n$ and $a,b=1, \cdots, m$. Let us further choose an ordering for the basis; this ordering will be

$$ \{u_1 \otimes e_1, u_2\otimes e_1, \cdots ,u_n \otimes e_1, u_1 \otimes e_2, \cdots ,u_n \otimes e_m \} $$ Let us look at the form of the operator $A \otimes {\bf 1}_m$ in this basis; we shall see that it is the block form given above. Consider the action:

$$ (A \otimes {\bf 1}_m)\,u_{i}\otimes e_a = Au_i \otimes {\bf 1}_m e_a = \sum_{j,b} a_{ij}\delta_{ab}\, u_j \otimes e_b $$ This means that the matrix element in this basis is $(A \otimes {\bf 1}_m)_{ai,bj} = a_{ij}\delta_{ab}$. This is the block form we are aiming for: notice that this matrix element is nonzero only when $a=b$, that is, along the diagonal of an $m \times m$ array of blocks, and each diagonal block is the matrix $(a_{ij})$, which is the operator $A$ in the basis $\{u_i\}$. The determinant is independent of the basis chosen.

Now take the determinant of this block-diagonal matrix; it is easy to show that it equals $\det(A)^m$. Finally, as you pointed out yourself, write

$$ A \otimes B = (A \otimes {\bf 1})({\bf 1}\otimes B) $$ and use $\det(MN) = \det M \cdot \det N$. This works over any commutative ring.
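The block-diagonal claim can be checked concretely with NumPy: with the basis ordered so the $e$-index varies slowest, $A \otimes {\bf 1}_m$ becomes `np.kron(np.eye(m), A)`, i.e. $\operatorname{diag}(A,\dots,A)$ with $m$ blocks. The sizes below are arbitrary examples.

```python
# Sketch of the block-diagonal argument: kron(I_m, A) is block diagonal
# with m copies of A, and the two orderings of the tensor basis differ
# only by a permutation, so the determinants agree.
import numpy as np

rng = np.random.default_rng(1)
n, m = 3, 2
A = rng.standard_normal((n, n))

block_diag = np.kron(np.eye(m), A)      # diag(A, A, ..., A), m blocks
for a in range(m):                      # each diagonal block is A
    assert np.allclose(block_diag[a*n:(a+1)*n, a*n:(a+1)*n], A)

# A ⊗ 1_m and 1_m ⊗ A are permutation-similar, so both have det(A)^m:
assert np.isclose(np.linalg.det(np.kron(A, np.eye(m))),
                  np.linalg.det(A) ** m)
```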

  • I am guessing that the existence of this well know basis rests on the fact that the column vectors of A span $M_n$ (am I right?) – Jonas Gomes Jun 08 '15 at 05:05
  • Indeed, that is correct. – Rogelio Molina Jun 08 '15 at 05:09
  • That may be a problem if we are not in a field (if we were in a field and $A$ were singular, then $A \otimes 1$ would also be singular and hence the result would follow), but for an arbitrary ring I don't see how we could proceed. I've thought about replacing $A$ by $A+x I$ and looking at the polynomial identities, but since we are relying on extracting a basis from $A+ xI$, I don't see this approach working – Jonas Gomes Jun 08 '15 at 05:14
  • Why is it a problem if $A$ is singular? The proof remains valid as far as I can tell. – Rogelio Molina Jun 08 '15 at 05:33
  • If $A$ is singular the column vectors of $A$ no longer span $M_n$ – Jonas Gomes Jun 08 '15 at 05:43
  • 1
    Sorry! No, I see... I had misunderstood your previous statement. I apologise, it is late at night here and I am very tired. In fact you can always bring $ A \otimes {\bf 1}$ to the block form given above, regardless of the rank of $A$ or whether you are working in a field or a ring. The way to do it is to choose the basis $u_i \otimes e_a$ where $e_a$ is the canonical basis. – Rogelio Molina Jun 08 '15 at 05:46
  • In the worst case you have to choose some ordering of the basis and maybe some row operations could be necessary, but it is always possible to do so, and by trying one can see it is not hard either. – Rogelio Molina Jun 08 '15 at 05:47
  • Rogelio, could you elaborate on how one should change basis? (Maybe edit your answer) I was not able to follow your idea – Jonas Gomes Jun 09 '15 at 15:26
  • 1
    It is done, I have edited my answer, hopefully it is explained better now than before. – Rogelio Molina Jun 09 '15 at 17:47
  • I think the basis chosen is not quite right: we can instead pick the basis $\{u_1 \otimes e_1, u_2 \otimes e_1, \cdots , u_n \otimes e_1, \cdots , u_n \otimes e_m \}$, which provides the desired form. If we use a basis that 'groups' the $u$ together, we just get back the initial $A \otimes {\bf 1}_m$: that ordering creates $n$ blocks of size $m \times m$, whereas we want $m$ blocks of size $n \times n$, so we should 'group' the $e$ instead. Please correct me if this is wrong – hteica Nov 03 '20 at 03:30
8

HINT:

It's enough to show the identity holds for matrices with entries from $\mathbb{C}$. Moreover, it's enough to show it for complex diagonalizable matrices, a dense subset. So show that if $A$ is diagonalizable with eigenvalues $\lambda_i$ and $B$ diagonalizable with eigenvalues $\mu_j$, then $A\otimes B$ is diagonalizable with eigenvalues $\lambda_i \cdot \mu_j$. While here, you can also show that the eigenvalues of $A\otimes I_m + I_n \otimes B$ are $\lambda_i + \mu_j$.

orangeskid
  • 56,630
7

The most intrinsic proof is probably showing $\det(V\otimes W) \cong \det(V)^{\otimes\text{rk}(W)} \otimes \det(W)^{\otimes\text{rk}(V)}$, using the universal properties involved, and that under this (natural) isomorphism, $\det(f\otimes g)$ becomes $\det(f)^{\otimes\text{rk}(W)} \otimes \det(g)^{\otimes\text{rk}(V)}$. Here, $f\!: V \to V$ and $g\!: W \to W$ are endomorphisms (represented by $A$ resp. $B$). This also works over rings.

  • This is a really nice approach. Does it also do the characteristic polynomial of $A\otimes B$ ? – orangeskid Jun 08 '15 at 05:10
  • Thanks for your answer Engloutie. As I am not familiar with the notation you have used ($det(V \otimes W)$ for instance), I'd ask for a basic book recommendation to clear up my notations and further understand your proof. Could you provide one? – Jonas Gomes Jun 08 '15 at 05:23
  • 4
    @JonasGomes The determinant of a (free) module / vector space is just its highest exterior power, that is, $\det(M) = \Lambda^{\text{rk}(M)}M$. The reason is that the determinant of a morphism is the induced morphism on this, which is just multiplication by a scalar. – Thomas Poguntke Jun 08 '15 at 05:32
  • 1
    I don't understand the argument you are proposing: how do you show that $\det(V\otimes W)$ and $\det(V)^{\otimes\text{rk}(W)} \otimes \det(W)^{\otimes\text{rk}(V)}$ satisfy the same universal property, without picking a basis? And if you do it using a basis, then it is not at all obvious that the resulting isomorphism is natural, and indeed checking that $\det(f\otimes g)$ becomes $\det(f)^{\otimes\text{rk}(W)} \otimes \det(g)^{\otimes\text{rk}(V)}$ is exactly what you have to prove to show that it is natural. – Eric Wofsey Dec 16 '19 at 22:10
5

Use the commutation matrix: recall that $K_{m,n}(A\otimes B) = (B\otimes A)K_{m,n}$, and since $K_{m,n}$ is a permutation matrix it follows that $\det(A\otimes B)=\det(B\otimes A)$; in particular $\det(A\otimes I_m)=\det(I_m\otimes A)$. Then $$\det(A\otimes B)=\det\big((A\otimes I_m)(I_n\otimes B)\big)=\det(I_m\otimes A)\det(I_n\otimes B)=\det(A)^m\det(B)^n,$$ since $I_m\otimes A$ is a block diagonal matrix with all diagonal blocks equal to $A$, and the determinant of a block diagonal matrix equals the product of the determinants of the blocks.
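The commutation-matrix identity can be verified directly. In the sketch below, `commutation_matrix` is our own helper (not a library function), built from the defining property $K_{m,n}\,\mathrm{vec}(X)=\mathrm{vec}(X^T)$ for $X$ of shape $m\times n$ with column-stacking vec; the sizes are example values.

```python
# Build K_{m,n} explicitly and check K_{m,n}(A ⊗ B) = (B ⊗ A) K_{m,n}.
import numpy as np

def commutation_matrix(m, n):
    # K_{m,n} maps vec(X) to vec(X^T) for X of shape (m, n),
    # where vec stacks columns: vec(X)[j*m + i] = X[i, j].
    K = np.zeros((m * n, m * n))
    for i in range(m):
        for j in range(n):
            K[i * n + j, j * m + i] = 1.0
    return K

rng = np.random.default_rng(3)
n, m = 3, 2
A = rng.standard_normal((n, n))   # n x n
B = rng.standard_normal((m, m))   # m x m

K = commutation_matrix(m, n)
assert np.allclose(K @ np.kron(A, B), np.kron(B, A) @ K)

# K is a permutation matrix, so det(A ⊗ B) = det(B ⊗ A):
assert np.isclose(np.linalg.det(np.kron(A, B)),
                  np.linalg.det(np.kron(B, A)))
```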