
Question

Let $A\in \text{SL}(2,\mathbb{C})$, so $\det(A)=1$. Define the following (Pauli) matrices:

$$\sigma_0=\begin{pmatrix}1 & 0 \\ 0 & 1 \end{pmatrix}\qquad \sigma_1=\begin{pmatrix}0 & 1 \\ 1 & 0 \end{pmatrix}\qquad \sigma_2=\begin{pmatrix}0 & -i \\ i & 0 \end{pmatrix}\qquad \sigma_3=\begin{pmatrix}1 & 0 \\ 0 & -1 \end{pmatrix}$$

Now define the following $4\times 4$ matrix $L$, with indices $\mu,\nu\in\{0,1,2,3\}$:

$$L_{\mu\nu}\equiv \frac{1}{2}\text{Tr}\left[A\sigma_\mu A^\dagger \sigma_\nu\right]$$

What I am trying to prove is $\det (L)=1$, that's it. But I am having so much trouble. I have verified that it is true via Mathematica. In fact, the exact relation is:

$$\det (L)=\left|\det(A)\right|^4=1$$

Actually, it would be sufficient for my current purposes to show $\det(L)\geq 0$, but even that is very hard for me to show.


Mathematica Code.

ClearAll[s0, s1, s2, s3, s, A, L]; (* Define the Pauli matrices and a general 2x2 matrix A *)
s0 = {{1, 0}, {0, 1}};
s1 = {{0, 1}, {1, 0}};
s2 = {{0, -I}, {I, 0}};
s3 = {{1, 0}, {0, -1}};
s = {s0, s1, s2, s3};
A = {{a, b}, {c, d}};

(* Define L(A) entrywise through L_[mu,nu](A) = 1/2 Tr[A.s_mu.ConjugateTranspose[A].s_nu];
   list indices 1..4 correspond to mu,nu = 0..3 *)
L = Table[1/2 Tr[A.s[[mu]].ConjugateTranspose[A].s[[nu]]], {mu, 1, 4}, {nu, 1, 4}];

(* Evaluate the determinant explicitly, and put it in legible form *)
TraditionalForm[FullSimplify[Det[L]]]
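
If you just want a quick numerical sanity check rather than the symbolic FullSimplify, the following minimal sketch works with the definitions of s above (the names Arand and Lnum are only illustrative): draw a random complex matrix, rescale it to unit determinant, and confirm that $\det(L)$ comes out as $1$ to machine precision.

SeedRandom[7];
Arand = RandomComplex[{-1 - I, 1 + I}, {2, 2}];
Arand = Arand/Sqrt[Det[Arand]]; (* rescale so that Det[Arand] == 1 *)
Lnum = Table[1/2 Tr[Arand.s[[mu]].ConjugateTranspose[Arand].s[[nu]]], {mu, 1, 4}, {nu, 1, 4}];
Chop[Det[Lnum]] (* -> 1. *)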

  • How did you understand the first line of Dave's proof? – scitamehtam Jun 24 '21 at 16:14
  • @scitamehtam When I first read the answer a while ago, it somehow made sense to me, but now I don't see how it answers my question. Let me keep trying my hand at it. – Arturo don Juan Jun 25 '21 at 06:19
  • @scitamehtam After going through Dave's answer, I cannot see how it applies to my question. I don't know how it made sense to me before. In the meantime, I hope you can see that the brute-force calculation (as per my mathematica code) suffices to prove the relation. – Arturo don Juan Jun 25 '21 at 23:24
    Thanks for your answer. Your code also helped me save time :) – scitamehtam Jun 26 '21 at 00:00
    @scitamehtam No problem. I wish there were a more elegant way to show $\det L = 1$ here. I'm guessing you're also interested in proving that $SL(2,\mathbb{C} )$ is the double covering of $SO(1,3)^{\uparrow}$ via the mapping $A\rightarrow L_{\mu\nu}(A)$? – Arturo don Juan Jun 29 '21 at 01:09
  • @scitamehtam I understand now the answer written by KBDave. The only thing I cannot justify is writing an arbitrary element of $\textrm{SL}(2,\mathbb{C})$ as an exponential $A=e^{\vec{a}\cdot\vec{\tau}}$. – Arturo don Juan Jun 30 '21 at 03:59
  • Looking back at my own answer, I find I don't like it very much, and it is probably much cleaner to use the methods in the answer to this question: https://math.stackexchange.com/q/1316594/534616 – K B Dave Jun 30 '21 at 22:36
  • @scitamehtam take a look at the new answer, it's actually quite straightforward – Arturo don Juan Aug 18 '21 at 19:36

2 Answers


This can be proven via vectorization. We define the vectorization of a 2-by-2 matrix as the vector made by stacking columns:

$$ \mathrm{vec}(A) = \left(\begin{matrix}A_{00}\\A_{10}\\A_{01}\\A_{11}\end{matrix}\right), $$

which maps any 2-by-2 complex matrix to a vector in $\mathbb{C}^4$. It also maps the Frobenius inner product into a dot product,

$$ \langle A, B\rangle = \mathrm{tr}(A^\dagger B) = \sum_{i,j=1}^2 A^*_{ij}B_{ij} = \mathrm{vec}(A)^\dagger \mathrm{vec}(B),$$

and there is a nice relationship between the vectorization of a product of matrices and the Kronecker product. For complex $n$-by-$n$ matrices $A,B,C$:

$$ \mathrm{vec}(ABC) = \left(C^\mathrm{T}\otimes A\right) \mathrm{vec}(B). $$
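
Both identities are easy to spot-check numerically. Here is a minimal Mathematica sketch (the helper vec and the random test matrices are mine, not part of the answer):

vec[m_] := Flatten[Transpose[m]]; (* stack the columns of a matrix into a vector *)
SeedRandom[11];
{A2, B2, C2} = RandomComplex[{-1 - I, 1 + I}, {3, 2, 2}];
Chop[Tr[ConjugateTranspose[A2].B2] - Conjugate[vec[A2]].vec[B2]] (* Frobenius product as a dot product: -> 0 *)
Chop[Norm[vec[A2.B2.C2] - KroneckerProduct[Transpose[C2], A2].vec[B2]]] (* vec(ABC) identity: -> 0 *)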

Starting from the definition of $L$ (with $\tau_\mu = \sigma_\mu/\sqrt{2}$ such that $\langle \tau_\alpha, \tau_\beta\rangle = \delta_{\alpha\beta}$):

\begin{align*} L_{\mu\nu} &= \mathrm{tr}(A\tau_\mu A^\dagger \tau_\nu),\\ &= \langle \tau_\mu, A^\dagger\tau_\nu A \rangle,\\ &= \mathrm{vec}(\tau_\mu)^\dagger \mathrm{vec}\left(A^\dagger\tau_\nu A\right),\\ &= \mathrm{vec}(\tau_\mu)^\dagger\left(A^\mathrm{T}\otimes A^\dagger\right)\mathrm{vec}\left(\tau_\nu\right).\end{align*}

This shows that the $L_{\mu\nu}$ are the coefficients of $A^\mathrm{T}\otimes A^\dagger$ in the orthonormal basis given by $\{\mathrm{vec}(\tau_\mu)\}$. The determinant is therefore given by:

\begin{align*} \mathrm{det}(L) &= \mathrm{det}\left(A^\mathrm{T}\otimes A^\dagger\right) = \mathrm{det}(A^\mathrm{T})^2\mathrm{det}(A^\dagger)^2 = |\mathrm{det}(A)|^4. \end{align*}

Here we have used $\mathrm{det}(A\otimes B) = \mathrm{det}(A)^n\mathrm{det}(B)^m$ for an $m$-by-$m$ matrix $A$ and an $n$-by-$n$ matrix $B$.
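
As a sanity check, here is a short numerical sketch of the whole argument (variable names are illustrative): the matrix of traces coincides with $A^\mathrm{T}\otimes A^\dagger$ expressed in the orthonormal basis $\{\mathrm{vec}(\tau_\mu)\}$, and its determinant equals $|\det(A)|^4$.

vec[m_] := Flatten[Transpose[m]]; (* stack columns *)
sig = {IdentityMatrix[2], {{0, 1}, {1, 0}}, {{0, -I}, {I, 0}}, {{1, 0}, {0, -1}}};
tau = sig/Sqrt[2]; (* orthonormal w.r.t. the Frobenius inner product *)
SeedRandom[5];
Arand = RandomComplex[{-1 - I, 1 + I}, {2, 2}];
Ltr = Table[Tr[Arand.tau[[m]].ConjugateTranspose[Arand].tau[[n]]], {m, 4}, {n, 4}];
big = KroneckerProduct[Transpose[Arand], ConjugateTranspose[Arand]];
Lvec = Table[Conjugate[vec[tau[[m]]]].big.vec[tau[[n]]], {m, 4}, {n, 4}];
Chop[Norm[Flatten[Ltr - Lvec]]] (* -> 0: the two constructions agree *)
Chop[Det[Ltr] - Abs[Det[Arand]]^4] (* -> 0 *)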

GEB24
    wow, I have waited years for this. I always knew an elegant solution should exist for this, never thought it would be so straightforward. – Arturo don Juan Aug 18 '21 at 19:35

$\renewcommand{\vec}[1]{\boldsymbol{#1}}$ $\DeclareMathOperator{\Tr}{Tr}$

Let $\vec{\tau}=\mathrm{i}\vec{\sigma}$, $A=\mathrm{e}^{\vec{a}\cdot{\vec{\tau}}}$. Then the goal is to find $\det\mathsf{A}_1$, where $\mathsf{A}_t$ is the linear operator $$\mathsf{A}_t(x)=\mathrm{e}^{t\vec{a}\cdot\vec{\tau}}x\mathrm{e}^{-t\vec{a}^*\cdot\vec{\tau}}\text{.}$$ (Here $x=x^0+\vec{x}\cdot\vec{\tau}$—slightly different from your convention). That's because the four $\sigma$ matrices are an orthonormal basis for the space of $2\times 2$ complex matrices with respect to the inner product $(X, Y)\mapsto \tfrac{1}{2} \Tr X^{\dagger}Y$, so that $L_{\mu\nu}$ is the matrix of elements of $\mathsf{A}_1$ with respect to this basis.

By differentiating with respect to $t$ we find $$\frac{\mathrm{d}}{\mathrm{d}t}\ln \det \mathsf{A}_t=\Tr \mathsf{a}$$ where $\mathsf{a}$ is the linear operator $$\mathsf{a}(x)=\vec{a}\cdot\vec{\tau}x - x\vec{a}^*\cdot\vec{\tau}\text{.}$$ Written out in components, $\mathsf{a}(x)$ is given by $$[\mathsf{a}][x]= \begin{bmatrix}0 & (\vec{a}-\vec{a}^*)\cdot\\ \vec{a}-\vec{a}^* &-(\vec{a}+\vec{a}^*)\times \end{bmatrix} \begin{bmatrix}x^0 \\ \vec{x} \end{bmatrix}\text{.}$$ The diagonal components of this $4\times 4$ matrix vanish, so $\Tr \mathsf{a}=0$, so $\det \mathsf{A}_t$ is constant in $t$; since $\mathsf{A}_0$ is the identity map, $\det \mathsf{A}_t=1$ for all $t$, whence $\det\mathsf{A}_1=1$ as required.
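
Here is a brief numerical sketch of this argument, using the $\sigma_\mu$ basis rather than the answer's $(x^0,\vec{x})$ components (the trace of $\mathsf{a}$ is basis-independent; all variable names are illustrative): the matrix of $\mathsf{a}$ has vanishing trace, and the $L$ built from $A=\mathrm{e}^{\vec{a}\cdot\vec{\tau}}$ indeed has unit determinant.

sig = {IdentityMatrix[2], {{0, 1}, {1, 0}}, {{0, -I}, {I, 0}}, {{1, 0}, {0, -1}}};
tau = I Rest[sig]; (* tau_k = i sigma_k *)
SeedRandom[3];
avec = RandomComplex[{-1 - I, 1 + I}, 3];
aOp[x_] := avec.tau.x - x.(Conjugate[avec].tau); (* a(x) = (a.tau) x - x (a*.tau) *)
aMat = Table[1/2 Tr[ConjugateTranspose[sig[[m]]].aOp[sig[[n]]]], {m, 4}, {n, 4}];
Chop[Tr[aMat]] (* -> 0, so det A_t is constant in t *)
Amat = MatrixExp[avec.tau];
LofA = Table[1/2 Tr[Amat.sig[[m]].ConjugateTranspose[Amat].sig[[n]]], {m, 4}, {n, 4}];
Chop[Det[LofA]] (* -> 1. *)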

K B Dave
  • Could you please explain in more detail how the goal becomes finding $\det\mathsf{A}_1$? – scitamehtam Jun 24 '21 at 05:09
  • @scitamehtam added brief explanation – K B Dave Jun 27 '21 at 03:01
  • @KBDave Thanks for the update. I'm still trying to understand it fully, but in the meantime I have a concern. It seems you're saying that my $A\in \textrm{SL}(2,\mathbb{C})$ is to be represented by the exponential $\exp (\textbf{a}\cdot\vec{\tau})$, but not every element of $\textrm{SL}(2,\mathbb{C})$ can be written as an exponential, right? – Arturo don Juan Jun 29 '21 at 20:12
    @Arturo don Juan you're right, so it seems that this argument isn't finished without knowing that $\mathrm{SL}(2,\mathbb{C})$ is connected and using multiplication of exponentials and multiplicativity of the determinant to generate it. – K B Dave Jun 30 '21 at 22:33
  • @KBDave For the case when $A$ can be written as an exponential (of a traceless matrix), your answer seems so elegant. Maybe if we use the fact that any element of $\textrm{SL}(2,\mathbb{C})$ can be written as a product of a matrix belonging to $SU(2)$ (i.e. an exponential $\exp (\vec a \cdot \vec \tau)$), with a Hermitian matrix with determinant 1 and positive eigenvalues... – Arturo don Juan Jul 01 '21 at 01:49
  • @KBDave I deleted some comments because I realized that, although the arguments carry through basically unchanged if we use the decomposition $A=HU$ (we still get $\textrm{Tr}\,\mathsf{a} = 0$), what I cannot show is that $\det \mathsf{A}(t=0)=1$. – Arturo don Juan Jul 02 '21 at 20:47
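
For what it's worth, here is a quick numerical illustration of the decomposition mentioned in the last comments (helper names are mine, not from the comments): a random $A\in\mathrm{SL}(2,\mathbb{C})$ factors as $A=HU$ with $H$ positive-definite Hermitian of unit determinant and $U\in SU(2)$.

SeedRandom[2];
Arand = RandomComplex[{-1 - I, 1 + I}, {2, 2}];
Arand = Arand/Sqrt[Det[Arand]]; (* rescale so that Det[Arand] == 1 *)
H = MatrixPower[Arand.ConjugateTranspose[Arand], 1/2]; (* positive-definite Hermitian factor *)
U = Inverse[H].Arand; (* unitary factor *)
{Chop[Det[H]], Chop[Det[U]], Chop[Norm[U.ConjugateTranspose[U] - IdentityMatrix[2]]]}
(* -> {1., 1., 0} *)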