6

For a skew-symmetric matrix $A$ (meaning $A^T=-A$), the Pfaffian is defined by the equation $(\text{Pf}\,A)^2=\det A$. It is my understanding that this is defined for anti-symmetric matrices because it is known that the determinant of an anti-symmetric matrix is always the square of a polynomial in the entries of the matrix.

Now, skew-symmetry is sufficient to prove that the determinant is a square of a polynomial, but it is not necessary. The simplest example is the $2n\times 2n$ matrix $A=a I_{2n}$ with $a\in\mathbb{C}$ and $I_k$ the $k\times k$ identity matrix. The determinant is $\det A = a^{2n} = (a^n)^2$. Of course, for $a\neq 0$, $A$ is not skew-symmetric.
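For the skew-symmetric case itself, the fact is easy to verify symbolically in small dimensions. Here is a quick sympy sketch for a generic $4\times 4$ skew-symmetric matrix, where the Pfaffian is the polynomial $af-be+cd$:

```python
# Symbolic check with sympy: for a generic 4x4 skew-symmetric matrix,
# det A is the square of the Pfaffian af - be + cd.
import sympy as sp

a, b, c, d, e, f = sp.symbols('a b c d e f')
A = sp.Matrix([
    [0,  a,  b,  c],
    [-a, 0,  d,  e],
    [-b, -d, 0,  f],
    [-c, -e, -f, 0],
])
pf = a*f - b*e + c*d  # Pfaffian of this 4x4 skew-symmetric matrix
assert sp.expand(A.det() - pf**2) == 0
```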

I have a few questions about this.

  1. Is there a generalization of a Pfaffian for any matrix whose determinant is a square of a polynomial?
  2. Is there a characterization (or some known set of properties) of matrices whose determinants are squares of polynomials?
  3. (Edit) Are there any known necessary and sufficient conditions for a matrix to have its determinant be the square of a polynomial (aside from skew-symmetry being sufficient)?

(Edit 2) For those who are curious, these questions arise from a problem from physics I am working on. I have a certain class of matrices whose characteristic polynomials (which arise as the determinant of a non-skew-symmetric matrix) appear to be the squares of Chebyshev polynomials. If I could prove that these characteristic polynomials must be squares of polynomials (using properties of the matrix) then I may be able to use some of the properties attributed to Pfaffians (or the proper generalization to non-skew-symmetric matrices) to confirm that they are indeed squared Chebyshev polynomials.

(Edit 3) To be as concrete as possible, I am looking for any information (e.g., answers to questions 1-3) on the set $$\{A\in\mathcal{M}_n(\mathbb{C}): \det A = p(\{a_{ij}\})^2\text{ with }p\text{ a polynomial} \}$$ where $\mathcal{M}_n(\mathbb{C})$ is the set of $n\times n$ complex matrices and $a_{ij}$ is the $i,j$'th entry of $A$.

  • 1
    Determinants can be squares for various random reasons. Are you asking about certain families of matrices whose determinants are squares? Otherwise I'd say it's a rather vague question. – darij grinberg Nov 27 '18 at 19:44
  • Presumably you don't really mean "a matrix" (with numerical entries), but rather a set (maybe a linear space?) of matrices. – Robert Israel Nov 27 '18 at 19:44
  • A small precision (see what I have enclosed between brackets): "the determinant of an anti-symmetric matrix [of even size] is always a square of a polynomial in the entries of the matrix." – Jean Marie Nov 27 '18 at 19:46
  • @JeanMarie $0$ is also a square of a polynomial. – Robert Israel Nov 27 '18 at 19:51
  • @Robert Israel : seen like that ... I can but agree. – Jean Marie Nov 27 '18 at 19:52
  • 1
    I don't know if there exists an extension of Pfaffians to matrices other than the antisymmetric ones. But if you are interested in these polynomials, here is a paper that enlarges the point of view, in particular by using exterior algebra: http://www.kurims.kyoto-u.ac.jp/~kyodo/kokyuroku/contents/pdf/1302-14.pdf – Jean Marie Nov 27 '18 at 19:59
  • @darijgrinberg Yes, maybe I am being a bit vague. I'm interested to know whether there are results regarding necessary and/or sufficient conditions for a matrix to have its determinant be the square of a polynomial. For example, skew-symmetry is sufficient but not necessary. – UglyMousanova19 Nov 27 '18 at 23:17
  • 1
    @JeanMarie Thanks for the response. I have actually read quite a bit on Pfaffians in the context of Soliton theory (in particular the book "The Direct Method in Soliton Theory" by Hirota - same author as that paper you linked). They have many nice properties that could be very useful, but are defined only for anti-symmetric matrices (and rely on this fact quite heavily it seems). – UglyMousanova19 Nov 27 '18 at 23:21
  • 1
    Maybe a way to rephrase your question to appeal to those who deem it vague: Let $R$ be the ring of polynomials in matrix entries, so $\det\in R$. Let us call an ideal $I\subseteq R$ Pfaffian if $\det$ becomes a square in $R/I$ and denote by $J$ the intersection of all Pfaffian ideals. What is $J$? It characterizes the largest subvariety of our matrix space where we can define something like a Pfaffian. And also a question: Can you name a Pfaffian ideal that does not contain the vanishing ideal of all skew-symmetric matrices? – Jesko Hüttenhain Nov 27 '18 at 23:25
  • 1
    @JeskoHüttenhain: What about the ideal defining $\operatorname{SL}_n\left(K\right)$? That would be Pfaffian, too. – darij grinberg Nov 28 '18 at 03:34
  • I don't know if this recent PhD thesis which uses pfaffians has some interest for you : https://www.lorentz.leidenuniv.nl/beenakkr/mesoscopics/theses/fulga/fulga.pdf – Jean Marie Nov 28 '18 at 10:42
  • Of course there are a lot of matrices (in general not antisymmetric) whose characteristic polynomial is a square: it suffices to take $P^{-1}\operatorname{diag}(\lambda_1,\lambda_2,\dots,\lambda_n,\lambda_1,\lambda_2,\dots,\lambda_n)P$ for any $\lambda_1,\lambda_2,\dots,\lambda_n$ and any invertible $P$ – Jean Marie Nov 28 '18 at 17:29
  • @JeanMarie I'm not sure if that's true in general. It is true that if $A= P^{-1} \text{diag}(\lambda_1,\ldots,\lambda_n,\lambda_1,\ldots,\lambda_n) P$ then $\det A = \prod_{i=1}^n \lambda_i^2$. Thus $\sqrt{\det A}$ is a polynomial in the eigenvalues, but not necessarily in the matrix elements. This would work if the eigenvalues were polynomials of the matrix elements, but that is of course not (in general) true. – UglyMousanova19 Nov 28 '18 at 20:20
  • You are right: "any invertible $P$" is much too optimistic: one must have a particular $P$, such as $P=I+N$ where $N$ is strictly upper triangular. I am going to explain what I mean in the form of an answer, because it's too narrow here to write down matrices. – Jean Marie Nov 28 '18 at 20:44
  • In fact I haven't found anything interesting. Sorry. – Jean Marie Nov 28 '18 at 21:59
  • @JeanMarie No problem, I really appreciate the help! – UglyMousanova19 Nov 29 '18 at 16:06

1 Answer

1

Only a partial answer. The problem with defining the Pfaffian of a matrix whose determinant is the square of a polynomial is that the sign of the Pfaffian may not be well defined. For example, one may have two matrices $A$ and $B$ such that $\det A=\det B=(\text{polynomial})^2$ but $\text{Pf}\,A=-\text{Pf}\,B$. A possible approach is to think in terms of unitary transformations and equivalence classes.
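The sign ambiguity already shows up in the smallest case. A minimal numpy sketch: for $2\times 2$ antisymmetric matrices, $A$ and $-A$ share a determinant but their Pfaffians have opposite signs.

```python
# Sign ambiguity in miniature: for A = [[0, a], [-a, 0]] the Pfaffian is a,
# so A and -A have the same determinant a^2 but Pfaffians of opposite sign.
import numpy as np

A = np.array([[0.0, 2.0], [-2.0, 0.0]])   # Pf A = 2
B = -A                                     # Pf B = -2, still antisymmetric
assert np.isclose(np.linalg.det(A), np.linalg.det(B))  # both determinants are 4
```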

Let $A$ be a $2n\times 2n$ matrix, not necessarily antisymmetric. Let $\mathcal U(A)$ be the set of all matrices $B=U A U^\dagger$ unitarily equivalent to $A$. Now consider the subset $\bar{\mathcal U}(A)\subset\mathcal U(A)$ of those matrices which are antisymmetric. It is clear that all matrices in $\mathcal U(A)$ have the same determinant, and all matrices in $\bar{\mathcal U}(A)$ have the same Pfaffian. Therefore one can define the Pfaffian of $A$ as the Pfaffian of any $B\in\bar{\mathcal U}(A)$. In short, one can define

$$\text{Pf}\,A=\text{Pf}\,(U A U^\dagger)$$

if there exists a unitary matrix $U$ such that $U A U^\dagger$ is antisymmetric. (The unitary matrix need not be unique, of course.)
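As a numeric sanity check of this definition, here is a sketch using an ad-hoc recursive `pfaffian` helper (expansion along the first row; this is not a library function, just an illustration for small matrices). It verifies $\text{Pf}(A)^2=\det A$ for a random antisymmetric matrix, and that conjugating by a real rotation $Q$ (a unitary with $\det Q=1$, which keeps the matrix antisymmetric) leaves the Pfaffian unchanged, consistent with defining $\text{Pf}$ on the orbit.

```python
import numpy as np

def pfaffian(A):
    """Pfaffian of a (2n)x(2n) antisymmetric matrix via recursive
    expansion along the first row (fine for small sizes)."""
    n = A.shape[0]
    if n == 0:
        return 1.0
    if n % 2 == 1:
        return 0.0  # odd-size antisymmetric matrices have Pfaffian 0
    total = 0.0
    for j in range(1, n):
        keep = [k for k in range(n) if k not in (0, j)]  # drop rows/cols 0 and j
        sign = (-1.0) ** (j - 1)
        total += sign * A[0, j] * pfaffian(A[np.ix_(keep, keep)])
    return total

rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))
A = M - M.T                          # random 4x4 antisymmetric matrix
assert np.isclose(pfaffian(A) ** 2, np.linalg.det(A))

# A real rotation Q is unitary with det Q = 1, and Q A Q^T = Q A Q^dagger
# is again antisymmetric; the Pfaffian is unchanged on this orbit.
t = 0.7
Q = np.eye(4)
Q[:2, :2] = [[np.cos(t), -np.sin(t)], [np.sin(t), np.cos(t)]]
B = Q @ A @ Q.T
assert np.isclose(pfaffian(B), pfaffian(A))
```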

Maybe this definition is a little tautological, but it makes sense from the point of view of physics: unitary matrices are associated with (unitary) symmetries and therefore do not affect the values of measurable quantities. For example, the determinant of a Hamiltonian is invariant under unitary symmetries. With the definition above, the Pfaffian of a Hamiltonian has the same property.

sintetico
  • 432