
This is Exercise 2.2 "Trivial fact of life" from Greg Moore and Nathan Seiberg's Lectures on RCFT, but as far as I can tell it's purely a complex analysis problem. The question goes:

Suppose we have four sets of linearly independent complex analytic (i.e. holomorphic) functions $\{f_i\}_{i=1}^N$, $\{g_i\}_{i=1}^N$, $\{h_i\}_{i=1}^M$, and $\{k_i\}_{i=1}^M$, such that $$ \sum_{i=1}^N f_i \bar{g}_i = \sum_{i=1}^M h_i \bar{k}_i.$$ Prove that $N=M$, and furthermore that there exists an invertible matrix $A$ such that $\vec{f} = A \vec{h}$ and $\vec{g} = (A^{-1})^\dagger \vec{k}$.

In the case $N=M=1$, we have $f(z)\bar{g}(\bar{z}) = h(z)\bar{k}(\bar{z})$, and dividing to isolate the meromorphic and antimeromorphic dependence gives us that $f(z)/h(z) = \bar{k}(\bar{z})/\bar{g}(\bar{z}) = c$, a constant (which can be seen by taking $z$ or $\bar{z}$ derivatives). Thus we are done.
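(To spell out the parenthetical step: $$\partial_{\bar z}\!\left(\frac{f(z)}{h(z)}\right)=0=\partial_{z}\!\left(\frac{\bar k(\bar z)}{\bar g(\bar z)}\right),$$ so the common value $c$ depends on neither $z$ nor $\bar z$, and then $f=c\,h$ and $g=\bar{c}^{-1}k$, which is the claim with $A=(c)$.)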

I'm having trouble dealing with the linear independence condition in general, as well as the complication that $z$ and $\bar{z}$ are not completely "independent variables," in the sense that we can't arbitrarily tune one without the other. In the case of completely independent real variables, this thread provides most of the answer (from their proof we can conclude that e.g. $\{g_i\}$ and $\{p_i\}$ by their notation have the same span, and everything else pretty much follows). Can that logic be rigorously extended to this case?

tundra
  • Although this looks like a complex analysis problem, the complex analysis is a red herring (misdirection). You have a set of $N$ vectors, $f_i$, in a vector space, $N$ vectors, $g_i$, in its dual space and similarly $M$ vectors and dual vectors, $h_i$ and $k_i$, respectively. The argument is entirely linear algebra (arguing about dimensions of finite dimensional spaces and changes of basis in that finite dimensional space). – Eric Towers Apr 29 '25 at 01:09
  • Thanks for your answer! In what sense are the $g_i$ elements of the dual space? How can I interpret them as linear functionals sending $f_i$ to the complex numbers? (Edit: I guess this is true for any fixed $z$. I'll think about it...) – tundra Apr 29 '25 at 01:12
  • "$N$ vector" as in $(f_1, \dots, f_N)$ and "dual $N$ vector" as in $(g_1, \dots, g_N)$. Maybe I should have typed "$N$-vector". Then $\vec{f} \cdot \vec{g} = \sum_{i=1}^N f_i \overline{g}_i$. – Eric Towers Apr 29 '25 at 01:15
  • This is as far as I had gotten: we can pick $z_1,\ldots, z_N$ such that the vectors given by $\vec{f}(z_j)$ are independent. If I could somehow "fix" these values of $z$ while letting $\bar{z}$ be free, then I could conclude that the span of the $\bar{\vec{g}}$ is contained in the span of the $\bar{\vec{h}}$, and vice versa. Then everything follows. Indeed, this is how it proceeds in the thread linked above where we have independent variables $x,y$. Doesn't this break in the case where our variables are $z$ and $\bar{z}$? – tundra Apr 29 '25 at 01:30
  • The $f_i$ are not "linearly independent at a point". They are "linearly independent", meaning there are no $\alpha_i \in \Bbb{C}$ such that $\sum_{i=1}^N \alpha_i f_i = 0$ except for the trivial assignment $\alpha_i = 0$. – Eric Towers Apr 29 '25 at 01:32
  • I understand, I was using the fact from this thread to pick out specific points so that I could just use linear algebra in $\mathbb{C}^N$ and treat this literally as a dot product. I guess you're saying this isn't necessary, so I'll go back to the drawing board. Thanks for your help. – tundra Apr 29 '25 at 01:38
  • So, you have the $f_i$, $N$ linearly independent "vectors" in the vector space of holomorphic functions. They span an $N$-dimensional subspace of the space of holomorphic functions. Similarly the $h_i$ span an $M$-dimensional subspace. – Eric Towers Apr 29 '25 at 01:40
  • The $g_i$ span an $N$-dimensional subspace of the intersection of (the dual to the space of holomorphic functions) intersect (the antiholomorphic functions). Similarly for the $k_i$ for an $M$-dimensional subspace. (It turns out the dual contains all the antiholomorphics, and maybe more things we need not worry about.) – Eric Towers Apr 29 '25 at 01:44
  • The equality tells us the various spans intersect. If they don't intersect, nothing in the span of the $f_i$ can be in the span of the $h_i$ and vice versa, and mutatis mutandis for $g_i$ and $k_i$. – Eric Towers Apr 29 '25 at 01:48
  • Previously you treated the functions as components of vectors $(f_1, \ldots, f_N)$, which at each $z\in\mathbb{C}$ is a vector in $\mathbb{C}^N$. Now it seems like you are treating each $f_i$ itself as a vector in the space of holomorphic functions. This seems reasonable to me, except that if we're working over the field $\mathbb{C}$, I don't see a natural way of treating each individual $g_i$ as a dual vector to the space of holomorphic functions. The product $f_i \bar{g}_i$ is a complex function of $z$, and not a complex number. – tundra Apr 29 '25 at 02:42
  • I agree that the spans intersect, but to prove that they are necessarily the same means finding $N$ linearly independent vectors shared between the spans. This was why I picked out those specific $z_j$ before (to compare the span of the $\bar{g}_i$ and that of the $\bar{k}_i$). – tundra Apr 29 '25 at 02:43
  • The crucial result needed here is that for analytic functions linear independence is equivalent to the nonvanishing of the Wronskian (as an analytic function - it may have individual zeroes of course); noting that differentiating the relations above keeps the conjugate analytic functions invariant and using conjugate differentiation keeps the analytic functions invariant, one can separate the functions as in the $N=M=1$ case and get that various matrices are both analytic and conjugate analytic hence constant etc – Conrad Apr 29 '25 at 06:10

1 Answer


Let $A_k(f_1,\dots,f_N)$ be the $k \times N$ matrix whose rows are the derivatives $(f_1^{(l)},\dots,f_N^{(l)})$ for $l=0,\dots,k-1$, where as usual the zeroth derivative is the function itself.
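Written out, with rows indexed by the order of differentiation, this is the matrix $$A_k(f_1,\dots,f_N)=\begin{pmatrix} f_1 & f_2 & \cdots & f_N\\ f_1' & f_2' & \cdots & f_N'\\ \vdots & \vdots & & \vdots\\ f_1^{(k-1)} & f_2^{(k-1)} & \cdots & f_N^{(k-1)} \end{pmatrix}.$$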

The special case $k=N$ gives rise to a square matrix whose determinant is called the Wronskian of the functions. It is obvious that if the functions are linearly dependent the Wronskian is identically zero, while a fundamental theorem is that if the functions are linearly independent on some domain, the Wronskian is not identically zero (though of course it may have zeroes as an analytic function).

Note that this result is special to analytic functions: for example, compactly supported smooth functions on the real line with disjoint supports are linearly independent but have identically zero Wronskian.

The proof is not hard. Assuming the Wronskian vanishes identically, pick a maximal subset of the functions whose Wronskian does not vanish identically (such a subset exists: since the functions are linearly independent none of them is identically zero, and the Wronskian of a single function is the function itself, so any one of them has nonzero Wronskian by itself). By continuity there is a small open ball on which that Wronskian has no zeroes, and a simple manipulation then shows that any other function in the set is a linear combination of those in the subset on that small ball, hence everywhere by analytic continuation, contradicting linear independence.
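As a quick sanity check of this criterion (purely illustrative, not part of the proof), here is a small Python/sympy sketch; the `wronskian` helper is defined below for this purpose, though sympy also provides one:

```python
# Sanity check of the Wronskian criterion for analytic functions,
# using sympy for symbolic differentiation and determinants.
import sympy as sp

z = sp.symbols('z')

def wronskian(funcs):
    """det A_n in the notation above: determinant of the square matrix
    whose rows are the 0th, ..., (n-1)th derivatives of the functions."""
    funcs = [sp.sympify(f) for f in funcs]
    n = len(funcs)
    rows = [[sp.diff(f, z, l) for f in funcs] for l in range(n)]
    return sp.simplify(sp.Matrix(rows).det())

print(wronskian([1, z, z**2]))  # 2: independent, Wronskian a nonzero constant
print(wronskian([z, 3*z]))      # 0: dependent, Wronskian identically zero
print(wronskian([z, z**2]))     # z**2: independent, with an isolated zero at 0
```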

Coming back to the problem: differentiating the relation $$\sum_{i=1}^N f_i \bar{g}_i = \sum_{i=1}^M h_i \bar{k}_i$$ $k-1$ times and conjugate differentiating it $l-1$ times, and noting that differentiation leaves the $\bar g_j, \bar k_j$ untouched while conjugate differentiation leaves the $f_j, h_j$ untouched, we get the relation (both sides being $k\times l$ matrices) $$A_k(f_1,\dots,f_N)\,\overline{A^t_l(g_1,\dots,g_N)}=A_k(h_1,\dots,h_M)\,\overline{A^t_l(k_1,\dots,k_M)}$$
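Concretely, applying $\partial^{a}\bar{\partial}^{\,b}$ to the original relation, for $a=0,\dots,k-1$ and $b=0,\dots,l-1$, gives $$\sum_{i=1}^N f_i^{(a)}\,\overline{g_i^{(b)}}=\sum_{i=1}^M h_i^{(a)}\,\overline{k_i^{(b)}},$$ which is exactly the $(a+1,b+1)$ entry of the matrix identity above.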

Now take $k=N$ and $l=M$ in this relation. By the above, the determinants (Wronskians) of $A_{N}(f_1,\dots,f_N)$ and $\overline{A^t_{M}(k_1,\dots,k_M)}$ do not vanish identically, so we can invert the corresponding matrices (in the realm of meromorphic/conjugate meromorphic functions, of course) and we get (both sides being now $N \times M$ matrices) $$\overline{A^t_{M}(g_1,\dots,g_N)\,\left(A^t_{M}(k_1,\dots,k_M)\right)^{-1}}=A_{N}(f_1,\dots,f_N)^{-1}\,A_{N}(h_1,\dots,h_M)=B_{N,M}=(b_{nm})$$

Since the LHS is a matrix of conjugate meromorphic functions and the RHS is a matrix of meromorphic functions, the equality means that $B_{N,M}$ is a constant matrix, and since $$A_{N}(h_1,\dots,h_M)=A_{N}(f_1,\dots,f_N)\,B_{N,M}$$ we get that each of $(h_1,\dots,h_M)$ is a linear combination of $(f_1,\dots,f_N)$. Of course the reverse is also true by symmetry, so the two spans coincide and $N=M$ (since both sets are linearly independent), the matrix $B_{N,M}$ is invertible and is (more or less) the required $A$ of the OP, while the relation for the other set is immediate by transposing and conjugating the equality $$\overline{A^t_{M}(g_1,\dots,g_N)\,\left(A^t_{M}(k_1,\dots,k_M)\right)^{-1}}=B_{N,M}$$ and we are done!
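For completeness, here is the bookkeeping (only sketched above) that produces the OP's $A$ from $B:=B_{N,M}$. The first row of $A_N(h)=A_N(f)\,B$ reads $h_m=\sum_n f_n\,b_{nm}$, i.e. $\vec h=B^t\vec f$; conjugating $\overline{A^t_M(g)\,(A^t_M(k))^{-1}}=B$ gives $A^t_M(g)=\bar B\,A^t_M(k)$, whose first columns say $\vec g=\bar B\,\vec k$. Hence $$\vec f=(B^t)^{-1}\vec h,\qquad \vec g=\bar B\,\vec k,$$ so the required matrix is $A=(B^t)^{-1}$, and indeed $(A^{-1})^\dagger=(B^t)^\dagger=\bar B$.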

Conrad