
I am reading a paper (on random matrix theory) and it's using a lot of notation like this:

Given a Hermitian matrix $H$ and an approximate $\delta$ function $\theta_\eta(x) = \frac{\eta}{x^2 + \eta^2} = \Im \frac{1}{x - i \eta}$, we define $\theta_\eta(H)$, and I think the way the paper defines it is as $\Im \frac{1}{H - i \eta}$. In other words, given a function $f(x)$ in which it formally makes sense to replace the real variable with a matrix, we define the function $f(H)$ by formally replacing the $x$'s with $H$'s.
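For concreteness, this is the kind of numerical check I have in mind for that definition (the matrix, the value of $\eta$, and the numpy setup are just my own toy example, not anything from the papers):

```python
import numpy as np

eta = 0.1
rng = np.random.default_rng(0)

# A small random Hermitian matrix H (my own toy example).
X = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
H = (X + X.conj().T) / 2

# "Formal replacement": theta_eta(H) = Im (H - i*eta)^{-1}, where the imaginary
# part of a matrix M is taken to mean (M - M*)/(2i).
R = np.linalg.inv(H - 1j * eta * np.eye(4))
theta_formal = (R - R.conj().T) / 2j

# Eigenvalue recipe: apply theta_eta to the eigenvalues and conjugate back.
evals, U = np.linalg.eigh(H)
theta_spectral = U @ np.diag(eta / (evals**2 + eta**2)) @ U.conj().T

print(np.allclose(theta_formal, theta_spectral))  # True
```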

But then the paper also says things like $\chi(H)$, where $\chi$ is the characteristic function of an interval of the real line, and also $\chi * \theta_\eta(H)$. I cannot see any way to formally replace a real input with a matrix input in a characteristic function, so I am left very confused.

Any assistance is appreciated.

EDIT: Here are the papers I am reading:

https://arxiv.org/pdf/1007.4652.pdf

https://arxiv.org/pdf/1102.0057.pdf

  • I strongly recommend that you give a link to the paper in the body of your question. Thanks – user0410 Apr 15 '20 at 18:26
  • On the space of Hermitian matrices it is common to define $f(A)$ as $f(A)=Uf(D)U^{T}$, where $A=UDU^{T}$ is the spectral decomposition of $A$ and $f(D)=\text{diag}(f(\lambda_j))$ (remember that the eigenvalues $\lambda_j$ are real). As @user0410 recommended, it will be helpful to have the paper at hand with a link. – RLC Apr 15 '20 at 18:30
  • @RLC I think this did the trick, thank you. – Van Latimer Apr 15 '20 at 19:07
  • @RLC please consider turning your comment into an answer, so that OP may decide which answer to accept (your answer was much earlier, and is much more succinct). – Josse van Dobben de Bruyn Apr 16 '20 at 00:04
  • @JossevanDobbendeBruyn Thanks for the recommendation. – RLC Apr 16 '20 at 13:10

2 Answers


Seeing as you also know some functional analysis, let me add another answer from the functional analytic perspective. In operator theory, this kind of construct is called functional calculus.

Construction (see for instance [Mur90, Thm 2.1.13], [Con07, Thm VIII.2.6], or any other textbook covering the basics of $C^*$-algebras).

Let $\mathcal A$ be a $C^*$-algebra, and let $a \in \mathcal A$ be a normal element ($aa^* = a^*a$). Let $C^*(a) \subseteq \mathcal A$ denote the $C^*$-subalgebra generated by $1$ and $a$, that is, the closed linear span of $\{a^k (a^*)^\ell \, : \, k,\ell \in \mathbb{N}_0\}$. Then $C^*(a)$ is a commutative $C^*$-algebra (but only because $a$ and $a^*$ commute), and one can show that its spectrum is homeomorphic with $\sigma(a)$. Therefore the Gelfand transform gives us an isometric $*$-isomorphism $C^*(a) \cong C(\sigma(a))$. The composition $$ C(\sigma(a)) \stackrel{\sim}{\longrightarrow} C^*(a) \hookrightarrow \mathcal A $$ defines a $*$-homomorphism $\varphi_a : C(\sigma(a)) \to \mathcal A$ with the following properties:

  • $\varphi_a$ is isometric;

  • $\text{ran}(\varphi_a) = C^*(a)$;

  • $\varphi_a(\Bbb{1}) = 1$;

  • $\varphi_a(z) = a$, where $z : \sigma(a) \to \mathbb{C}$ denotes the inclusion.

If $p \in \mathbb{C}[z,\overline{z}]$ is a polynomial in $z$ and $\overline{z}$, then $p(a,a^*)$ coincides with $\varphi_a(p)$, because $\varphi_a$ is a $*$-homomorphism and $\varphi_a(z) = a$. This leads to the following definition:
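Here is a minimal numerical illustration of this compatibility for matrices; the particular normal matrix and the polynomial $p(z,\overline{z}) = z^2\overline{z} + 3\overline{z} - 2$ are arbitrary choices of mine, not anything canonical:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5

# A normal (non-Hermitian) matrix a = Q diag(mu) Q* with complex eigenvalues mu.
Q, _ = np.linalg.qr(rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)))
mu = rng.standard_normal(n) + 1j * rng.standard_normal(n)
a = Q @ np.diag(mu) @ Q.conj().T

# p(z, zbar) = z^2 * zbar + 3 * zbar - 2, evaluated
# (1) in the algebra, replacing z -> a and zbar -> a*:
p_alg = a @ a @ a.conj().T + 3 * a.conj().T - 2 * np.eye(n)

# (2) through the functional calculus, i.e. on the spectrum, then conjugated back:
p_spec = Q @ np.diag(mu**2 * np.conj(mu) + 3 * np.conj(mu) - 2) @ Q.conj().T

print(np.allclose(p_alg, p_spec))  # True
```

The two computations land on the same matrix precisely because the diagonal factors commute, which is exactly where normality of $a$ enters.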

Definition (continuous functional calculus). If $f : \sigma(a) \to \mathbb{C}$ is any continuous function, then we define $$ f(a) := \varphi_a(f). $$ This makes it possible to "apply" any continuous function $f : \sigma(a) \to \mathbb{C}$ to $a$. To see that this makes sense, think of it in the following way: by the Stone–Weierstrass theorem, the polynomials in $z$ and $\overline{z}$ are dense in $C(\sigma(a))$, so the definition simply takes a continuous function $f : \sigma(a) \to \mathbb{C}$, approximates it uniformly with polynomials, and lifts the approximation to the $C^*$-algebra $\mathcal A$.
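To see the approximation step in action for a Hermitian matrix, here is a sketch using Chebyshev interpolants of $f(x) = |x|$ (the degrees, the matrix, and the choice of $f$ are mine; any continuous function on the spectrum would do):

```python
import numpy as np
from numpy.polynomial import chebyshev as C

rng = np.random.default_rng(2)
n = 6

# A Hermitian (here real symmetric) matrix rescaled so its spectrum lies in [-1, 1].
X = rng.standard_normal((n, n))
A = (X + X.T) / 2
A = A / np.linalg.norm(A, 2)

evals, U = np.linalg.eigh(A)
fA = U @ np.diag(np.abs(evals)) @ U.T            # f(A) for f(x) = |x|, via the spectrum

def cheb_matrix_eval(coef, A):
    """Evaluate sum_k coef[k] * T_k(A) using the Chebyshev three-term recurrence."""
    I = np.eye(A.shape[0])
    T_prev, T_curr = I, A
    out = coef[0] * I + coef[1] * A
    for c in coef[2:]:
        T_prev, T_curr = T_curr, 2 * A @ T_curr - T_prev
        out = out + c * T_curr
    return out

for deg in (4, 16, 64):
    p = C.Chebyshev.interpolate(np.abs, deg)     # uniform approximation of |x| on [-1, 1]
    pA = cheb_matrix_eval(p.coef, A)             # plug the matrix into the polynomial
    print(deg, np.linalg.norm(pA - fA, 2))       # operator-norm error shrinks as deg grows
```

By the isometry of $\varphi_a$, the operator-norm error equals the largest error on the eigenvalues, which is bounded by the uniform error on $[-1,1]$ and hence tends to $0$.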

In the finite-dimensional case, the spectrum $\sigma(a)$ is simply the set of eigenvalues, and it is reasonably easy to see that the outcome of the above procedure does not differ from the answer given by RLC in the comments. Indeed, in this setting $\sigma(a)$ is a finite Hausdorff space, so every function $\sigma(a) \to \mathbb{C}$ is automatically continuous. Furthermore, every such function can be represented by an appropriate interpolating polynomial in $z$, so we don't even need approximation — one may determine $f(a)$ by choosing a polynomial $p \in \mathbb{C}[z]$ that agrees with $f$ on the eigenvalues of $a$, and subsequently setting $f(a) := p(a)$. This has, of course, the same effect as the procedure described by RLC.
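A quick numpy sanity check of the interpolation argument; the matrix size, the function $f(x)=e^x$, and the use of `np.polyfit` for the interpolating polynomial are my own choices:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 4

X = rng.standard_normal((n, n))
A = (X + X.T) / 2                       # real symmetric, hence Hermitian
evals, U = np.linalg.eigh(A)            # eigenvalues are distinct almost surely

f = np.exp                              # any function; only its values on the spectrum matter

# Interpolating polynomial p of degree n-1 with p(lambda_i) = f(lambda_i).
coeffs = np.polyfit(evals, f(evals), n - 1)   # highest power first

# Evaluate p at the matrix A by Horner's rule.
pA = np.zeros_like(A)
for c in coeffs:
    pA = pA @ A + c * np.eye(n)

# Compare with the spectral-decomposition definition f(A) = U f(D) U^T.
fA = U @ np.diag(f(evals)) @ U.T
print(np.allclose(pA, fA))              # True
```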

To see just how powerful the $*$-homomorphism $\varphi_a$ is, I should point out that the finite-dimensional spectral theorem can be proved solely from the properties of $\varphi_a$ listed above; see the first part of this answer.


Finally, I should point out that there are other types of functional calculus, apart from the "continuous" functional calculus outlined above.

  • If $\mathcal A$ is only a (complex) Banach algebra, then one has to replace the continuous functions by an appropriate space of holomorphic functions, and one obtains the holomorphic functional calculus. See also [Rud91, §10.21–10.32], or [Con07, §VII.4], or many other textbooks on functional analysis.

  • On the other hand, if $\mathcal A$ is the algebra $B(\mathcal H)$ of bounded linear operators on a complex Hilbert space $\mathcal H$, then one can even obtain a Borel functional calculus, making it possible to apply any bounded Borel measurable function $\sigma(a) \to \mathbb{C}$ to $a$. For instance, in functional analysis you may well find something like $\chi(A)$, where $\chi$ is the indicator function of some subset of $\mathbb{C}$ and $A$ is a (bounded) normal operator on an infinite-dimensional Hilbert space. For further reading, see the second part of this answer, or [Mur90, §2.5], or [Con07, Theorem IX.2.3], or even [Rud91, §12.24].¹ A small finite-dimensional illustration of $\chi(A)$ follows the footnote below.

¹: I don't recommend Rudin for operator theory, as there are much clearer texts, but the results are there.
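As a concrete finite-dimensional glimpse of the Borel calculus: for a Hermitian matrix, $\chi_I(A)$ is just the orthogonal projection onto the eigenspaces with eigenvalues in $I$. A small sketch (the matrix and the interval $I = [0,\infty)$ are arbitrary choices of mine):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 6

X = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = (X + X.conj().T) / 2
evals, U = np.linalg.eigh(A)

# chi = indicator of the interval [0, infinity); chi(A) is then the orthogonal
# (spectral) projection onto the eigenspaces with non-negative eigenvalues.
chi = (evals >= 0).astype(float)
P = U @ np.diag(chi) @ U.conj().T

print(np.allclose(P @ P, P))                                  # idempotent
print(np.allclose(P, P.conj().T))                             # self-adjoint
print(int(round(np.trace(P).real)), int(np.sum(evals >= 0)))  # rank = number of eigenvalues >= 0
```

This is presumably the kind of object the papers denote $\chi(H)$; note also that $\chi * \theta_\eta$ is a smooth function, so $\chi * \theta_\eta(H)$ already falls under the continuous calculus.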


References.

[Con07]: John B. Conway, A Course in Functional Analysis, Second Edition (2007), Graduate Texts in Mathematics 96, Springer.

[Mur90]: Gerard J. Murphy, $C^*$-algebras and operator theory (1990), Academic Press.

[Rud91]: Walter Rudin, Functional Analysis, Second Edition (1991), McGraw–Hill.


Define $\mathcal{H}_n$ as the space of $n\times n$ Hermitian matrices, and for $A\in\mathcal{H}_n$ denote by $\lambda_1(A)\geq \cdots\geq \lambda_n(A)$ its eigenvalues (all real, since $A$ is Hermitian). Also write $S(A)=\{\lambda_1(A),...,\lambda_n(A)\}$.

For $A\in\mathcal{H}_n$ and any function $f$ defined on $S(A)$, the matrix $f(A)\in\mathcal{H}_n$ is defined as $$ f(A) = Pf(D)P^{*}, $$ where $A = PDP^{*}$ is the spectral decomposition of $A$ (with $P$ unitary and $D = \text{diag}(\lambda_1(A),...,\lambda_n(A))$) and $$ f(D) = \text{diag}(f(\lambda_1(A)),...,f(\lambda_n(A))). $$ In this way you can define, for example, the matrices $e^A$, $\log(A)$ (for $A>0$) or $(I+A)^{-1}$.
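For instance, here is a short numpy/scipy check that this recipe reproduces the usual matrix functions (the positive definite test matrix is an arbitrary choice of mine, just so that $\log(A)$ is well defined):

```python
import numpy as np
from scipy.linalg import expm, logm

rng = np.random.default_rng(5)
n = 4

B = rng.standard_normal((n, n))
A = B @ B.T + np.eye(n)                 # Hermitian and positive definite, so log(A) makes sense

evals, P = np.linalg.eigh(A)

def f_of_A(f):
    """f(A) = P diag(f(lambda_1), ..., f(lambda_n)) P*, as defined above."""
    return P @ np.diag(f(evals)) @ P.conj().T

print(np.allclose(f_of_A(np.exp), expm(A)))                                           # e^A
print(np.allclose(f_of_A(np.log), logm(A)))                                           # log(A), valid since A > 0
print(np.allclose(f_of_A(lambda x: 1.0 / (1.0 + x)), np.linalg.inv(np.eye(n) + A)))   # (I + A)^{-1}
```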

RLC