
Consider the set $D$ of square diagonal matrices (ideally complex, but positive reals would also be interesting). We have a function $f$ on this set that satisfies, for all $A, B \in D$: $$f(AB) = f(A)f(B)$$ What does this tell us about the form of $f$? I hypothesize that it will be a product of powers of the diagonal entries: $$f(A) = \prod_{i=1}^n A_{ii}^{c_i}$$ This hypothesis comes from the theorem for positive real numbers that if $f(ab)=f(a)f(b)$, then $f$ has the form $f(x)=x^a$ for some $a$. (Overview of basic facts about Cauchy functional equation)
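
For concreteness, here is a minimal numerical sanity check (Python/NumPy) of the hypothesized form over positive real diagonal matrices; the size `n` and the exponents `c` below are arbitrary illustrative choices, not anything forced by the question.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4                                        # illustrative size
c = rng.uniform(0.0, 3.0, size=n)            # arbitrary exponents c_1, ..., c_n

def f(A):
    """Hypothesized form: product of powers of the diagonal entries of A."""
    return np.prod(np.diag(A) ** c)

# Two random diagonal matrices with positive real entries
A = np.diag(rng.uniform(0.1, 5.0, size=n))
B = np.diag(rng.uniform(0.1, 5.0, size=n))

# Multiplicativity: f(AB) == f(A) f(B), up to floating point error
print(np.isclose(f(A @ B), f(A) * f(B)))     # True
```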

Jannis

1 Answer


Do note that you need continuity, or at least some "niceness" condition, in the 1D case to conclude that a solution of $h(x)h(y) = h(xy)$ is of the form $h(x) = x^a$: just as for the additive Cauchy functional equation, I assume there will be other types of solutions, probably so highly irregular ("graph is dense in $\mathbb{R}^2$"-irregular in the additive case, for example) that it would be impossible to describe them in a simple manner.
I'll handle the positive real case instead of the complex one, both for simplicity and because the proof of the 1D case I have in mind takes a logarithm, which is always delicate in the complex world; it is probably fine to assume the argument would be mostly the same in $\mathbb{C}$.

We shall proceed by induction on the size $n$ of the matrices: denote by $D_n$ the space of $n \times n$ diagonal matrices with real non-negative entries, and let's prove that the property $P(n)$: "A multiplicative function $f : D_n \to [0, +\infty)$ which is also componentwise continuous is necessarily of the form $f : A \mapsto \prod_{j = 1}^n A_{j,j}^{c_j}$ with $c_j \geq 0$" is true for all $n \in \mathbb{N}$, $n \geq 1$.

Initialisation ($n = 1$): I'll redirect to If $f(xy)=f(x)f(y)$ then show that $f(x) = x^t$ for some $t$, as this is not the main interest of the post; however, it is one of the reasons the continuity assumption is important.
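
As a quick illustration of this 1D statement (not a proof), one can take a continuous multiplicative function `h`, read the exponent off as $t = \log h(e)$, and check agreement on a grid; the exponent 1.7 below is an arbitrary stand-in.

```python
import numpy as np

# Stand-in continuous multiplicative function on (0, +inf); 1.7 is an arbitrary exponent.
def h(x):
    return x ** 1.7

t = np.log(h(np.e))                          # read off the exponent: h(e) = e^t
xs = np.linspace(0.1, 10.0, 50)
print(np.allclose(h(xs), xs ** t))           # True: h coincides with x -> x^t
```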

Induction step: assume $P(n-1)$ is true for some fixed $n \geq 2$. Let's show that $P(n)$ is true too.
Consider a function $f : D_n \to [0, + \infty)$ which is both multiplicative and componentwise continuous.
For $M \in D_{n-1}$, define $\tilde{M} := \begin{pmatrix} M & 0 \\ 0 & 1\end{pmatrix}\in D_n$, and then: $$g : M \in D_{n-1} \mapsto f\left(\tilde{M}\right) \in [0, +\infty)$$ Since $P(n-1)$ is assumed true and $g$ is multiplicative (because $\widetilde{MN} = \tilde{M}\tilde{N}$) and componentwise continuous because $f$ is, there exists $(c_j)_{1 \leq j \leq n - 1} \subset [0, +\infty)$ such that: $$\forall M \in D_{n-1},\quad g(M) = \prod_{j = 1}^{n-1} M_{j,j}^{c_j}$$ On the other hand, one may define the following: $$h : x \in [0, +\infty) \mapsto f(\operatorname{diag}(1, \cdots, 1, x)) \in [0, +\infty)$$ By the assumptions on $f$, $h$ is continuous and multiplicative, hence it is of the form $h(x) = x^{c_n}$ for some $c_n \geq 0$.
Now, every matrix $A \in D_n$ is of the form $A = \tilde{M_A} \cdot \operatorname{diag}(1, \cdots, 1, A_{n,n})$ where $M_A := (A_{i,j})_{1 \leq i,j \leq n-1}$, therefore we get: $$f(A) = f\left(\tilde{M_A}\right) f(\operatorname{diag}(1, \cdots, 1, A_{n,n})) = g(M_A) h(A_{n,n}) = \left(\prod_{j = 1}^{n-1} A_{j,j}^{c_j}\right) A_{n,n}^{c_n} = \prod_{j = 1}^n A_{j,j}^{c_j}$$ and $P(n)$ is true if $P(n-1)$ is.
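
Here is a short numerical sketch of this last factorization and of how $f$ splits across it; the exponents `c` are again arbitrary illustrative values, and `f` is taken to be the product-of-powers form that the induction produces.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
c = rng.uniform(0.0, 2.0, size=n)            # arbitrary exponents

def f(A):
    # The product-of-powers form produced by the induction
    return np.prod(np.diag(A) ** c)

A = np.diag(rng.uniform(0.1, 5.0, size=n))

# Factor A as in the induction step: A = M_tilde @ diag(1, ..., 1, A[n-1, n-1])
M_tilde = A.copy()
M_tilde[-1, -1] = 1.0                        # plays the role of \tilde{M_A}
last = np.eye(n)
last[-1, -1] = A[-1, -1]                     # plays the role of diag(1, ..., 1, A_{n,n})

print(np.allclose(M_tilde @ last, A))         # the factorization holds
print(np.isclose(f(A), f(M_tilde) * f(last))) # and f(A) = g(M_A) * h(A_{n,n})
```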

Conclusion: $P(n)$ is true for all $n \geq 1$, and thus your claim has been proved.

Bruno B