
Let $V$ be a complex inner product space and let $A \in \operatorname{Hom}(V,V)$ be such that whenever $u$ and $v$ are orthogonal (that is, their scalar product is $0$), $A(u)$ and $A(v)$ are also orthogonal.

Prove that $A$ is a scalar (real, non-negative) multiple of a unitary transformation, i.e. $A = \lambda D$ where $D^{*}D = I$, $\lambda \ge 0$ and $\lambda \in \mathbb{R}$.

Solumilkyu
Nesa
  • What did you try? – paf May 18 '16 at 17:22
  • In an infinite-dimensional space I think that $D$ is only proven to be isometric (not necessarily surjective, and therefore not unitary), and $A$ is required to be continuous in order to ensure the existence of the adjoint. – Tom Collinge Feb 26 '17 at 14:34

2 Answers


Set $B = A^{*}A$. Take $v \in V$ and $w \perp v$. Then

$$ \left<Bv, w \right> = \left<A^{*}Av, w \right> = \left<Av, Aw \right> = 0 $$

which implies that $Bv \perp \operatorname{span} \{ v \}^{\perp}$ and so $Bv \in \operatorname{span} \{ v \}$. That is, each $v \in V$ is an eigenvector of $B$. Show that this implies that $B$ must be a constant multiple of the identity (for linearly independent $v, w$ with $Bv = \mu_v v$ and $Bw = \mu_w w$, expanding $B(v+w)$ two ways gives $\mu_{v+w}(v+w) = \mu_v v + \mu_w w$, which forces $\mu_v = \mu_w$) and write $B = \mu I$ for some $\mu \in \mathbb{C}$. Then

$$ \left< Av, Aw \right> = \left<A^{*}Av, w \right> = \left< Bv, w \right> = \left< \mu v, w \right> = \mu \left<v, w \right>.$$

In particular, taking $v = w \neq 0$ shows that $\mu = \frac{\left<Av, Av\right>}{\left<v, v\right>}$ must be real and non-negative. If $\mu = 0$, the equation above with $w = v$ gives $\left< Av, Av \right> = 0$ for every $v$, so $A = 0$ (and $A = 0 \cdot D$ for any unitary $D$). If $\mu \neq 0$, then set $\lambda = \sqrt{\mu}$ and obtain

$$ \left< \frac{A}{\lambda}v, \frac{A}{\lambda}w \right> = \left<v ,w \right> $$

and so $\frac{A}{\lambda}$ is unitary and $A = \lambda \cdot \frac{A}{\lambda}$.
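As a numerical sanity check (not part of the proof), one can confirm with NumPy that a map of the claimed form, a non-negative scalar times a unitary, satisfies $B = A^{*}A = \mu I$ and preserves orthogonality. The dimension, seed, and value of $\lambda$ below are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# Build A = lambda * U with U unitary: the QR factorization of a random
# complex matrix yields a unitary Q factor.
Z = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
U, _ = np.linalg.qr(Z)
lam = 2.5
A = lam * U

# B = A^* A should be the scalar matrix mu * I with mu = lambda^2.
B = A.conj().T @ A
assert np.allclose(B, lam**2 * np.eye(n))

# A preserves orthogonality: e_1 and e_2 are orthogonal, and so are
# their images under A.
v, w = np.eye(n, dtype=complex)[:2]
assert abs(np.vdot(A @ v, A @ w)) < 1e-10
```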


Alternatively, use polar decomposition to write $A = UD$ where $U$ is unitary and $D$ is positive. We want to show that $D = \lambda I$ for some $\lambda \geq 0$. Let $v,w$ be two orthogonal unit-length eigenvectors of $D$ and write $Dv = \mu v$, $Dw = \nu w$ with $\mu, \nu \geq 0$. Then

$$ \left< v + w, v - w \right> = \left< v, v \right> - \left< v, w \right> + \left< w, v \right> - \left< w, w \right> = 1 - 1 = 0, $$ so by hypothesis $\left< A(v + w), A(v - w) \right> = 0$. Expanding, and using $\left< Av, Aw \right> = \left< Aw, Av \right> = 0$ (which holds since $v \perp w$), $$ 0 = \left< A(v + w), A(v - w) \right> = \left< Av, Av \right> - \left< Aw, Aw \right> = \left< (UD)v, (UD)v \right> - \left< (UD)w, (UD)w \right> = \left< Dv, Dv \right> - \left< Dw, Dw \right> = \mu^2 - \nu^2 $$

and so $\mu = \nu$. This shows that all the eigenvalues of $D$ must be the same and so $D = \lambda I$ for some $\lambda \geq 0$.
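The polar-decomposition route can likewise be illustrated numerically. The sketch below (an assumed setup with arbitrary dimension and $\lambda$) computes $A = UD$ from the SVD and checks that the positive factor $D$ is a scalar matrix when $A$ is a scalar multiple of a unitary:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3

# A = lambda * Q with Q unitary, the form the theorem asserts.
Z = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
Q, _ = np.linalg.qr(Z)
lam = 0.7
A = lam * Q

# Polar decomposition from the SVD A = W S V^*:
# U = W V^* is unitary and D = V S V^* is positive semidefinite.
W, S, Vh = np.linalg.svd(A)
U = W @ Vh
D = Vh.conj().T @ np.diag(S) @ Vh

assert np.allclose(U @ D, A)                   # A = U D
assert np.allclose(U.conj().T @ U, np.eye(n))  # U is unitary
assert np.allclose(D, lam * np.eye(n))         # all eigenvalues of D equal lambda
```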

levap
  • $\left<A^{*}Av, w \right> = \left<Av, Aw \right>$: why is this true? – Nesa May 18 '16 at 20:26
  • If there exists a linear operator $B$ on $V$ such that $\langle A(x),y\rangle=\langle x,B(y)\rangle$ for all $x,y\in V$, we usually call $B$ the adjoint of $A$, and denote $A^\ast=B$. Therefore if $A^\ast$ exists, then $\langle A(x),A(y)\rangle=\langle A^\ast A(x),y\rangle$ is always true. – Solumilkyu May 18 '16 at 20:56
  • Notice that $A^\ast$ exists if $V$ is finite-dimensional, but is not guaranteed to exist if $V$ is infinite-dimensional unless we assume so. – Solumilkyu May 18 '16 at 21:21

If $V=\{{\it 0}\,\}$, then the result is trivial, so we consider $V\ne\{{\it 0}\,\}$. Fix a non-zero vector $u\in V$ and let $W=\operatorname{span}(\{u\})$. Then for $v\in W^\perp$, $$\langle A^\ast A(u),v\rangle=\langle A(u),A(v)\rangle=0.$$ It follows that $A^\ast A(u)\in W^{\perp\perp}=W$, that is, $A^\ast A(u)=\lambda u$ for some scalar $\lambda\in\mathbb{C}$.

Next, we show that $\lambda\ge 0$. Because $u$ is non-zero, $\langle u,u\rangle>0$. Thus \begin{align} \lambda\langle u,u\rangle =\langle\lambda u,u\rangle =\langle A^\ast A(u),u\rangle =\langle A(u),A(u)\rangle \ge0, \end{align} that is, $\lambda=\displaystyle\frac{\langle A(u),A(u)\rangle}{\langle u,u\rangle}\ge 0$ and hence $\lambda\in\mathbb{R}$.

Finally, we show that $A^\ast A(x)=\lambda x$ for all $x\in V$. Since the result is immediate when $x={\it 0}$, we only need to consider $x\ne{\it 0}$. Furthermore, by the preceding result, we can write $A^\ast A(x)=\lambda'x$ for some scalar $\lambda'\ge0$, so it suffices to show that $\lambda'=\lambda$. In the case $x\notin W^\perp$, we have $\langle x,u\rangle\ne 0$ and \begin{align} \lambda'\langle x,u\rangle =\langle\lambda' x,u\rangle =\langle A^\ast A(x),u\rangle =\langle A(x),A(u)\rangle =\langle x,A^\ast A(u)\rangle =\langle x,\lambda u\rangle =\lambda\langle x,u\rangle. \end{align} It follows that $\lambda'=\lambda$. In the other case, $x\in W^\perp$, applying the previous case to the vector $x+u\notin W^\perp$, we see that \begin{align} A^\ast A(x)=A^\ast A(x+u)-A^\ast A(u)=\lambda (x+u)-\lambda u=\lambda x. \end{align} Hence we conclude that $A^\ast A=\lambda I$, that is, $A$ is a $\sqrt{\lambda}$ multiple of a unitary transformation, where $\sqrt{\lambda}$ is a non-negative real number (if $\lambda=0$ then $A=0$, and $A=0\cdot D$ for any unitary $D$).
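A small numerical illustration of this computation (the matrix, vector, and value $\sqrt{\lambda}=1.3$ below are arbitrary assumed choices): $\lambda$ is recovered as $\langle A(u),A(u)\rangle/\langle u,u\rangle$, $A^\ast A=\lambda I$ holds, and $A/\sqrt{\lambda}$ is unitary:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 3

# An orthogonality-preserving A of the proven form: sqrt(lambda) times a
# unitary, so lambda = 1.3**2 = 1.69.
Z = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
Q, _ = np.linalg.qr(Z)
A = 1.3 * Q

# Recover lambda from a fixed non-zero u as <Au, Au> / <u, u>.
u = rng.normal(size=n) + 1j * rng.normal(size=n)
lam = np.vdot(A @ u, A @ u).real / np.vdot(u, u).real

assert np.isclose(lam, 1.69)
assert np.allclose(A.conj().T @ A, lam * np.eye(n))  # A^* A = lambda I
D = A / np.sqrt(lam)
assert np.allclose(D.conj().T @ D, np.eye(n))        # D is unitary
```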

Solumilkyu