
Terminology:

Suppose $\gamma_1(0)=\gamma_2(0)=z_0$. Then we define the (oriented) angle between $\gamma_1$ and $\gamma_2$ at $z_0$ to be $\arg\gamma_1'(0)-\arg\gamma_2'(0)$ (provided neither derivative is $0$).
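As a concrete illustration of this definition (the two curves below are chosen just for the example), the oriented angle can be computed with `cmath.phase`, which returns the principal argument:

```python
import cmath

# Two illustrative smooth curves through z0 = 0:
#   gamma1(t) = t          -> gamma1'(0) = 1
#   gamma2(t) = t*(1 + i)  -> gamma2'(0) = 1 + i
g1_prime = 1 + 0j
g2_prime = 1 + 1j

# Oriented angle between gamma1 and gamma2 at z0, per the definition above:
# arg(gamma1'(0)) - arg(gamma2'(0))
angle = cmath.phase(g1_prime) - cmath.phase(g2_prime)
print(angle)  # -pi/4: gamma2 points 45 degrees counterclockwise of gamma1
```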

Question:

Suppose $f:\mathbb{R}^2\to \mathbb{R}^2$ is a map with continuous real partial derivatives which preserves angles and orientations. Is it necessarily holomorphic (viewed as a complex function)?

My attempt:

Ideally, if we can show that $f$ satisfies the CRE then we are done. Since $f$ is angle- and orientation-preserving, we have $$\arg\left(\frac{(f\gamma_1)'(0)}{(f\gamma_2)'(0)}\right)=\arg\left(\frac{\gamma_1'(0)}{\gamma_2'(0)}\right).$$ Then I think this implies $\frac{(f\gamma_1)'(0)}{(f\gamma_2)'(0)}=\lambda\frac{\gamma_1'(0)}{\gamma_2'(0)}$ for some $\lambda>0$?

Then (I am not certain about this part) I think $(f\gamma_1)'(0)=(U_x(z_0)+iV_x(z_0))\gamma_1'(0)$? Then do the same with $f\gamma_2$, but use $U_y, V_y$ instead. (Here $f(x+iy)=U+iV$, where $U$ and $V$ are real-valued functions.) Then I basically have the CRE up to some multiple $\lambda$.

I am very uncertain about my solution; however, I believe the claim should be true. How could I prove this assertion? Many thanks in advance!

  • I suppose this is in reference to my comment to your answer linked below? Anyway, I think you'll have an easier time if you characterize angle preserving maps via the standard inner product on $\mathbb R^2$. It plays nicely with linear maps, and in the end, you do want to show that $f$ can be approximated by a specific type of linear map (a conformal one). https://math.stackexchange.com/questions/3799572/proof-that-every-conformal-function-f-ne-0-is-holomorphic – Vercassivelaunos Aug 24 '20 at 12:22
  • @Vercassivelaunos Hahaha Hi again! No, it wasn't in reference to your comment. This is just a coincidence that I encountered the same problem whilst trying to prepare for my exam. Thank you for the hint though, but since my course did not introduce the notion of differentiability from a linear map, I was looking for something perhaps more basic. :) – UnsinkableSam Aug 24 '20 at 12:32
  • You still have to show that $f$ can be approximated by a linear map, though. A map satisfying the CR equations isn't enough. It also has to be real differentiable, which means that it can be approximated by a linear map. Or did you define real differentiability differently? (I remember a complex analysis course where it was defined a bit weirdly, without reference to linear maps, but I don't think that's standard) – Vercassivelaunos Aug 24 '20 at 12:37
  • My course didn't properly say what real differentiability of a complex function means; I always take it to mean that $u_x, u_y, v_x, v_y$ exist at the point in question. (where u, v are given as in the main text.) How come a map satisfying CRE is not enough though? (I thought there was a result due to Goursat saying if a complex function has continuous real partial derivatives and the derivatives satisfy CRE then it is complex differentiable?) Thank you for helping! – UnsinkableSam Aug 24 '20 at 12:46
  • A complex function is complex differentiable iff it is real differentiable and satisfies the CR equations. If the partial derivatives exist and are continuous in an open set (not just in a single point), then that is a sufficient condition for real differentiability in that open set. Which is why continuity of the partial derivatives in a neighborhood of a point + the CR equations are sufficient for complex differentiability. – Vercassivelaunos Aug 24 '20 at 16:42
  • @Vercassivelaunos Ah that clears up my other confusion I had whilst doing this problem. Apologies in advance, though, if I am just being slow right now. I think since my question assumed to have continuous real partial derivatives, showing $f$ satisfies CRE is sufficient right? Or are we supposed to show $f$ can be approximated by a linear map still? Thank you for helping! – UnsinkableSam Aug 24 '20 at 20:32
  • If the partial derivatives are continuous everywhere, then yes, verifying the Cauchy-Riemann equations is sufficient. – Vercassivelaunos Aug 24 '20 at 20:38
  • @Vercassivelaunos Makes sense! So I guess my eventual question is did you show it satisfies CRE using the standard definition? (So in that case, finding a linear map with its Jacobian being in the 'right' form, like an antisymmetric matrix with the same diagonal) Or do you reckon it was possible to show $f$ satisfies CRE without doing so? – UnsinkableSam Aug 24 '20 at 20:45
  • I'm not sure what you're referencing with "it" right now. A holomorphic function or an angle+orientation-preserving one? – Vercassivelaunos Aug 24 '20 at 20:59
  • @Vercassivelaunos Sorry for the confusion, 'it' is used to refer as $f$ in the main text, that is, continuous real partial derivatives, angle +orientation preserving one. Many thanks! – UnsinkableSam Aug 24 '20 at 21:04
  • Personally, I only know the proof via the Jacobian with the inner product. If you want to, I can try throwing together that one as an answer, because I don't see a substantially different way to do it. It might be doable via the argument anyway, but you should take a bit more care: $\operatorname{arg}(a)+\operatorname{arg}(b)=\operatorname{arg}(ab)$ isn't true for all $a$ and $b$. You have to allow for possible integer multiples of $2\pi$ to be added on one side of the equation (doesn't matter which). – Vercassivelaunos Aug 24 '20 at 21:17
  • @Vercassivelaunos Yeah that is true. Please do throw an answer down below though, it may be helpful for future readers. (Also in that case I can accept your answer, this may be the only way I can show my appreciation :) ) – UnsinkableSam Aug 24 '20 at 21:21

Answer:


To start, I'm going to characterize angle- and orientation-preserving slightly differently. I will say that a function is angle preserving if

$$\frac{\langle(f\gamma_1)'(0),(f\gamma_2)'(0)\rangle}{\vert(f\gamma_1)'(0)\vert\vert(f\gamma_2)'(0)\vert}=\frac{\langle\gamma_1'(0),\gamma_2'(0)\rangle}{\vert\gamma_1'(0)\vert\vert\gamma_2'(0)\vert}.$$

It's what you get if you characterize the angle via the law of cosines, $\langle v,w\rangle=\vert v\vert\vert w\vert\cos\theta$, where $\theta$ is the (non-oriented) angle between $v$ and $w$. The above equation basically says that the cosine of the angle is preserved. And I call a map orientation preserving if for all $\gamma_1,\gamma_2$ whose derivatives at $0$ are linearly independent, the linear map characterized by $\gamma_1'(0)\mapsto(f\gamma_1)'(0)$ and $\gamma_2'(0)\mapsto(f\gamma_2)'(0)$ has positive determinant. This is because the determinant has the orientation of an ordered set of vectors baked into its sign by construction.
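This characterization is easy to sanity-check numerically. The sketch below (matrix values chosen purely for illustration) verifies that a matrix of the form $\begin{pmatrix}a&-b\\b&a\end{pmatrix}$ preserves the cosine of the angle between random vectors and has positive determinant:

```python
import numpy as np

rng = np.random.default_rng(0)
L = np.array([[1.0, -2.0],
              [2.0,  1.0]])  # illustrative conformal matrix: [[a,-b],[b,a]], a=1, b=2

def cos_angle(v, w):
    """Cosine of the (non-oriented) angle, via <v,w> = |v||w| cos(theta)."""
    return v @ w / (np.linalg.norm(v) * np.linalg.norm(w))

# Angle preservation: the cosine is unchanged by L for arbitrary vectors.
for _ in range(5):
    v, w = rng.standard_normal(2), rng.standard_normal(2)
    assert np.isclose(cos_angle(L @ v, L @ w), cos_angle(v, w))

# Orientation preservation: positive determinant.
print(np.linalg.det(L) > 0)  # True
```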

With these preliminaries out of the way, we can first notice that $f$ is real differentiable because it has continuous partial derivatives in an open set according to your premises. So $f$ has a differential $\mathrm Df$, and we can use the chain rule to obtain

$$(f\gamma_i)'(0)=\mathrm Df(\gamma_i(0))\cdot\gamma_i'(0)=\mathrm Df(z_0)\cdot\gamma_i'(0).$$

So we have

$$\frac{\langle \mathrm Df(z_0)\cdot\gamma_1'(0),\mathrm Df(z_0)\cdot\gamma_2'(0)\rangle}{\vert\mathrm Df(z_0)\cdot\gamma_1'(0)\vert\vert\mathrm Df(z_0)\cdot\gamma_2'(0)\vert}=\frac{\langle\gamma_1'(0),\gamma_2'(0)\rangle}{\vert\gamma_1'(0)\vert\vert\gamma_2'(0)\vert}.$$

This holds for all smooth paths $\gamma_1,\gamma_2$, and $\gamma_1'(0),\gamma_2'(0)$ can be any nonzero vectors. So $\mathrm Df(z_0)$ belongs to the class of linear maps $L:\mathbb R^2\to\mathbb R^2$ which satisfy

$$\frac{\langle Lv,Lw\rangle}{\vert Lv\vert\vert Lw\vert}=\frac{\langle v,w\rangle}{\vert v\vert\vert w\vert},\qquad\textrm{for all }v,w\in\mathbb R^2\backslash\{0\}.$$

Such linear maps are called conformal if they are orientation preserving and anticonformal if they are not. We will now work out what conformal maps look like. First, we show that $\vert Lv\vert$ is proportional to $\vert v\vert$: that is, $\vert Lv\vert=\lambda\vert v\vert$ for some $\lambda\geq0$ independent of $v$. For the proof, consider two linearly independent vectors $v,w$ with $\vert v\vert=\vert w\vert$. We will use the fact that $v+w$ and $v-w$ are orthogonal to show that $\vert Lv\vert=\vert Lw\vert$. It holds that

$$\frac{\langle L(v+w),L(v-w)\rangle}{\vert L(v+w)\vert\vert L(v-w)\vert}=\frac{\langle v+w,v-w\rangle}{\vert v+w\vert\vert v-w\vert}.$$

Juggling a bit with the rules for inner products and linear maps we get

$$\frac{\langle Lv,Lv\rangle-\langle Lw,Lw\rangle}{\vert L(v+w)\vert\vert L(v-w)\vert}=\frac{\langle v,v\rangle-\langle w,w\rangle}{\vert v+w\vert\vert v-w\vert}.$$

Since $\vert v\vert^2:=\langle v,v\rangle$, and by assumption $\vert v\vert=\vert w\vert$, the right side is $0$. But then the left side is also $0$, meaning that

$$\begin{align*}\langle Lv,Lv\rangle&=\langle Lw,Lw\rangle\\ \vert Lv\vert^2&=\vert Lw\vert^2\\ \vert Lv\vert&=\vert Lw\vert. \end{align*}$$

And since the directions of $v,w$ were arbitrary, $\vert Lv\vert$ can only depend on $\vert v\vert$. For a linear map, this means $\vert Lv\vert=\lambda\vert v\vert$ for some $\lambda\geq0$. Also note that $\lambda\neq0$, because otherwise the defining equation could never be fulfilled: we would have to divide by $0$. With this information, the defining equation becomes

$$\begin{align*}\frac{\langle Lv,Lw\rangle}{\lambda\vert v\vert\,\lambda\vert w\vert}&=\frac{\langle v,w\rangle}{\vert v\vert\vert w\vert}\\ \frac{\langle \lambda^{-1}Lv,\lambda^{-1}Lw\rangle}{\vert v\vert\vert w\vert}&=\frac{\langle v,w\rangle}{\vert v\vert\vert w\vert}\\ \langle\lambda^{-1}Lv,\lambda^{-1}Lw\rangle&=\langle v,w\rangle. \end{align*}$$
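This normalization step can be checked numerically. In the sketch below (matrix chosen for illustration, with $\lambda=5$), dividing a conformal matrix by its scaling factor yields an orthogonal matrix, i.e. one satisfying $Q^\top Q=I$:

```python
import numpy as np

L = np.array([[3.0, -4.0],
              [4.0,  3.0]])  # conformal: [[a,-b],[b,a]] with a=3, b=4

# The scaling factor lambda = |L v| / |v|; computed here with v = e1.
lam = np.linalg.norm(L @ np.array([1.0, 0.0]))  # sqrt(3^2 + 4^2) = 5
Q = L / lam                                      # lambda^{-1} L

# Q should be an orthogonal transformation: Q^T Q = I.
print(np.allclose(Q.T @ Q, np.eye(2)))  # True
```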

This makes $\lambda^{-1}L$ an orthogonal transformation! And it's already known how orthogonal transformations on $\mathbb R^2$ look (the Wikipedia article on orthogonal matrices has descriptions). The ones with positive determinant (so the orientation preserving ones) look like this:

$$\begin{pmatrix}a&-b\\b&a\end{pmatrix},\qquad a^2+b^2=1.$$

And if $\lambda^{-1}L$ looks like that, then $L$ looks like this:

$$\begin{pmatrix}a&-b\\b&a\end{pmatrix},\qquad a^2+b^2=\lambda^2.$$

So that's how $\mathrm Df$ looks. You can read the Cauchy-Riemann equations directly off the components now, and also note that the condition $a^2+b^2=\lambda^2\neq0$ forces $\mathrm Df\neq0$, which translates to $f'\neq0$. Incidentally, the Cauchy-Riemann equations are just a means of verifying that $\mathrm Df$ is of the above form, which is the actual condition needed for holomorphy. So we could also skip the CR equations and say directly that $\mathrm Df$ is of the form required for holomorphy.
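To make the last point concrete: a matrix of this form acts on $(x,y)$ exactly as multiplication by the complex number $a+bi$ acts on $x+iy$, which is why such a $\mathrm Df$ corresponds to a complex derivative $f'(z_0)=a+bi$. A small check (values chosen for illustration):

```python
import numpy as np

# The derivative matrix [[a,-b],[b,a]] acts on (x, y) the same way
# multiplication by a + b*i acts on x + i*y.
a, b = 2.0, 3.0
Df = np.array([[a, -b],
               [b,  a]])

z = 1.5 - 0.5j
v = np.array([z.real, z.imag])

w = Df @ v           # apply the real-linear map to (x, y)
zw = (a + b * 1j) * z  # multiply x + i*y by a + b*i

print(np.allclose(w, [zw.real, zw.imag]))  # True
```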

  • Why can we conclude from your definition that f has partial continuous derivatives? – Fernando Nazario Apr 16 '22 at 22:40
  • @FernandoNazario: I don't think we can. Continuous partial derivatives are a premise stated in the question, not something that follows from my definition. Maybe I should have stated this more clearly. – Vercassivelaunos Apr 17 '22 at 05:25