
Let $M=f^{-1}(\{0\})$ be a $d$-dimensional manifold situated in $\mathbb{R}^D$, where $D>d$. We assume $f:\mathbb{R}^D\to \mathbb{R}^p$, where $p$, $D$, $d$ are related by $p=D-d$. For those interested in regularity, we assume $f$ is $C^\infty$ and its Jacobian has full rank everywhere on $M$.

We are trying to find explicit formulas for the orthogonal projection matrix from $\mathbb{R}^D$ onto the tangent space of $M$ at a given point $x\in M$. Explicitly, we have the following:

Question: What are explicit formulas for the orthogonal projection matrix, viewed as a map $P: M \to \mathbb{R}^{D\times D}$, such that for each $x\in M$ we have $P(x)^2=P(x)=P(x)^T$ and the range of $P(x)$ is the tangent space of $M$ at $x$?

What I know: If $d=D-1$, i.e. $p=1$, then the following form is rather well known: $$P(x)=I-n(x)n(x)^T,$$ where $n$ is the normal vector to $M$, $$n(x)=\frac{\nabla f(x)}{\|\nabla f(x)\|}.$$
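
For this hypersurface case I also convinced myself numerically; here is a minimal sketch in NumPy (the sphere $f(x)=\|x\|^2-1$ in $\mathbb{R}^3$ and the chosen point are just my own example):

```python
# Sanity check of P = I - n n^T for a hypersurface: the unit sphere
# f(x) = x1^2 + x2^2 + x3^2 - 1 in R^3, so D = 3, p = 1, d = 2.
import numpy as np

def grad_f(x):
    # gradient of f(x) = |x|^2 - 1
    return 2.0 * x

x = np.array([1.0, 0.0, 0.0])               # a point on M = f^{-1}({0})
n = grad_f(x) / np.linalg.norm(grad_f(x))   # unit normal
P = np.eye(3) - np.outer(n, n)              # P = I - n n^T

print(np.allclose(P @ P, P))                # idempotent: True
print(np.allclose(P, P.T))                  # symmetric:  True
print(np.allclose(P @ n, 0))                # kills the normal direction: True
```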

One way to derive this is to write each point of $M$ as $x=(y, F(y))$ where $y\in \mathbb{R}^{D-1}$ and $F: \mathbb{R}^{D-1}\to \mathbb{R}$. The relationship between $f$ and $F$ is as follows: $$f^i(x_1,\dotsc, x_D)=F^i(x_1,\dotsc, x_d)-x_{d+i}$$ (here $p=1$, so there is a single component). Then the intrinsic metric tensor is $g(y)=I+J_F^T J_F$, where $I$ is the $d\times d$ identity matrix and $J_F$ is the $1\times d$ Jacobian matrix of $F$. Then, $$\tilde{P}(y) = (I, J_F^T)^T g^{-1}(y) (I, J_F^T),$$ defines an orthogonal projection via $P(x)=\tilde{P}(\pi(x))$, where $\pi$ is the projection onto the first $d$ coordinates. Writing out the computation explicitly yields the above form of $P$; it is only a little tedious.
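
As a sanity check on this graph construction, here is a small sketch (my own example, the paraboloid $x_3=F(x_1,x_2)=x_1^2+x_2^2$) comparing $\tilde P = T g^{-1} T^T$, with $T=(I, J_F^T)^T$, against $I-nn^T$:

```python
# Compare the graph construction P = T g^{-1} T^T with I - n n^T
# for x3 = F(x1, x2) = x1^2 + x2^2, so D = 3, d = 2, p = 1.
import numpy as np

y = np.array([0.5, -0.3])
J_F = np.array([[2 * y[0], 2 * y[1]]])      # 1 x d Jacobian of F
T = np.vstack([np.eye(2), J_F])             # D x d, columns span the tangent space
g = np.eye(2) + J_F.T @ J_F                 # intrinsic metric g = I + J_F^T J_F
P_graph = T @ np.linalg.inv(g) @ T.T        # P = T g^{-1} T^T

grad_f = np.append(J_F.ravel(), -1.0)       # f(x) = F(x1, x2) - x3, so grad f = (dF, -1)
n = grad_f / np.linalg.norm(grad_f)
P_normal = np.eye(3) - np.outer(n, n)

print(np.allclose(P_graph, P_normal))       # True: the two constructions agree
```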

Unfortunately, I am having trouble generalizing this to $d=D-p$ with $p>1$. My guess is that we should have something analogous like $P=I-J_f^T J_f$ (note $J_f$ is $p\times D$, so this product, with $I$ the $D\times D$ identity, at least has the right dimensions of $D\times D$). But I have checked some examples, and it is not correct; we are clearly missing the normalization factor present in the $I-nn^T$ hypersurface case. I apologize that my vector calculus/differential geometry is a bit weak, so if this is well known, I would gladly take references to read.
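
For instance, this is one of the examples I checked (the unit circle $x_1^2+x_2^2=1$, $x_3=0$ in $\mathbb{R}^3$, so $D=3$, $p=2$, $d=1$):

```python
# An example where the naive guess P = I - J_f^T J_f breaks:
# f(x) = (x1^2 + x2^2 + x3^2 - 1, x3), whose zero set is the unit circle in the x1-x2 plane.
import numpy as np

x = np.array([1.0, 0.0, 0.0])
# The p x D Jacobian of f at x:
J = np.array([[2 * x[0], 2 * x[1], 2 * x[2]],
              [0.0,      0.0,      1.0]])

P_guess = np.eye(3) - J.T @ J
print(np.allclose(P_guess @ P_guess, P_guess))   # False: not idempotent
print(np.allclose(J @ J.T, np.eye(2)))           # False: rows of J are not orthonormal
```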

Update: I thought about it a little and realized that $P=I_{D\times D}-A^TA$, where $A$ is $p\times D$ with full row rank, is an orthogonal projection if and only if $AA^T=I_{p\times p}$. Since $A$ is rectangular, this means that the rows of $A$ must be orthonormal. So for $A=J_f$ this means we need to orthonormalize the rows; then we should have our expression for $P$. I am going to try to work out all the details. Comments/corrections on this approach are welcome.
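
To make the idea concrete, here is how I intend to check it on the circle example above (using a QR factorization of $J_f^T$ as one way to orthonormalize the rows; if I am not mistaken this is the same matrix as $I - J_f^T(J_fJ_f^T)^{-1}J_f$, but I still need to verify that in general):

```python
# Orthonormalize the rows of J_f (via a reduced QR of J_f^T) and form P = I - A^T A,
# on the circle example f(x) = (x1^2 + x2^2 + x3^2 - 1, x3) at x = (1, 0, 0).
import numpy as np

x = np.array([1.0, 0.0, 0.0])
J = np.array([[2 * x[0], 2 * x[1], 2 * x[2]],
              [0.0,      0.0,      1.0]])

Q, _ = np.linalg.qr(J.T)         # D x p, orthonormal columns spanning the row space of J
A = Q.T                          # p x D with orthonormal rows, so A A^T = I_p
P = np.eye(3) - A.T @ A

t = np.array([0.0, 1.0, 0.0])    # tangent vector to the circle at x = (1, 0, 0)
print(np.allclose(P @ P, P))     # idempotent: True
print(np.allclose(P @ J.T, 0))   # annihilates both gradient (normal) directions: True
print(np.allclose(P @ t, t))     # fixes the tangent direction: True

# The same matrix without an explicit orthonormalization step (checked numerically here):
P_alt = np.eye(3) - J.T @ np.linalg.inv(J @ J.T) @ J
print(np.allclose(P, P_alt))     # True
```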

Nap D. Lover
  • You made a good observation in your update. Here is something else to consider. Note that $f(x) = (f_1(x), f_2(x), \dotsc, f_p(x))$ where $f_k : \mathbb R^D \to \mathbb R$. In light of this, what does it mean if $f(x) = 0$, and how does this relate to $M$? How does $\nabla f_k$ relate to $M$? Additionally, how is $\nabla f_k$ related to $J_f$? – Nicholas Todoroff Dec 13 '22 at 02:53
  • @NicholasTodoroff If $f(x)=0$ then $x$ is on the manifold $M$ (more precisely, it means that $F_i(x_1,\dotsc, x_d)=x_{d+i}$ for $i=1,\dotsc, p=D-d$). I am not entirely sure how $\nabla f_k$ relates to $M$. But I do know that $(J_f)_{kj} = \frac{\partial f_k}{\partial x_j}$, i.e. I am taking the convention that the $k$-th row of $J_f$ is $\nabla f_k^T$. I feel like I am missing something obvious. But I can say, from my update and your comment, that we need to orthonormalize the gradients $\nabla f_k$. – Nap D. Lover Dec 13 '22 at 04:36
  • Let me be a little more explicit then. The set of all $x$ such that $f_k(x) = 0$ is a manifold $M_k$ with dimension $D - 1$. But $f(x) = 0$ iff $f_k(x) = 0$ for all $k$. So what is the relationship between $M$ and all the $M_k$? Having answered this question, we also note that $\nabla f_k(x)$ is a vector normal to $M_k$ at $x$; what does this say about how $\nabla f_k$ relates to $M$? – Nicholas Todoroff Dec 13 '22 at 07:23
  • Ah, so then we can say $M=\cap_{k=1}^p M_k$, right? And if $x\in M$ then $x \in M_k$ for each $k$, and so since $\nabla f_k(x)$ is normal to $M_k$ at $x$ we have $\nabla f_k(x)^T v =0$ for every vector $v$ tangent to $M_k$ (and hence to $M$) at $x$. Thus $\nabla f_k(x)$ is normal to $M$ at $x$, for each $k$. Am I on the right track? – Nap D. Lover Dec 13 '22 at 15:40
  • Yes! Exactly. Now if you understand how $n^Tn$ is the projection onto a unit vector $n$, and how this expression generalizes to an orthonormal basis, this should tell you why you need to orthonormalize (the rows of) $J_f$. – Nicholas Todoroff Dec 13 '22 at 15:51
  • @NicholasTodoroff I am still digesting this comment. In the hypersurface case with $p=1$, $n=\nabla f/|\nabla f|$, then $n^Tn=1$. What do you mean by "how $n^T n$ is the projection onto a unit vector $n$"? I obviously understand $n$ is the unit normal vector, but I am getting thrown off by the phrasing a little. Otherwise, thank you a lot for these comments; they have been very illuminating! I will likely compile them into an answer with my update, perhaps with some examples, if you do not mind. – Nap D. Lover Dec 13 '22 at 23:35
  • I apologize, I meant $nn^T$. If we apply $nn^T$ to a vector $x$, using dot product notation we get $nn^Tx = (x\cdot n)n$; if $n$ is a unit vector, then this is exactly the orthogonal projection of $x$ onto the 1D subspace spanned by $n$. So if instead we have orthonormal $n_1, n_2, \dotsc, n_k$, how do we orthogonally project $x$ onto their span? How would we write this in matrix form? Also take note of what happens if $n_1,\dotsc,n_k$ are represented as row vectors instead of column vectors. Finally, if $P$ is an orthogonal projection, what is $I - P$? – Nicholas Todoroff Dec 14 '22 at 00:51
  • If you would like to wrap everything up into a complete answer, I do not mind at all. – Nicholas Todoroff Dec 14 '22 at 00:51
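
Following the suggestion in the last comments, here is the small sketch I have in mind for the orthonormal-basis picture (the Gram–Schmidt pass over the rows of $J_f$ and the circle example are my own choices; with orthonormal $n_1,\dotsc,n_p$ spanning the normal space, $\sum_k n_kn_k^T$ should project onto the normal space and $I-\sum_k n_kn_k^T$ onto the tangent space):

```python
# The rank-one picture from the comments: with orthonormal n_1, ..., n_p,
# (sum_k n_k n_k^T) x = sum_k (x . n_k) n_k projects onto the normal space,
# and I minus that sum projects onto the tangent space.
import numpy as np

x = np.array([1.0, 0.0, 0.0])
J = np.array([[2 * x[0], 2 * x[1], 2 * x[2]],    # rows are grad f_1, grad f_2
              [0.0,      0.0,      1.0]])

# Gram-Schmidt on the rows of J (fine here since the Jacobian has full rank)
normals = []
for row in J:
    u = row - sum((row @ n) * n for n in normals)
    normals.append(u / np.linalg.norm(u))

N = sum(np.outer(n, n) for n in normals)         # projection onto the normal space
P = np.eye(3) - N                                # projection onto the tangent space

v = np.array([0.3, -1.2, 0.7])                   # an arbitrary test vector
print(np.allclose(N @ v, sum((v @ n) * n for n in normals)))  # True
print(np.allclose(P @ P, P) and np.allclose(P, P.T))          # True
```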

0 Answers