
A quadric is a surface in three-dimensional space, described by

$$ r^T A r + r^T b + c = 0 $$

where $r =[x,y,z]^T $ is the coordinate vector of a point on the surface, $A$ is a symmetric $3 \times 3$ matrix, $b$ is a $3 \times 1$ vector, and $c$ is a scalar.

Two such surfaces intersect in a curve, and three such surfaces intersect in either infinitely many points or a finite number of points not exceeding $8$.

Describe how to find these intersection points (provided they are finite in number).

What I have tried:

I approached this problem in the same way that I used to solve a similar problem.

First I rewrote each of the quadric equations in the following format

$$ r^T Q_i r = 0, \hspace{15pt} i = 1,2,3 $$

where $r =[x, y, z, 1]^T $, and

$Q_i = \begin{bmatrix} A_i & \frac{1}{2} b_i \\ \frac{1}{2} b_i^T & c_i \end{bmatrix} $

So, now I have three $4 \times 4$ matrices $Q_1, Q_2, Q_3$ corresponding to the three quadrics.
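For concreteness, here is a minimal sketch (in Python/NumPy, not part of the original post) of how such a $Q_i$ can be assembled; the function name `quadric_matrix` is just illustrative:

```python
import numpy as np

def quadric_matrix(A, b, c):
    """Assemble the 4x4 homogeneous matrix Q of the quadric r^T A r + r^T b + c = 0."""
    A = np.asarray(A, dtype=float).reshape(3, 3)
    b = np.asarray(b, dtype=float).reshape(3, 1)
    Q = np.zeros((4, 4))
    Q[:3, :3] = A
    Q[:3, 3:] = 0.5 * b
    Q[3:, :3] = 0.5 * b.T
    Q[3, 3] = float(c)
    return Q
```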

I then proceeded to define a linear combination of $Q_1$ and $Q_2$ as follows

$ Q_\alpha = Q_1 + \alpha Q_2 $

The determinant of $Q_\alpha$ is a quartic (degree $4$) polynomial in the parameter $\alpha$. By finding the roots of this quartic polynomial, a value for $\alpha$ can be chosen such that $Q_\alpha$ is rank deficient, i.e. has at least one zero eigenvalue.
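A possible numerical sketch of this step (assuming NumPy; the quartic coefficients are recovered by interpolating the determinant at five sample values of $\alpha$ rather than expanding it symbolically):

```python
import numpy as np

def pencil_quartic_roots(Q1, Q2):
    """Real roots of det(Q1 + alpha*Q2), a polynomial of degree <= 4 in alpha."""
    alphas = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
    dets = np.array([np.linalg.det(Q1 + a * Q2) for a in alphas])
    coeffs = np.polyfit(alphas, dets, 4)          # exact interpolation: 5 points, degree 4
    roots = np.roots(coeffs)
    return roots[np.abs(roots.imag) < 1e-9].real  # keep the (numerically) real roots
```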

Note that if $r^T Q_1 r = r^T Q_2 r = 0 $ then $r^T Q_\alpha r = 0$; hence we can search for candidate solutions of this last equation, and then impose the following conditions on these candidates to ensure they satisfy the original three quadric equations.

  1. The fourth component of $r$ is $1$.
  2. $r^T Q_1 r = 0$ (which will also ensure that $r^T Q_2 r = 0$ )
  3. $r^T Q_3 r = 0$

That was an outline of what I have tried.

1 Answer


We want to find the intersection of the three quadrics:

$\mathbf{r}^T Q_1 \mathbf{r} = 0, \quad \mathbf{r}^T Q_2 \mathbf{r} = 0, \quad \mathbf{r}^T Q_3 \mathbf{r} = 0 $

where $\mathbf{r} = [ x,y, z, 1]^T$ and $Q_1, Q_2, Q_3$ are symmetric $4 \times 4 $ matrices.

Consider the matrix $Q_\alpha = Q_1 + \alpha Q_2 $

Solutions of $\mathbf{r}^T Q_1 \mathbf{r} = 0$ and $\mathbf{r}^T Q_2 \mathbf{r} = 0$ are obviously solutions of $\mathbf{r}^T Q_\alpha \mathbf{r} = 0 $.

Now by choosing $\alpha $ such that $| Q_\alpha | = 0 $ we can simplify finding solution candidates. The determinant $| Q_\alpha | $ is a quartic polynomial in $\alpha$ and we can find its roots numerically or through a closed-form formula. Let $\alpha_1$ be one of these roots, and let

$Q^* = Q_1 + \alpha_1 Q_2 $

Then $Q^*$ is rank deficient and, being symmetric, it can be diagonalized as

$Q^* = R D R^T $

where

$D = \begin{bmatrix} D_{11} & 0 & 0 & 0 \\ 0 & D_{22} & 0 & 0 \\ 0 & 0 & D_{33} & 0 \\ 0 & 0 & 0 & 0 \end{bmatrix} $

And $R$ is an orthogonal matrix.
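In code, this diagonalization (with the near-zero eigenvalue moved to the last diagonal slot to match the layout of $D$ above) might look like the following sketch:

```python
import numpy as np

def diagonalize_singular(Qstar):
    """Eigendecomposition Qstar = R D R^T with the (near-)zero eigenvalue placed last."""
    w, R = np.linalg.eigh(Qstar)          # Qstar symmetric: w real, R orthogonal
    order = np.argsort(np.abs(w))[::-1]   # largest |eigenvalue| first, smallest last
    w, R = w[order], R[:, order]
    return R, np.diag(w)
```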

Now we want to solve

$ \mathbf{r}^T R D R^T \mathbf{r} = 0 $

Define $\mathbf{u} = R^T \mathbf{r}$, equivalently, $\mathbf{r} = R \mathbf{u}$, then

$\mathbf{u}^T D \mathbf{u} = 0$

Depending on $D_{11}, D_{22}, D_{33}$ we have several cases to consider. I will discuss two important cases:

  • Case I: $D_{11}, D_{22} \gt 0 , D_{33} \lt 0 $
  • Case II: $D_{11} \gt 0, D_{22} \lt 0 , D_{33} = 0 $

For Case I, the candidate solution for $\mathbf{u}$ is

$ \mathbf{u} = \begin{bmatrix} \dfrac{1}{\sqrt{D_{11}}} t \cos(\theta) \\ \dfrac{1}{\sqrt{D_{22}}} t \sin(\theta) \\ \dfrac{ t }{\sqrt{- D_{33}}} \\ r \end{bmatrix}$

where $t, \theta, r $ are real parameters (the scalar $r$ here should not be confused with the point vector $\mathbf{r}$).
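As a small sketch, the Case I parameterization can be coded as a helper that satisfies $\mathbf{u}^T D \mathbf{u} = 0$ identically:

```python
import numpy as np

def u_case1(t, theta, r, D):
    """Case I parameterization of u (D11, D22 > 0, D33 < 0, last eigenvalue 0)."""
    D11, D22, D33 = D[0, 0], D[1, 1], D[2, 2]
    return np.array([t * np.cos(theta) / np.sqrt(D11),
                     t * np.sin(theta) / np.sqrt(D22),
                     t / np.sqrt(-D33),
                     r])
```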

For Case II, the candidate solution for $\mathbf{u}$ is

$\mathbf{u} = \begin{bmatrix} \dfrac{t}{\sqrt{D_{11}}} \\ \dfrac{\pm t }{\sqrt{- D_{22}}} \\ s \\ r \end{bmatrix} $

where $t, s, r$ are real parameters.

There are three conditions on the parameters that will determine their values:

  1. The fourth entry of $\mathbf{r}$ is $1$. This translates into $$ [0, 0, 0, 1]R \mathbf{u} = 1 $$
  2. $\mathbf{r}^T Q_1 \mathbf{r} = 0 $. This translates into $$ \mathbf{u}^T R^T Q_1 R \mathbf{u} = 0 $$ Note that this condition ensures that $\mathbf{r}^T Q_2 \mathbf{r} = 0$ as well.
  3. $\mathbf{r}^T Q_3 \mathbf{r} = 0$. This translates into $$\mathbf{u}^T R^T Q_3 R \mathbf{u} = 0 $$
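These three conditions can be written directly as residuals of a candidate $\mathbf{u}$ (a sketch; any root-finding scheme then drives all three to zero):

```python
import numpy as np

def conditions(u, R, Q1, Q3):
    """Residuals of the three conditions for a candidate u (with r = R @ u).

    All three vanish exactly when r is a valid intersection point; condition 2
    on Q1 also covers Q2, because u already satisfies u^T D u = 0 by construction."""
    r = R @ u
    return np.array([
        r[3] - 1.0,     # fourth entry of r equals 1
        r @ Q1 @ r,     # lies on the first quadric
        r @ Q3 @ r,     # lies on the third quadric
    ])
```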

There are many details involved in these three conditions.

Let's take Case I first. Applying the first condition to $\mathbf{u}$, we get the equation

$t(a \cos(\theta) + b \sin(\theta) + c ) + dr = 1 \hspace{15pt} (I.1) $

where $a,b,c,d$ are constants that depend on the entries of the fourth row of the matrix $R$ and on $D_{11}, D_{22}, D_{33}$ in a straightforward way.
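For example, with the Case I parameterization above and $R_{4j}$ denoting the entries of the fourth row of $R$, these constants work out to

$$ a = \frac{R_{41}}{\sqrt{D_{11}}}, \quad b = \frac{R_{42}}{\sqrt{D_{22}}}, \quad c = \frac{R_{43}}{\sqrt{-D_{33}}}, \quad d = R_{44} $$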

The second of these conditions will result in the following equation

$t^2 f_2(\theta) + t r f_1 (\theta) + r^2 f_0 = 0 \hspace{15pt} (I.2) $

where

$f_2(\theta) = C_1 \cos(\theta) + C_2 \sin(\theta) + C_3 \cos(2 \theta) + C_4 \sin(2 \theta) + C_5 \\ f_1(\theta) = C_6 \cos(\theta) + C_7 \sin(\theta) + C_8 \\ f_0 = C_9$

where $\{C_i\},\ i = 1, \dots, 9$, are constants that depend on the entries of the matrix $R^T Q_1 R$ and on the parameterization given above for $\mathbf{u}$.
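Numerically, $f_2(\theta), f_1(\theta), f_0$ can be evaluated directly from $S = R^T Q_1 R$ without expanding the constants $C_i$; a sketch:

```python
import numpy as np

def f_coeffs(theta, S, D):
    """Values of f2(theta), f1(theta), f0 for the quadratic form S = R^T Q R,
    using the Case I parameterization u = [t*v1, t*v2, t*v3, r]."""
    v = np.array([np.cos(theta) / np.sqrt(D[0, 0]),
                  np.sin(theta) / np.sqrt(D[1, 1]),
                  1.0 / np.sqrt(-D[2, 2])])
    f2 = v @ S[:3, :3] @ v       # coefficient of t^2
    f1 = 2.0 * (v @ S[:3, 3])    # coefficient of t*r
    f0 = S[3, 3]                 # coefficient of r^2
    return f2, f1, f0
```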

An equation with a structure identical to the above can be derived by applying the third condition, and we end up with

$t^2 g_2(\theta) + t r g_1(\theta) + r^2 g_0 = 0 \hspace{15pt} (I.3) $

And we want to find $t, \theta, r $ that will satisfy $(I.1), (I.2)$, and $(I.3)$.

Using $(I.1)$, we can write $r$ in terms of $t$ and $\theta$ (assuming $d \neq 0$), substitute that into $(I.2)$ and $(I.3)$, and end up with the equations

$ t^2 F_2(\theta) + t F_1(\theta) + F_0 = 0 \hspace{15pt} (I.4)$

and

$t^2 G_2(\theta) + t G_1(\theta) + G_0 = 0 \hspace{15pt} (I.5) $
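One explicit form of these coefficients: writing $\ell(\theta) = a\cos\theta + b\sin\theta + c$ and substituting $r = \bigl(1 - t\,\ell(\theta)\bigr)/d$ into $(I.2)$, then clearing the overall factor $1/d^2$, gives

$$ F_2 = d^2 f_2 - d\,\ell f_1 + \ell^2 f_0, \quad F_1 = d f_1 - 2 \ell f_0, \quad F_0 = f_0 $$

and similarly for $G_2, G_1, G_0$ from $(I.3)$.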

If $t$ is to satisfy both equations $(I.4)$ and $(I.5)$, then $\theta$ must satisfy the following condition, which is the resultant of $(I.4)$ and $(I.5)$ with respect to $t$:

$(F_2 G_0 - F_0 G_2)^2 + (F_0 G_1 - G_0 F_1)(F_2 G_1 - F_1 G_2) = 0 \hspace{15pt}(I.6) $

To solve for $\theta$ using $(I.6)$, one can either transform it into a polynomial of degree $8$ in the variable $ z = \tan \dfrac{\theta}{2} $ and find the real roots using one of the standard numerical methods for polynomial root finding, or one can (and this is what I did) scan the interval $[0, 2\pi) $ and use a simple bracketing-and-bisection method (for example) to find all the possible real roots. Once the roots $\theta$ are found, one can solve for the corresponding $t$ and $r$, thus determining the vector $\mathbf{u}$ and consequently the vector $\mathbf{r}$.
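A sketch of the scanning approach (assuming a callable `h(theta)` that evaluates the left-hand side of $(I.6)$; roots where $h$ touches zero without changing sign would need a finer treatment):

```python
import numpy as np

def scan_roots(h, n=2000, iters=80):
    """Bracket sign changes of h on [0, 2*pi] and refine each by bisection."""
    thetas = np.linspace(0.0, 2.0 * np.pi, n + 1)
    vals = np.array([h(t) for t in thetas])
    roots = []
    for a, b, fa, fb in zip(thetas[:-1], thetas[1:], vals[:-1], vals[1:]):
        if fa == 0.0:
            roots.append(a)
        elif fa * fb < 0.0:
            lo, hi, flo = a, b, fa
            for _ in range(iters):
                mid = 0.5 * (lo + hi)
                if flo * h(mid) <= 0.0:
                    hi = mid
                else:
                    lo, flo = mid, h(mid)
            roots.append(0.5 * (lo + hi))
    return roots
```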

That was a summary of the method for Case I.

Case II is much simpler: the first condition gives the equation of a plane in the variables $t, s, r$, which can be parameterized using only two variables (because it is a two-dimensional set). Applying the second and third conditions then leads to equations of conics in these two variables, so the problem reduces to finding the intersection of two conics, the solution of which can be found in my answer to this question.
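A sketch of that reduction (under the Case II assumptions above; `sign` selects the $\pm$ branch of the parameterization). It returns the two $3\times 3$ conic matrices in the homogeneous plane coordinates $w = [\lambda, \mu, 1]^T$, with $[t, s, r]^T = P w$:

```python
import numpy as np

def case2_conics(R, D, Q1, Q3, sign=+1.0):
    """Case II reduction: two conics in the plane parameters (lambda, mu)."""
    D11, D22 = D[0, 0], D[1, 1]
    # u = M @ [t, s, r]: the first two entries of u are tied together through t
    M = np.array([[1.0 / np.sqrt(D11),   0.0, 0.0],
                  [sign / np.sqrt(-D22), 0.0, 0.0],
                  [0.0,                  1.0, 0.0],
                  [0.0,                  0.0, 1.0]])
    n = R[3, :] @ M                      # condition 1 reads n @ [t, s, r] = 1 (a plane)
    p0 = n / (n @ n)                     # a particular point on that plane
    _, _, Vt = np.linalg.svd(n[None, :])
    v1, v2 = Vt[1], Vt[2]                # orthonormal basis of the plane's directions
    P = np.column_stack([v1, v2, p0])    # [t, s, r] = P @ [lam, mu, 1]
    S1 = M.T @ R.T @ Q1 @ R @ M
    S3 = M.T @ R.T @ Q3 @ R @ M
    return P.T @ S1 @ P, P.T @ S3 @ P, P, M
```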

Here is a link to an Excel file that contains the source code for the function "intersect_three_quadrics", which solves a system of three quadratic equations in three unknowns, as well as other functions that solve a system of two quadratic equations in two unknowns. These are included in the file as macros (VBA script). Click on the link to open the online file, then click on "Editing" and choose "Open in Desktop App". This will open the file in your desktop Excel program. Click "View" and then select "Macros".

  • Your equation (I.6) reminds me of a general and useful formula for the matrix associated with the (generalised) cone issued from a point exterior to a certain quadric and tangent to that quadric, which I explain in an answer of mine here. – Jean Marie Sep 30 '22 at 08:29
  • That's interesting. –  Sep 30 '22 at 08:38