
I'm very interested in the answer posted by p.s. to this question. I have trouble understanding the conclusion:

This can be solved numerically via a 1-dimensional root-finding method.

Could someone please help me understand what this is referring to?

Also, is this the right way to ask? I'm sad I can't directly ask the user or comment on the original question.

Tripo
  • Why can't you ask?! It was posted years ago, but the person is still around ("last seen this week"). –  Apr 30 '25 at 19:33
  • Because I have no idea how to contact them, except posting an answer to the post (I can't comment), which is discouraged. – Tripo Apr 30 '25 at 20:03

1 Answer


The related question says this:


[... w]here $y^*$ is the value satisfying the following: $$ 1 = P_2\left( \Pi_{D_1} \left( \begin{bmatrix} Z & z \\ z^T & y^* \end{bmatrix} \right) \right) $$

This can be solved numerically via a 1-dimensional root-finding method.

I'm not sure if it's possible to avoid calculating a full eigenvalue decomposition every time you project onto $D_1$.


On the right-hand side, you have an expression that involves a number called $y^{*}$. Let me rewrite it using $s$, and slightly shuffle things while I'm at it:

$$ 0 = P_2\left( \Pi_{D_1} \left( \begin{bmatrix} Z & z \\ z^T & s \end{bmatrix} \right) \right) - 1 $$

Now on the right we have something that depends on the number $s$ --- let's write $$ f(s) = P_2\left( \Pi_{D_1} \left( \begin{bmatrix} Z & z \\ z^T & s \end{bmatrix} \right) \right) - 1 $$ and we're hoping to find a value of $s$ with $f(s) = 0$. Such a thing is called a root of $f$, and a pretty typical approach to finding a root is something like bisection:

First, you find a value $s_1$ of $s$ where $f(s_1) < 0$; then you find another where $f(s_2) > 0$. For the sake of argument, I'm going to say that $s_1 < s_2$.

Assuming $f$ is a continuous function of $s$, the intermediate value theorem tells us there's some number $q$ between $s_1$ and $s_2$ with $f(q) = 0$. This is where bisection comes in. We repeatedly do the following:

  1. Let $t = \frac{s_1 + s_2}{2}$.
  2. Let $c = f(t)$. If $c > 0$, replace $s_2$ with $t$ (i.e., set $s_2$ to be the number $t$). If instead $c < 0$, replace $s_1$ with $t$. If $c = 0$, then you've hit the jackpot, because you have found a number $t$ with $f(t) = 0$.

Each time you repeat this process, the length of the interval $[s_1, s_2]$ is divided in half. Do it ten times, and the interval shrinks by a factor of 1024. Do it 100 times, and it shrinks by a factor of more than $10^{30}$. At that point, whatever mechanism you're computing with will probably be running out of precision, so you stop and report the last found value of $t$ as a very good approximation of $y^{*}$.
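The loop above can be sketched in a few lines of Python. This is a generic bisection routine, not code specific to the projections $P_2$ and $\Pi_{D_1}$ from the question; you'd pass in whatever function computes $f(s)$. It assumes you've already found $s_1, s_2$ with $f(s_1) < 0 < f(s_2)$.

```python
def bisect(f, s1, s2, tol=1e-12, max_iter=200):
    """Find a root of f in [s1, s2], assuming f(s1) < 0 < f(s2).

    Each iteration halves the interval; stop when it's shorter
    than tol, when we hit an exact zero, or after max_iter steps.
    """
    for _ in range(max_iter):
        t = 0.5 * (s1 + s2)   # midpoint of the current interval
        c = f(t)
        if c == 0.0 or (s2 - s1) < tol:
            return t
        if c > 0:
            s2 = t            # root lies in the left half
        else:
            s1 = t            # root lies in the right half
    return 0.5 * (s1 + s2)
```

For example, `bisect(lambda s: s*s - 2, 1.0, 2.0)` converges to $\sqrt{2} \approx 1.41421$, since $s^2 - 2$ is negative at $s = 1$ and positive at $s = 2$.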

BTW, bisection is a really basic root-finding algorithm; it converges slowly compared to, say, Newton's method, but it's very simple and robust, which has some charm.

You might be wondering about that first step -- finding values of $s$ that make $f$ positive and make it negative. I think that in this case you can do that by trying more and more positive values for $s$, and then more and more negative values, until you get ones with opposite signs, but to be honest, I haven't looked carefully enough at the definitions of the projection maps to be certain that this is correct.

John Hughes