
"The main purpose of this note is pedagogical."

i.e. $$a^{2}+ b^{2}+ c^{2}- ab- bc- ca=\left ( c- a \right )^{2}- \left ( a- b \right )\left ( b- c \right )$$ $$b^{2}- 4ac=\left ( c- a \right )^{2}- \left ( a- b+ c \right )\left ( a+ b+ c \right )$$ The right-hand sides have fewer multiplication signs than the left-hand sides. The most famous application of such results that I know of is reducing the running time of programs. I wonder what method Strassen used to minimize the number of multiplication signs like that. I would really appreciate any help, thank you!
Her view (@VeronicaPhan, June 9 '21): the result, however, should not depend on which decomposition is chosen. The only solution is to find the perfectly fitting decomposition in each scenario.
For the above example of mine, that is $a^{2}+ b^{2}+ c^{2}- ab- bc- ca= M^{2}+ N\left ( M+ N \right )\quad{\rm with}\;M:=c- a\;{\rm and}\;N:=a- b.$ Interpolative decomposition kills it in an unnatural way $$\begin{bmatrix} 1 & -1 & 0\\ 0 & 1 & -1\\ -1 & 0 & 1 \end{bmatrix}= \begin{bmatrix} 1 & -1\\ 0 & 1\\ -1 & 0 \end{bmatrix}\begin{bmatrix} 1 & 0 & -1\\ 0 & 1 & -1 \end{bmatrix}= \begin{bmatrix} -1 & 1\\ 0 & -1\\ 1 & 0 \end{bmatrix}\begin{bmatrix} -1 & 0 & 1\\ 0 & -1 & 1 \end{bmatrix}\!.$$ Hence $a^{2}+ b^{2}+ c^{2}- ab- bc- ca= \left ( c- a \right )\!^{\!2}- \left ( a- b \right )\!\left ( b- c \right )\!.$
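Both identities in the question are easy to sanity-check mechanically. Here is a small Python spot-check of my own (a sketch; a CAS `expand` would confirm them symbolically):

```python
from itertools import product

# Spot-check both identities on a grid of integer triples.
for a, b, c in product(range(-3, 4), repeat=3):
    assert a**2 + b**2 + c**2 - a*b - b*c - c*a \
        == (c - a)**2 - (a - b)*(b - c)
    assert b**2 - 4*a*c == (c - a)**2 - (a - b + c)*(a + b + c)
print("both identities hold on all sampled triples")
```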

  • 7
    These issues are very special, and, as you are still a student, I don't advise you to lose time on them. Moreover, with new architectures, where multiplication is not that much more time-consuming than addition, this research has lost part of its practical appeal... – Jean Marie Dec 21 '20 at 09:04
  • 4
    Note that the rhs of the second expression is not much more efficient than its left-side counterpart, since one multiplication on the left is not a "real multiplication" operation, but is in fact multiplication by a constant. Even more, the constant ($4$) is a power of $2$, so the rhs may not be more efficient when $a,b,c$ represent matrices or multi-precision numbers. – g.kov Jan 18 '21 at 15:29
  • 2
    I remain convinced that you should edit your question to focus on polynomials, otherwise your question is extremely, extremely broad and vague, which weakens it. – Rodrigo de Azevedo Mar 07 '21 at 01:45
  • 1
    It is indeed "strange" because there are infinitely many admissible matrices and you picked a "non-nice" one. Amusingly, it seems that I accidentally answered your question when answering the other question. – Rodrigo de Azevedo Mar 07 '21 at 02:56
  • 1
    Have you taken a look at this answer of mine? In it, I use nuclear norm minimization in CVXPY to find a terse SOS decomposition. – Rodrigo de Azevedo Mar 07 '21 at 02:58
  • 1
    You can use the same rank-minimization approach here. It's a generalization of SOS, but needs neither symmetry nor positive semidefiniteness. – Rodrigo de Azevedo Mar 07 '21 at 03:28

1 Answer


$$b^2 - 4ac = \begin{bmatrix} a\\ b \\ c\end{bmatrix}^\top \begin{bmatrix} 0 & t_1 & t_2 - 4 \\ -t_1 & 1 & t_3 \\ -t_2 & - t_3 & 0\end{bmatrix} \begin{bmatrix} a\\ b \\ c\end{bmatrix} $$

We would like to find $t_1, t_2, t_3 \in \Bbb R$ such that the rank of the matrix is minimized. The easy first step would be to force the matrix to be rank-$2$ by making its determinant vanish. Note that

$$\det \begin{bmatrix} 0 & t_1 & t_2 - 4 \\ -t_1 & 1 & t_3 \\ -t_2 & - t_3 & 0\end{bmatrix} = t_2 (t_2 - 4) - 4 t_1 t_3$$
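The "visual inspection" step below can also be sketched as a brute-force search over small integer triples (a quick illustration of mine, not the method used in the answer):

```python
from itertools import product

# Find small integer triples (t1, t2, t3) for which
# det = t2*(t2 - 4) - 4*t1*t3 vanishes.
hits = [(t1, t2, t3)
        for t1, t2, t3 in product(range(-2, 3), repeat=3)
        if t2*(t2 - 4) - 4*t1*t3 == 0]
print(hits)  # (-1, 2, 1) is among the solutions found
```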

Via visual inspection, after some work, we conclude that $(t_1, t_2, t_3) = \color{blue}{(-1, 2, 1)}$ does make the determinant vanish. Via the eigendecomposition,

$$\begin{bmatrix} 0 & -1 & -2\\ 1 & 1 & 1\\ -2 & -1 & 0\end{bmatrix} = \begin{bmatrix} \color{red}{-1}\\ 0 \\ \color{blue}{1}\end{bmatrix} \begin{bmatrix} \color{red}{-1}\\ 0 \\ \color{blue}{1}\end{bmatrix}^\top - \begin{bmatrix} \color{red}{1}\\ \color{magenta}{-1} \\ \color{blue}{1}\end{bmatrix} \begin{bmatrix} \color{red}{1}\\ \color{magenta}{1} \\ \color{blue}{1}\end{bmatrix}^\top$$

and, thus,

$$b^2 - 4 a c = \left( \color{blue}{c} - \color{red}{a} \right)^2 - \left( \color{red}{a} \color{magenta}{- b} + \color{blue}{c} \right) \left( \color{red}{a} + \color{magenta}{b} + \color{blue}{c} \right)$$


SymPy code

>>> from sympy import *
>>> Q = Matrix([[ 0,-1,-2],
...             [ 1, 1, 1],
...             [-2,-1, 0]])
>>> Q.rank()
2
>>> (V,D) = Q.diagonalize()
>>> 
>>> V
Matrix([
[ 1,  1, -1],
[-1, -2,  0],
[ 1,  1,  1]])
>>> 
>>> D
Matrix([
[-1, 0, 0],
[ 0, 0, 0],
[ 0, 0, 2]])
>>> 
>>> V**-1
Matrix([
[   1,  1,    1],
[-1/2, -1, -1/2],
[-1/2,  0,  1/2]])
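From $V$, $D$, and $V^{-1}$ above, the rank-$2$ splitting $Q = \sum_i \lambda_i v_i w_i^\top$ can be rebuilt directly (a small sketch of mine using the same SymPy objects):

```python
from sympy import Matrix

Q = Matrix([[ 0, -1, -2],
            [ 1,  1,  1],
            [-2, -1,  0]])
V, D = Q.diagonalize()  # Q = V * D * V**-1
W = V.inv()

# Sum the rank-one terms lambda_i * v_i * w_i^T, where v_i is the
# i-th column of V and w_i^T is the i-th row of V**-1; only the
# nonzero eigenvalues contribute.
R = Matrix.zeros(3, 3)
for i in range(3):
    R += D[i, i] * V[:, i] * W[i, :]
assert R == Q
```

Absorbing the scalar $\lambda$ of each surviving term into its vectors yields the two rank-one matrices in the displayed decomposition.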
  • How do we obtain $$\begin{bmatrix} 0 & -1 & -2\\ 1 & 1 & 1\\ -2 & -1 & 0 \end{bmatrix}= \begin{bmatrix} -1\\ 0\\ 1 \end{bmatrix}\begin{bmatrix} -1\\ 0\\ 1 \end{bmatrix}^{\top}- \begin{bmatrix} 1\\ -1\\ 1 \end{bmatrix}\begin{bmatrix} 1\\ 1\\ 1 \end{bmatrix}^{\top}{\it ?}$$ –  Jul 15 '21 at 09:05
  • 1
    @Da-iCE From the eigendecomposition. Take a look at the 1st and 3rd columns of eigenvector matrix $V$ and the 1st and 3rd rows of $V^{-1}$. – Rodrigo de Azevedo Jul 15 '21 at 11:07
  • Roger that. Sir, what about polynomials of higher degree? Also, since @haidangel discussed Strassen's algorithm so much, what is your plan for solving that? –  Jul 15 '21 at 11:59
  • 1
    @Da-iCE Perhaps this provides an example of what is possible. – Rodrigo de Azevedo Jul 15 '21 at 12:10