Can a matrix be regarded as a bivector? For example, consider the matrix acting on a vector in the linear system

$\begin{pmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{pmatrix}\begin{pmatrix} x_1 \\ x_2 \end{pmatrix}=\begin{pmatrix} b_1 \\ b_2 \end{pmatrix}$

If so, what do we call the operation of multiplying a bivector by a vector (i.e., the operation transforming the vector $(x_1, x_2)$ above)?

How does it relate to other operations within geometric algebra, such as the geometric product or the wedge product?

1 Answer


Since the question is only about bivectors and not the full Geometric Algebra package, we can make this post a little more accessible to a wider audience.

A simple bivector is the anti-symmetric part of a tensor (a.k.a. Kronecker, a.k.a. outer) product of two vectors: $$ a\wedge b=a\otimes b-b\otimes a=\begin{pmatrix}a_1b_1&\dots&a_1b_n\\\vdots&\ddots&\vdots\\a_nb_1&\dots&a_nb_n\end{pmatrix} -\begin{pmatrix}a_1b_1&\dots&a_nb_1\\\vdots&\ddots&\vdots\\a_1b_n&\dots&a_nb_n\end{pmatrix}\,. $$ Assuming $a,b$ are column vectors, this can also be written as $$ a\wedge b=a b^\top-b a^\top\,. $$ It is well known that, when $a\ne 0\ne b\,,$ $$ \operatorname{rank}(a b^\top)=\operatorname{rank}(b a^\top)=1\,. $$ Because the matrix $a\wedge b$ is skew symmetric and the rank of a skew-symmetric matrix is always even, we get, whenever $a$ and $b$ are linearly independent (so that $a\wedge b\ne 0$), $$ \operatorname{rank}(a\wedge b)\ge 2\,. $$ From another MSE post, rank is subadditive, so $$ \operatorname{rank}(a\wedge b)\le \operatorname{rank}(a b^\top)+\operatorname{rank}(b a^\top) =2\,, $$ and it now follows that $$\boxed{\quad\phantom{\Big|} \operatorname{rank}(a\wedge b)=2\,.\phantom{\Big|}\quad} $$
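
To make the rank computation concrete, here is a minimal numerical sketch (assuming NumPy is available; the vectors $a,b$ are arbitrary example choices, not from the post above):

```python
import numpy as np

# Two linearly independent vectors in R^4 (arbitrary example choices).
a = np.array([1.0, 2.0, 0.0, -1.0])
b = np.array([0.0, 1.0, 3.0, 2.0])

# The simple bivector represented as a matrix: a ∧ b = a b^T - b a^T.
wedge = np.outer(a, b) - np.outer(b, a)

# It is skew-symmetric ...
assert np.allclose(wedge, -wedge.T)

# ... and its rank is exactly 2, matching the boxed result above.
print(np.linalg.matrix_rank(wedge))  # prints: 2
```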

Conclusions
  • An arbitrary matrix is in general not a simple bivector, because a simple bivector must be skew symmetric.

  • Even a skew-symmetric matrix of dimension greater than two is a simple bivector only when its rank is two.

  • A skew symmetric $2\times 2$-matrix is always a simple bivector: $\begin{pmatrix}0&\alpha\\-\alpha&0\end{pmatrix}=(\alpha e_1)\wedge e_2\,.$

Now to general bivectors

  • A general bivector is a linear combination of simple bivectors. In particular, when $e_1,\dots,e_n$ are the canonical basis vectors, $$ \sum_{i,j=1}^n\alpha_{ij} \,e_i\wedge e_j $$ is a bivector. Since $e_i\otimes e_j$ is the $n\times n$-matrix whose only non-zero entry is a one at position $(i,j)$, it is clear that every matrix $(\alpha_{ij})$ can be represented as $$ \sum_{i,j=1}^n\alpha_{ij}\,e_i\otimes e_j\,. $$ When that matrix is anti-symmetric, i.e. $\alpha_{ji}=-\alpha_{ij}$, this sum can be written as $$ \sum_{i<j}\alpha_{ij}\,(e_i\otimes e_j-e_j\otimes e_i) =\sum_{i<j}\alpha_{ij}\,e_i\wedge e_j\,. $$ That is:

  • Every anti-symmetric $n\times n$-matrix is a (not necessarily simple) bivector, as the short numerical sketch below illustrates.
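
Here is that sketch (again assuming NumPy; the matrix entries are arbitrary example values): it rebuilds a skew-symmetric $3\times 3$ matrix from the basis bivectors $e_i\wedge e_j$.

```python
import numpy as np

n = 3
# An arbitrary skew-symmetric matrix (alpha_ji = -alpha_ij).
A = np.array([[ 0.0,  2.0, -1.0],
              [-2.0,  0.0,  4.0],
              [ 1.0, -4.0,  0.0]])

e = np.eye(n)  # e[i] is the i-th canonical basis vector

# Sum over i < j of alpha_ij * (e_i ∧ e_j), with each wedge
# expanded as the matrix e_i e_j^T - e_j e_i^T.
B = sum(A[i, j] * (np.outer(e[i], e[j]) - np.outer(e[j], e[i]))
        for i in range(n) for j in range(i + 1, n))

# The bivector expansion reproduces the matrix exactly.
assert np.allclose(A, B)
```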

Kurt G.