Let

$${\bf M} = \begin{pmatrix} m_1 & m_2 &\cdots & m_\ell \\ m_2 & m_3 &\cdots & m_{\ell+1} \\ \vdots & \vdots &\ddots & \vdots \\ m_\ell & m_{\ell+1} &\cdots & m_{2\ell-1} \end{pmatrix} \in (\mathbb{F}_q)^{\ell \times \ell}$$

be a general Hankel matrix. If there exist $m \in \mathbb{F}_q$, $\boldsymbol{\omega} \in \mathbb{F}_q^\ell$ and ${\bf y} \in \mathbb{F}_q^\ell$ such that

$${\bf M} \boldsymbol{\omega} = \begin{pmatrix} m_{\ell+1}\\ \vdots\\ m_{2\ell-1}\\ m \end{pmatrix}, \qquad {\bf M} {\bf y} = \begin{pmatrix} {\bf 0}_{\ell-1}\\ 1 \end{pmatrix}$$

then $\bf M$ is nonsingular.

  1. I attempted a proof by contradiction. Assuming that the matrix is singular, there exists a non-zero vector $\bf x$ such that ${\bf M x} = {\bf 0}$. By leveraging the given condition on ${\bf My}$, I can deduce that the last component of $\bf x$ must be zero (the computation is spelled out after this list). My goal is to derive a contradiction by showing that this $\bf x$ must necessarily be the zero vector. However, I have yet to find an appropriate method to reach this conclusion.

  2. However, my approach does not make use of the properties of Hankel matrices, and I am a beginner in this area. Could you kindly provide some guidance and relevant papers for me to study in order to resolve this issue? I would be very grateful!
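
For completeness, here is the deduction from item 1 made explicit; it uses the symmetry of Hankel matrices ($\phantom{ }^t{\bf M}={\bf M}$). If ${\bf Mx}={\bf 0}$, then

$$x_\ell = \phantom{ }^t{\bf e}_\ell\,{\bf x} = \phantom{ }^t({\bf M}{\bf y})\,{\bf x} = \phantom{ }^t{\bf y}\,{\bf M}{\bf x} = 0,$$

where ${\bf e}_\ell$ denotes the last standard basis vector of $\mathbb{F}_q^\ell$; hence the last component of ${\bf x}$ vanishes.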


Motivation

This problem comes from coding theory, namely, the decoding of Reed-Solomon codes.

1 Answer

That was a challenging question. It seems that the statement holds over any field. Here is a proof:

Let us proceed by induction on $\ell$. The case $\ell=1$ is trivial (and $\ell=2$ is easy).
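
To spell out the smallest case: for $\ell=1$ the matrix is ${\bf M}=(m_1)$, the first condition is automatic, and the second condition reads

$$m_1y_1=1,$$

so $m_1\neq0$ and ${\bf M}$ is invertible.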

Suppose now that the property has been proven for all sizes up to $\ell-1$.

Let $t$ denote the maximal index such that $y_t\neq 0$; such an index exists, since ${\bf My}\neq{\bf 0}$ forces ${\bf y}\neq{\bf 0}$. (We will reduce the proof to the case of size $t-1$.) The existence of ${\bf y}$ yields:

$$\begin{cases} \sum_{i=1}^tm_{k+i-1}y_i=0 & \textrm{if}\ k<\ell\\ \sum_{i=1}^tm_{\ell+i-1}y_i=1 & \textrm{if}\ k=\ell\end{cases}$$

But these equalities also give some coordinates of the images of the down-shifts of ${\bf y}$. Denoting $${\bf T}=\begin{pmatrix} 1 & 0 & \dots & 0 & y_1 & 0 & \dots & 0\\ 0 & 1 & \ddots & \vdots & y_2 & y_1 & & \vdots\\ & & \ddots & 0 & \vdots & y_2 & \ddots & 0\\ & & \ddots & 1 & y_{t-1} & \vdots & \ddots & y_1\\ \vdots & & & 0 & y_t & y_{t-1} & \vdots & y_2\\ & & & & \ddots & \ddots & \ddots & \vdots \\ & & & & & 0 & y_t & y_{t-1}\\ 0 & & & \dots & & & 0 & y_t \end{pmatrix}$$ (its first $t-1$ columns are those of the identity, and its last $\ell-t+1$ columns are the successive down-shifts of ${\bf y}$), one has: $${\bf M}{\bf T}= \begin{pmatrix} m_1 & \dots & m_{t-1} & 0 & & 0\\ m_2 & & m_t & 0 & & 0\\ & & & & & \vdots\\ \vdots & & \vdots & \vdots & & 0\\ & & & & \diagup & 1\\ & & & 0 & \diagup & *\\ m_\ell & \dots & m_{\ell+t-2} & 1 & * & * \end{pmatrix},$$ where $\diagup$ stands for an anti-diagonal of $1$'s and the $*$ entries are unconstrained. Let us denote ${\bf R}={\bf M}{\bf T}$.
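
As an illustration (not part of the proof), take $\ell=3$ and $t=2$, so that ${\bf y}=\phantom{ }^t(y_1,y_2,0)$ with $y_2\neq0$. Then

$${\bf T}=\begin{pmatrix}1 & y_1 & 0\\ 0 & y_2 & y_1\\ 0 & 0 & y_2\end{pmatrix},\qquad {\bf R}={\bf M}{\bf T}=\begin{pmatrix}m_1 & 0 & 0\\ m_2 & 0 & 1\\ m_3 & 1 & *\end{pmatrix},$$

where the $0$'s and $1$'s come from the relations above, and $*=m_4y_1+m_5y_2$ is unconstrained.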

Since ${\bf T}$ is upper-triangular, with $\det({\bf T})=y_t^{\ell+1-t}\neq0$, it is invertible and we can give a description of its inverse: $${\bf T}^{-1}=\begin{pmatrix} 1 & 0 & \dots & 0 & * &\dots & *\\ 0 & \ddots & \ddots & \vdots & & & \\ & \ddots & 1 & 0 & \vdots\\ & & 0 & 1 & * & & \vdots\\ \vdots & & & 0 & y_t^{-1} & \ddots & \\ & & & & \ddots & \ddots & *\\ 0 & & & \dots & & 0 & y_t^{-1} \end{pmatrix}.$$ And naturally, one has ${\bf M}={\bf R}{\bf T}^{-1}$.
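
In the $\ell=3$, $t=2$ illustration one computes directly

$${\bf T}^{-1}=\begin{pmatrix}1 & -y_1y_2^{-1} & y_1^2y_2^{-2}\\ 0 & y_2^{-1} & -y_1y_2^{-2}\\ 0 & 0 & y_2^{-1}\end{pmatrix},$$

which indeed carries $1$'s, then $y_t^{-1}$'s, on the diagonal as described.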

Expanding the determinant of ${\bf R}$ along the last columns (from $t$ to $\ell$), one notices that $\det({\bf R})=(-1)^*\det({\bf M}')$ for some sign $(-1)^*$, where ${\bf M}'$ is the leading $(t-1)\times(t-1)$ Hankel block of ${\bf M}$: $${\bf M}'=\begin{pmatrix} m_1 & m_2 & \dots & m_{t-1}\\ m_2 & m_3 & \diagup & m_t\\ \vdots& \diagup & \diagup & \vdots\\ m_{t-1} & m_t & \dots & m_{2t-3}\\ \end{pmatrix}.$$ So $\det({\bf M})=(-1)^*y_t^{-(\ell+1-t)}\det({\bf M}')$ and ${\bf M}$ is nonsingular if and only if ${\bf M}'$ is.
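
In the illustration: expanding along the last two columns gives $\det({\bf R})=-m_1$, hence

$$\det({\bf M})=\det({\bf R})\det({\bf T})^{-1}=-m_1y_2^{-2},\qquad {\bf M}'=(m_1),$$

so ${\bf M}$ is nonsingular exactly when ${\bf M}'$ is.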

It remains to prove that ${\bf M}'$, of size $t-1$, also satisfies the two conditions of the statement.

-- If we rewrite just the first $t-1$ rows of the product ${\bf M}={\bf R}{\bf T}^{-1}$, one has: $${\bf M}_{[1,t-1]}=\begin{pmatrix} m_1 & \dots & m_{t-1} & m_t & \dots & m_\ell\\ m_2 & & m_t & m_{t+1} & & m_{\ell+1}\\ \vdots & & \vdots & \vdots & & \vdots \\ & & & & & \\ m_{t-1} & \dots & m_{2t-3} & m_{2t-2} & \dots & m_{\ell+t-2}\\ \end{pmatrix}= \begin{pmatrix} m_1 & \dots & m_{t-1} & 0 & \dots & 0\\ m_2 & & m_t & 0 & & 0\\ \vdots & & \vdots & \vdots & & \vdots\\ & & & & & \\ m_{t-1} & \dots & m_{2t-3} & 0 & \dots & 0\\ \end{pmatrix}{\bf T}^{-1}.$$ In particular the $t$-th column, $\phantom{ }^t(m_t,\dots,m_{2t-2})$, is a linear combination of the columns of ${\bf M}'$: that is the first hypothesis for ${\bf M}'$, with $m'=m_{2t-2}$. But the same is true of all the other columns. Moreover, the existence of $\boldsymbol{\omega}$ for ${\bf M}$ ensures that the column $\phantom{ }^t(m_{\ell+1},\dots, m_{\ell+t-1})$, formed by the first $t-1$ coordinates of ${\bf M}\boldsymbol{\omega}$, is also a linear combination of the columns of ${\bf M}'$.
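
In the illustration, $t-1=1$ and ${\bf M}_{[1,t-1]}$ is just the first row of ${\bf M}$:

$$(m_1\ \ m_2\ \ m_3)=(m_1\ \ 0\ \ 0)\,{\bf T}^{-1},$$

which gives $m_2=-m_1y_1y_2^{-1}$ and $m_3=m_1y_1^2y_2^{-2}$: every entry is a multiple of $m_1$, i.e., a combination of the columns of ${\bf M}'=(m_1)$.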

-- Similarly, let us write the rows $2$ to $t$ of the product ${\bf M}={\bf R}{\bf T}^{-1}$: $${\bf M}_{[2,t]}=\begin{pmatrix} m_2 & \dots & m_t & m_{t+1} & \dots & m_{\ell+1}\\ m_3 & & m_{t+1} & m_{t+2} & & m_{\ell+2}\\ \vdots & & \vdots & \vdots & & \vdots \\ & & & & & \\ m_t & \dots & m_{2t-2} & m_{2t-1} & \dots & m_{\ell+t-1}\\ \end{pmatrix}= \begin{pmatrix} m_2 & \dots & m_t & 0 & \dots & 0\\ m_3 & & m_{t+1} & 0 & & 0\\ \vdots & & \vdots & \vdots & & \vdots \\ & & & & & 0\\ m_t & \dots & m_{2t-2} & 0 & \dots & 1\\ \end{pmatrix}{\bf T}^{-1}.$$ Focus on the last column of ${\bf M}_{[2,t]}$, namely $\phantom{ }^t(m_{\ell+1},\dots,m_{\ell+t-1})$. In the product on the right, the column $\phantom{ }^t(0,\dots, 0,1)$ receives the non-zero coefficient $y_t^{-1}$ (the bottom-right entry of ${\bf T}^{-1}$), while every other contributing column is a linear combination of the columns of ${\bf M}'$ (the first $t-2$ of them are columns of ${\bf M}'$, and $\phantom{ }^t(m_t,\dots,m_{2t-2})$ lies in their span by the previous point). Since $\phantom{ }^t(m_{\ell+1},\dots,m_{\ell+t-1})$ is itself a linear combination of the columns of ${\bf M}'$, again by the previous point, so is $\phantom{ }^t(0,\dots,0,1)$. That's the second hypothesis.
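
In the illustration, rows $2$ to $t$ reduce to the second row of ${\bf M}$:

$$(m_2\ \ m_3\ \ m_4)=(m_2\ \ 0\ \ 1)\,{\bf T}^{-1},$$

whose last coordinate gives $m_4=m_2y_1^2y_2^{-2}+y_2^{-1}$. The existence of $\boldsymbol{\omega}$ makes $m_4$ (like $m_2$ and $m_3$) a multiple of $m_1$, hence $y_2^{-1}=m_4-m_2y_1^2y_2^{-2}$ is a multiple of $m_1$, forcing $m_1\neq0$: that is the second hypothesis for ${\bf M}'=(m_1)$.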

By the induction hypothesis, ${\bf M}'$ is nonsingular, and so is ${\bf M}$.