
Please find the edited version of this question here: Asymptotic behavior of the minimum eigenvalue of a certain Gram matrix with linear independence. I will also put up a bounty for the edited version. Is there a lower bound for the determinant or minimum eigenvalue of the following $d$ by $d$ matrix in terms of $d$?

$$\Gamma=\left( {\begin{array}{cc} I & B \\ B^{*} & I \\ \end{array} } \right)$$ where $I$ is the identity matrix and the moduli of the entries of $B$ (and hence of its conjugate transpose $B^{*}$) are all equal to $\frac{1}{\sqrt{d}}$. The blocks are all $\frac{d}{2}$ by $\frac{d}{2}$. $\Gamma$ is a Gram matrix, and we further assume that its rows and columns are linearly independent. Hence we know that the minimum eigenvalue is strictly greater than zero, but can we say anything more?

For simplicity we can assume the matrix is real, so the entries of the off-diagonal blocks ($B$ and $B^{T}$) are $\pm\frac{1}{\sqrt{d}}$.
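For concreteness, here is a minimal numpy sketch of the setup (the seed and dimension are arbitrary). Note that a random sign pattern typically makes $\Gamma$ indefinite, which is why the linear-independence (positive-definiteness) assumption is a genuine restriction:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 16                                   # total dimension; blocks are d/2 by d/2

# Off-diagonal block: random signs scaled to magnitude 1/sqrt(d)
B = rng.choice([-1.0, 1.0], size=(d // 2, d // 2)) / np.sqrt(d)

# Gamma = [[I, B], [B^T, I]]
Gamma = np.block([[np.eye(d // 2), B],
                  [B.T, np.eye(d // 2)]])

print("min eigenvalue:", np.linalg.eigvalsh(Gamma).min())
print("determinant:   ", np.linalg.det(Gamma))
```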

I appreciate any input very much!

  • Are you saying that $B$ is a Gram matrix? And does $B^*$ mean that you take the conjugate of all entries, or is it the conjugate transpose? – Ben Grossmann Jun 18 '15 at 20:10
  • $B$ is not a Gram matrix; it is a Hadamard submatrix, if we want to put a name on it. $\Gamma$ is the Gram matrix. $B^{*}$ is the conjugate transpose of $B$. Thanks for pointing that out. So $\Gamma$ is Hermitian. – James Smithson Jun 18 '15 at 21:09
  • The matrix $\Gamma$ is not positive definite if all entries of $B$ are equal to $\frac1{\sqrt d}$. Since your assumption implies positive definiteness, a possible lower bound has to take into account the distribution of positive and negative entries in $B$, which seems to be a hard problem. – daw Jun 19 '15 at 06:41
  • @daw Yes, of course they can't all be equal to $\frac{1}{\sqrt{d}}$; their moduli are. To see this, consider a Gram matrix assigned to a linearly independent set of vectors chosen from two orthonormal sets $V_{1}$ and $V_{2}$; then we have $|\langle\omega_{i}|\nu_{j}\rangle|=\frac{1}{\sqrt{d}}$ for all $|\omega_{i}\rangle\in V_{1}$ and $|\nu_{j}\rangle\in V_{2}$. – James Smithson Jun 19 '15 at 10:39

2 Answers


Using one of the block matrix determinant formulas (via the Schur complement of the upper-left block), we find that $$ \det(\Gamma) = \det(I - BB^*) $$ So, let $s_1,s_2,\dots,s_{d/2}$ denote the singular values of $B$ in decreasing order. We have $$ \det(\Gamma) = \prod_{i=1}^{d/2} (1-s_i^2) $$ Now, we note that $B$ is a $d/2$ by $d/2$ matrix whose entries have magnitude $1/\sqrt{d}$. Using the Frobenius norm, we have the upper bound $$ \sigma_1(B) \leq \sqrt{\sum_{i,j =1}^{d/2} |B_{ij}|^2} = \sqrt{d/4} = \frac{\sqrt d}{2} $$ (We could similarly use any of the Schatten $p$-norms, or any other unitarily invariant norm.)

Thus, since $s_i^2 \leq d/4$ for every $i$, we have $$ \det(\Gamma) = \prod_{i=1}^{d/2} (1-s_i^2) \geq \prod_{i=1}^{d/2} \left(1-\frac {d}{4}\right) = \left(1-\frac {d}{4}\right)^{d/2} $$
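A quick numerical sanity check of the determinant identity and the Frobenius-norm bound (a minimal numpy sketch; the random sign pattern is only an example):

```python
import numpy as np

rng = np.random.default_rng(1)
d = 16
B = rng.choice([-1.0, 1.0], size=(d // 2, d // 2)) / np.sqrt(d)
Gamma = np.block([[np.eye(d // 2), B], [B.T, np.eye(d // 2)]])

s = np.linalg.svd(B, compute_uv=False)        # singular values of B, decreasing
print(np.isclose(np.linalg.det(Gamma), np.prod(1 - s**2)))   # det identity
print(s[0] <= np.linalg.norm(B, 'fro'))       # sigma_1 bounded by Frobenius norm
```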


In fact, I believe we get a similar bound (if not the same bound) using the Gershgorin circle theorem.
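For comparison, the Gershgorin interval is easy to check numerically: each row of $\Gamma$ has diagonal entry $1$ and $d/2$ off-diagonal entries of magnitude $1/\sqrt{d}$, giving radius $\frac{\sqrt d}{2}$ (a minimal numpy sketch):

```python
import numpy as np

rng = np.random.default_rng(2)
d = 16
B = rng.choice([-1.0, 1.0], size=(d // 2, d // 2)) / np.sqrt(d)
Gamma = np.block([[np.eye(d // 2), B], [B.T, np.eye(d // 2)]])

# Gershgorin radius: d/2 off-diagonal entries of size 1/sqrt(d) per row
radius = (d / 2) / np.sqrt(d)                 # = sqrt(d)/2
eigs = np.linalg.eigvalsh(Gamma)
print(eigs.min() >= 1 - radius and eigs.max() <= 1 + radius)
```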

Ben Grossmann
  • Using Gershgorin's theorem, again we have $\frac{d}{2}\cdot\frac{1}{\sqrt{d}}=\frac{\sqrt{d}}{2}$ as the upper bound for the maximum eigenvalue. – James Smithson Jun 18 '15 at 21:17
  • I may have made an error there – Ben Grossmann Jun 18 '15 at 21:26
  • I've fixed it; I used the wrong norm there. Note that there are $d^2/4$ entries. – Ben Grossmann Jun 18 '15 at 21:30
  • wait... nope, still wrong. Still at it – Ben Grossmann Jun 18 '15 at 21:33
  • All right, I think that's the best we can do. If we know that $B$ is symmetric, then we could use the entrywise $4$-norm and get a slightly nicer lower bound (the bound that I originally had). – Ben Grossmann Jun 18 '15 at 21:40
  • If $d=16$ for example, then your bound would yield a value greater than $1$, which is a contradiction: $1$ is an upper bound for the determinant, attained when all the off-diagonal entries are zero and we have the identity. Can you give me some reference for the entrywise $4$-norm being an upper bound for the largest singular value of a symmetric $B$? – James Smithson Jun 18 '15 at 21:50
  • Even in the beginning your statement is unclear: there are $\frac{d}{2}$ $s_{i}$'s, so how could you sum over $i=1$ to $d$? – James Smithson Jun 18 '15 at 22:13
  • @JamesSmithson that is an oversight... I guess I've been pretty sloppy with this answer. I can delete it if you think it'll help you get a better one. – Ben Grossmann Jun 18 '15 at 22:33
  • You'd help if you could up-vote the question maybe... but of course you don't have to. Thanks for your input anyway. I'll try to see if we can impose conditions on $B$ to derive a suitable upper-bound for its maximum singular value. – James Smithson Jun 18 '15 at 22:39
  • I found some results on matrices of $1$s and $0$s, but not anything about arbitrary matrices with $1$s and $-1$s. – Ben Grossmann Jun 18 '15 at 22:49
  • @daw We are sure that $s_{i}\leq 1$. This is because $I-BB^{*}$ is the Schur complement and positive definite (remember the rows and columns of $\Gamma$ are linearly independent, and hence form a non-orthonormal $d$-dimensional basis for the underlying vector space). But here we're interested in a lower bound away from zero. – James Smithson Jun 19 '15 at 10:18
  • We can apply the results on matrices with $1$s and $0$s here too: we normalize so that the first row of $B$ is all ones, then add it to the other rows and divide those rows by two. The determinant changes only by a sign and a known power of $2$, and the matrix will have $0$s and $1$s. @Omnomnomnom – James Smithson Jun 30 '15 at 14:13
  • Hmm, $\det(\Gamma)$ should be $\prod(1-s_i^{\color{red}{2}})$, not $\prod(1-s_i)$. – user1551 Jul 02 '15 at 16:07

I believe the best possible lower bound on the minimum eigenvalue is $0$: it can be made arbitrarily small (and, if linear dependence is allowed, exactly $0$).

The eigenvalues of $\Gamma=\left[\begin{array}{cc} I & B \\ B^{*} & I \end{array} \right]$ are $1\pm \sigma$, where $\sigma$ ranges over the singular values of $B$; each eigenvalue $1\pm\sigma$ inherits the multiplicity of the corresponding singular value. The eigenvectors are $\left(\begin{array}{c} u \\ \pm v \end{array}\right)$, where $u$ is a left singular vector of $B$ and $v$ is the corresponding right singular vector.
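This correspondence is easy to verify numerically (a minimal numpy sketch; the sign pattern is an arbitrary example):

```python
import numpy as np

rng = np.random.default_rng(3)
d = 16
B = rng.choice([-1.0, 1.0], size=(d // 2, d // 2)) / np.sqrt(d)
Gamma = np.block([[np.eye(d // 2), B], [B.T, np.eye(d // 2)]])

s = np.linalg.svd(B, compute_uv=False)        # singular values of B
expected = np.sort(np.concatenate([1 - s, 1 + s]))
print(np.allclose(np.sort(np.linalg.eigvalsh(Gamma)), expected))
```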

$B$ is rather constrained. In the real case, $\hat{B}=\sqrt{d}\,B$ is a matrix whose entries are all $\pm 1$. The eigenvalues of such a matrix can be as large in magnitude as $d/2$. Since $\Gamma$ is positive definite exactly when every singular value of $\hat{B}$ is strictly less than $\sqrt{d}$, we want to construct a family of (symmetric) matrices whose smallest eigenvalue is strictly greater than $-\sqrt{d}$ but as close as possible to that value.

I first found an example with $d=16$ where $\Gamma$ is positive semidefinite. (In my experience, computational experiments with other $d$ show it isn't hard to find $\hat{B}$ with minimum eigenvalue very close to $-\sqrt{d}$; a sketch of such an experiment follows.)
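Such an experiment might look like the following (a minimal numpy sketch of a random search, not the exact code used for the experiments; the trial count is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(4)
d = 16
smax_best = np.inf
for _ in range(20000):
    Bhat = rng.choice([-1.0, 1.0], size=(d // 2, d // 2))
    smax_best = min(smax_best, np.linalg.svd(Bhat, compute_uv=False)[0])

# Gamma is positive definite iff sigma_max(Bhat) < sqrt(d); if the margin
# below is positive, it is the smallest minimum eigenvalue of Gamma found.
print("best sigma_max:", smax_best, "target:", np.sqrt(d))
print("margin 1 - sigma_max/sqrt(d):", 1 - smax_best / np.sqrt(d))
```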

If you set: $$\hat{B}=\left[\begin{array}{cccccccc} -1 & 1 & 1 & 1 &-1 &-1 & 1 & 1 \\ 1 &-1 & 1 & 1 &-1 &-1 & 1 & 1 \\ 1 & 1 &-1 &-1 & 1 &-1 & 1 & 1 \\ 1 & 1 &-1 &-1 &-1 & 1 &-1 & 1 \\ -1 &-1 & 1 &-1 &-1 &-1 &-1 & 1 \\ -1 &-1 &-1 & 1 &-1 &-1 &-1 &-1 \\ 1 & 1 & 1 &-1 &-1 &-1 & 1 &-1 \\ 1 & 1 & 1 & 1 & 1 &-1 &-1 & 1 \\ \end{array}\right]$$ and $B=\hat{B}/4$, then $\Gamma$ is positive semidefinite. This $\Gamma$ is a Gram matrix. You can recover the basis using a Cholesky factorization $\Gamma=V^{*}V$. The factor $V$ has a special structure, $V=\left[\begin{array}{cc} I & B \\ 0 & M \end{array} \right]$, where $M$ is the Cholesky factor of $I-B^{*}B$; this factor is guaranteed to exist because $I-B^{*}B$ is positive semidefinite.
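A minimal numpy check of this example (it verifies positive semidefiniteness up to rounding; the matrix below is the $\hat{B}$ displayed above):

```python
import numpy as np

Bhat = np.array([
    [-1,  1,  1,  1, -1, -1,  1,  1],
    [ 1, -1,  1,  1, -1, -1,  1,  1],
    [ 1,  1, -1, -1,  1, -1,  1,  1],
    [ 1,  1, -1, -1, -1,  1, -1,  1],
    [-1, -1,  1, -1, -1, -1, -1,  1],
    [-1, -1, -1,  1, -1, -1, -1, -1],
    [ 1,  1,  1, -1, -1, -1,  1, -1],
    [ 1,  1,  1,  1,  1, -1, -1,  1],
], dtype=float)
B = Bhat / 4                                  # d = 16, so sqrt(d) = 4

Gamma = np.block([[np.eye(8), B], [B.T, np.eye(8)]])
print("min eigenvalue:", np.linalg.eigvalsh(Gamma).min())   # ~0: PSD, not PD
```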

I also constructed a family of $B$ which make $\Gamma$ positive semidefinite (and which can be altered to instead have vanishingly small eigenvalues). Take $n$ to be any multiple of 36. In the first column of $B$, set all $n/2$ entries to $\frac{1}{\sqrt{n}}$. In the second column, set the first $n/6$ entries to $\frac{1}{\sqrt{n}}$, and the remaining $n/3$ entries to $-\frac{1}{\sqrt{n}}$. In the third column, set the first $5n/36$ entries to $\frac{1}{\sqrt{n}}$, the second $n/36$ entries to $-\frac{1}{\sqrt{n}}$, the third $n/36$ entries to $\frac{1}{\sqrt{n}}$, and the remaining $11n/36$ entries to $-\frac{1}{\sqrt{n}}$. Fill out the rest of $B$ however you wish.

If you compute the Cholesky factorization, you'll find that the leading $3\times 3$ block of $M$ is: $$\left[\begin{array}{ccc} \frac{1}{\sqrt{2}} & \frac{\sqrt{2}}{6} & \frac{\sqrt{2}}{6} \\ 0 & \frac{2}{3} & -\frac{2}{3} \\ 0 & 0 & 0 \end{array}\right]$$ The last row is significant: it means we can find a linear combination of the first three columns of $\left[\begin{array}{c}B\\M\end{array}\right]$ whose $M$ entries are all zero. Glued with the columns of $B$ above, this combination lies in the span of the first $n/2$ vectors of the standard basis, so the Gram vectors are linearly dependent. Note the construction can be altered slightly to introduce an $\epsilon$ in the lower right entry; the entries of $M$ would be altered but not the entries of $B$. This would give vanishingly small eigenvalues to $\Gamma$.
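You can verify the displayed block numerically; a minimal numpy sketch with $n=36$. Only the first three columns of $B$ are needed, since the leading block of $M$ (the Cholesky factor of the leading block of $I-B^{*}B$) depends only on them:

```python
import numpy as np

n = 36                                        # any multiple of 36 works
s = 1 / np.sqrt(n)
b1 = np.full(n // 2, s)
b2 = np.concatenate([np.full(n // 6, s), np.full(n // 3, -s)])
b3 = np.concatenate([np.full(5 * n // 36, s), np.full(n // 36, -s),
                     np.full(n // 36, s), np.full(11 * n // 36, -s)])
Bcols = np.column_stack([b1, b2, b3])         # first three columns of B

# Leading 3x3 block of I - B^T B; its Cholesky factor is the block shown above.
G3 = np.eye(3) - Bcols.T @ Bcols
print("det of leading block:", np.linalg.det(G3))   # ~0, so M_33 = 0
```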

As a little bit of geometric insight as to what's going on: $B$ is a matrix of inner products. Its largest singular value measures the angle between two spaces: the span of the columns of $\left[\begin{array}{c}I\\0\end{array}\right]$ and the span of the columns of $\left[\begin{array}{c}B\\M\end{array}\right]$. That is, $\sigma_{\max}=\cos(\theta)$, where $\theta$ is the smallest principal angle between the spaces. The columns of $\left[\begin{array}{c}B\\0\end{array}\right]$ always lie in the span of the columns of $\left[\begin{array}{c}I\\0\end{array}\right]$. We can construct the entries of $M$ (which is upper triangular) so that they make the columns of $\left[\begin{array}{c}B\\M\end{array}\right]$ orthonormal, but so that there are one or more zeros on the diagonal (in which case the columns of $M$ are linearly dependent).
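A sketch of this geometric picture using scipy's `subspace_angles`. For simplicity it rescales a random Gaussian $B$ to have $\sigma_{\max}=0.9$ rather than using $\pm 1/\sqrt d$ entries; the identity $\sigma_{\max}=\cos(\theta_{\min})$ does not depend on the entries' magnitudes:

```python
import numpy as np
from scipy.linalg import subspace_angles

rng = np.random.default_rng(5)
k = 8                                         # block size d/2
B = rng.standard_normal((k, k))
B *= 0.9 / np.linalg.svd(B, compute_uv=False)[0]    # rescale: sigma_max(B) = 0.9

# Upper-triangular M with M^T M = I - B^T B, so [B; M] has orthonormal columns
M = np.linalg.cholesky(np.eye(k) - B.T @ B).T

E = np.vstack([np.eye(k), np.zeros((k, k))])  # columns span [I; 0]
W = np.vstack([B, M])                         # columns span [B; M]

theta_min = subspace_angles(E, W).min()       # smallest principal angle
print(np.isclose(np.cos(theta_min), 0.9))     # sigma_max = cos(theta_min)
```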

Questions

Are there other constraints you can put on $B$? For example, if $\hat{B}$ is a Hadamard matrix, then its singular values are all $\sqrt{d/2}$ (for a symmetric Hadamard matrix, the eigenvalues are $\pm \sqrt{d/2}$, each with multiplicity $d/4$). The eigenvalues of $\Gamma$ would then be $1\pm 1/\sqrt{2}$.
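A minimal check of the Hadamard case, using scipy's `hadamard` (Sylvester construction, so $d/2$ must be a power of $2$):

```python
import numpy as np
from scipy.linalg import hadamard

d = 16
Bhat = hadamard(d // 2)                       # 8x8 Hadamard matrix
B = Bhat / np.sqrt(d)
Gamma = np.block([[np.eye(d // 2), B], [B.T, np.eye(d // 2)]])

eigs = np.sort(np.linalg.eigvalsh(Gamma))
expected = np.sort(np.concatenate([np.full(d // 2, 1 - 1 / np.sqrt(2)),
                                   np.full(d // 2, 1 + 1 / np.sqrt(2))]))
print(np.allclose(eigs, expected))            # eigenvalues are 1 +/- 1/sqrt(2)
```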

EDIT:

In your third comment, you're describing the Sylvester construction of Hadamard matrices of order $2^n$, right? If so, why not use a Hadamard matrix, and avoid questions like this about bounding the spectrum? If your concern is constructing Hadamard matrices of an order that isn't a power of 2, you should know there are other constructions besides the Sylvester construction. That said, the existence of Hadamard matrices of every order divisible by 4 is an open research problem (the Hadamard conjecture).

What you ask in your first comment sounds like a much more open-ended research question. Could you post a new question with some more details about the problem you're actually trying to solve? It would help to understand where you're coming from, and what you really want to do.

For instance, what restrictions do you face in constructing $B$? Or is $B$ simply given to you from somewhere else? If there are no restrictions, why not use a Hadamard matrix to construct $B$? Why the $\pm 1$ restriction on the entries (or unit size restriction in the complex case)? What restrictions, if any, are there on the basis that's generating the Gram matrix?

What size or range of sizes of $B$ are you interested in? Are you also interested in asymptotics? Are you using these for computing, or are you trying to answer an analytical question?

More generally, where is this problem coming from?

EDIT2: I've edited my answer above to correct an error, and to add an infinite family of examples whose eigenvalues are 0 or can be made as close to 0 as desired.

  • Thank you for your answer. Yes, if $\hat{B}$ is a Hadamard matrix we would have nice lower bounds. The question is: what is the most minimalist set of restrictions that can be put on $\hat{B}$? So far I have thought of this: for $d=2^{n}$, $n\geq 4$, if $\hat{B}$ is symmetric and each row (and consequently each column) has at least $\frac{d}{8}$ and at most $\frac{d}{4}$ negative entries, we can construct a $\frac{3}{4}d$ by $\frac{3}{4}d$ orthogonal (Hadamard) matrix of which $\hat{B}$ is a submatrix. – James Smithson Jul 05 '15 at 13:41
  • Due to the Cauchy interlacing theorem for symmetric matrices, we would therefore have $\sigma_{\max}(\hat{B})\leq\frac{3}{4}$. The point is that for $d=2^{n}$, $n\geq 4$, there always exists a $\frac{3}{4}d$ by $\frac{3}{4}d$ Hadamard matrix. But the question is whether $\hat{B}$, with the restriction on the number of negative entries, is indeed a submatrix of it. – James Smithson Jul 05 '15 at 13:42
  • This structure is the case when we have two orthonormal bases $V_{1}=\{|\nu_{1}\rangle,|\nu_{2}\rangle\}$ and $V_{2}=\{|\omega_{1}\rangle,|\omega_{2}\rangle\}$ such that: $$|\nu_{1}\rangle=\frac{1}{\sqrt{2}}(|\omega_{1}\rangle+|\omega_{2}\rangle)$$ $$|\nu_{2}\rangle=\frac{1}{\sqrt{2}}(|\omega_{1}\rangle-|\omega_{2}\rangle)$$ Now for $V_{1,2}^{\otimes n}$, $n\geq 4$, we have the $\hat{B}$ described in the previous comments. @Armadillo Jim – James Smithson Jul 05 '15 at 13:45
  • Can we now prove that $\hat{B}$ is a submatrix of the $\frac{3}{4}d$ Hadamard matrix? @Armadillo Jim – James Smithson Jul 05 '15 at 13:53
  • By the way, $\Gamma$ would be the Gram matrix assigned to the set of elements half of which are chosen from $V_{1}^{\otimes n}$ and the other half from $V_{2}^{\otimes n}$. – James Smithson Jul 05 '15 at 13:55
  • @JamesSmithson, wow, lots and lots of new questions! I did a little poking around, and I see you've posted a few related questions on SE. Perhaps we can meet offline and chat about the problem you're trying to solve? I've edited my answer to address some of your comments. In the meantime, would you be a dear and accept my answer to this question? I feel as though I've answered it as originally posed. – Armadillo Jim Jul 05 '15 at 23:53
  • Yes sure. Thank you for your answer. Sure, I can post the complete question. I'll accept your answer. We can discuss the question further after I post it if you're interested. – James Smithson Jul 06 '15 at 00:04
  • Please find the version of the question I promised here: http://math.stackexchange.com/questions/1350777/asymptotic-behavior-of-the-minimum-eigenvalue-of-a-certain-gram-matrix-with-line As always I appreciate any comments. @Armadillo Jim – James Smithson Jul 06 '15 at 00:55
  • Of course, the example you gave, as you mentioned, yields a Gram matrix with linearly dependent rows, whereas the question (as originally posed) was about one with linearly independent rows. The modification of the question is an attempt to get an answer, even if only for a specific case, that should be reasonably approachable. – James Smithson Jul 07 '15 at 01:10