Let $(x_i)_{1 \leq i \leq n}$ be vectors in $\mathbb{R}^d$. Assume that $\sum_{i}x_ix_i^T$ is invertible and let $A=\left(\sum_{i}x_ix_i^T\right)^{-1}$. Let $v$ be a non-zero vector in $\mathbb{R}^d$.
A lower bound on $v^TAv$ can be obtained by noticing that each rank-one matrix satisfies $x_ix_i^T \preceq (x_i^Tx_i)I_d$, so that $\sum_{i} x_ix_i^T \preceq \left(\sum_{i} x_i^Tx_i\right)I_d$. This yields $A \succeq \left(\sum_{i} x_i^Tx_i\right)^{-1}I_d$ and $$v^TAv \geq \left(\sum_{i} x_i^Tx_i\right)^{-1}v^Tv.$$ I would like to find an upper bound on $v^TAv$ of a similar shape (in particular, one that is quite explicit in the vectors $x_i$ and in $v$).
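For concreteness, here is a minimal numerical sanity check of this lower bound (the numpy setup, the sample sizes, and the names `X`, `S`, `v` are illustrative assumptions, not part of the question):

```python
import numpy as np

# Illustrative sanity check of the lower bound v^T A v >= (sum_i x_i^T x_i)^{-1} v^T v.
rng = np.random.default_rng(0)
n, d = 20, 5
X = rng.normal(size=(n, d))          # rows of X are the vectors x_i
S = X.T @ X                          # S = sum_i x_i x_i^T (invertible for generic random data)
A = np.linalg.inv(S)
v = rng.normal(size=d)

lhs = v @ A @ v                      # v^T A v
lower = (v @ v) / np.trace(S)        # trace(S) = sum_i x_i^T x_i
print(lhs, lower, lhs >= lower)
```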
I have noticed that, denoting by $\lambda_{\min}(M)$ (resp. $\lambda_{\max}(M)$) the smallest (resp. largest) eigenvalue of a symmetric matrix $M$, we have $$v^TAv \leq \lambda_{\max}(A)\, v^Tv = \frac{1}{\lambda_{\min}\left(\sum_{i}x_ix_i^T\right)}v^Tv.$$
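The same kind of hypothetical check (same illustrative setup as above) confirms this eigenvalue bound numerically:

```python
import numpy as np

# Illustrative check of the upper bound v^T A v <= v^T v / lambda_min(sum_i x_i x_i^T).
rng = np.random.default_rng(1)
n, d = 20, 5
X = rng.normal(size=(n, d))
S = X.T @ X
A = np.linalg.inv(S)
v = rng.normal(size=d)

lhs = v @ A @ v
upper = (v @ v) / np.linalg.eigvalsh(S).min()   # eigvalsh: eigenvalues of a symmetric matrix
print(lhs, upper, lhs <= upper)
```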
However, this is not very satisfactory to me because it is not explicit enough in terms of the vectors $x_i$. Is there perhaps a non-trivial lower bound on this smallest eigenvalue, expressed directly in terms of the $x_i$?