Consider the least squares objective
$f(x;A,b) = \|Ax-b\|_2^2$
and define $x^*$ as the minimizer of $f(x; A, b)$ and $\hat x$ as the minimizer of $f(x; \hat A, \hat b)$.
I want to put some bound on $\|x^* - \hat x\|$.
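For concreteness, assume $A$ and $\hat A$ have full column rank, so both minimizers are unique and given by the normal equations:
$$x^* = (A^TA)^{-1}A^Tb, \qquad \hat x = (\hat A^T\hat A)^{-1}\hat A^T\hat b.$$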
Looking through Golub & Van Loan, the bounds I find are essentially functions of $\epsilon = \max\{\|A-\hat A\|, \|b-\hat b\|\}$ alone, but in some sense that is not the best one can do. For example, if
$A =\left[\begin{matrix} 0 \\ C\end{matrix}\right], \hat A =\left[\begin{matrix} C\\ 0\end{matrix}\right], b =\left[\begin{matrix} 0 \\ d\end{matrix}\right], \hat b =\left[\begin{matrix} d\\ 0\end{matrix}\right]$
then (in the $2$-norm) $\epsilon = \sqrt{2}\,\max\{\|C\|, \|d\|\}$, which may be very large, but $\|x^*-\hat x\| = 0$.
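To spell out why $\|x^* - \hat x\| = 0$ here: both problems have the same normal equations, since
$$A^TA = C^TC = \hat A^T\hat A \qquad \text{and} \qquad A^Tb = C^Td = \hat A^T\hat b,$$
so $x^* = (C^TC)^{-1}C^Td = \hat x$ (assuming $C$ has full column rank). The same argument shows that any simultaneous row permutation of $(A, b)$ leaves the solution unchanged, even though $\epsilon$ can be arbitrarily large.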
Are there existing bounds that take this into account? Such a bound would have to involve the rows $a_i^T$ of $A$ together with the corresponding entries $b_i$ of $b$ (for $A = [a_1^T; \ldots]$ and $b = [b_1; \ldots]$), not just quantities determined by the range space of $A$.
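Here is the kind of crude bound I have in mind (my own sketch, not from a reference): the solution depends on the data only through $A^TA = \sum_i a_i a_i^T$ and $A^Tb = \sum_i b_i a_i$. Writing $M = A^TA$, $\hat M = \hat A^T\hat A$, $v = A^Tb$, $\hat v = \hat A^T\hat b$, the identity
$$x^* - \hat x = M^{-1}(v - \hat v) + M^{-1}(\hat M - M)\hat M^{-1}\hat v$$
gives
$$\|x^* - \hat x\| \le \|(A^TA)^{-1}\| \left( \|A^Tb - \hat A^T\hat b\| + \|A^TA - \hat A^T\hat A\| \, \|\hat x\| \right),$$
which correctly vanishes in the permutation example above. I am hoping for an established (and sharper) result along these lines.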
I suspect there is a relevant result in machine learning, since this is essentially a question about the stability of regression solutions when enough "important" samples are shared. Can anyone point me to known results?
Thank you!
I played around a bit with the SVD, which is somewhat illuminating; by bounding $b^TU - \hat b^T\hat U$ and similar terms you can get something (a sketch of the splitting is below). I just feel like there should be an established result somewhere!
– Y. S. Aug 29 '16 at 04:37
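The splitting behind that comment, for concreteness (assuming full column rank and thin SVDs $A = U\Sigma V^T$, $\hat A = \hat U\hat\Sigma\hat V^T$, so that $x^* = V\Sigma^{-1}U^Tb$):
$$x^* - \hat x = (V - \hat V)\Sigma^{-1}U^Tb + \hat V(\Sigma^{-1} - \hat\Sigma^{-1})U^Tb + \hat V\hat\Sigma^{-1}(U^Tb - \hat U^T\hat b),$$
hence
$$\|x^* - \hat x\| \le \|V - \hat V\|\,\|\Sigma^{-1}\|\,\|b\| + \|\Sigma^{-1} - \hat\Sigma^{-1}\|\,\|b\| + \|\hat\Sigma^{-1}\|\,\|U^Tb - \hat U^T\hat b\|.$$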