
The condition number of an invertible matrix $A$ is defined as

$$\kappa(A) := \|A^{-1}\| \, \|A\|,$$

where $\|\cdot\|$ is the spectral norm, i.e. the matrix norm induced by the Euclidean vector norm. If $A$ is symmetric, then

$$\kappa(A) = \frac{\max_i |\lambda_i(A)|}{\min_i |\lambda_i(A)|}.$$
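
As a sanity check rather than a proof, this is easy to confirm numerically; a minimal NumPy sketch with an arbitrary random symmetric matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = M + M.T                       # random symmetric test matrix

kappa = np.linalg.cond(A, 2)      # ||A||_2 * ||A^{-1}||_2 in the spectral norm
eig = np.linalg.eigvalsh(A)       # real eigenvalues of a symmetric matrix
ratio = np.abs(eig).max() / np.abs(eig).min()

print(kappa, ratio)               # the two values agree up to rounding
```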

Does anyone know where I can find a proof of this? In my numerical analysis lecture notes, one is given two symmetric positive definite matrices $A$ and $B$, and a proof there uses that

$$\kappa(B^{-1}A) = \frac{\lambda_{\max}(B^{-1}A)}{\lambda_{\min}(B^{-1}A)}.$$

Why is this true? $B^{-1}A$ is not symmetric in general. Did I miss something?
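
For what it's worth, a small NumPy experiment (random symmetric positive definite $A$ and $B$; sizes and seed are arbitrary) shows that the eigenvalues of $B^{-1}A$ are real and positive, but that the Euclidean-norm condition number need not equal the eigenvalue ratio, which is exactly what confuses me:

```python
import numpy as np

rng = np.random.default_rng(1)

def random_spd(n):
    # M M^T + n I is symmetric positive definite
    M = rng.standard_normal((n, n))
    return M @ M.T + n * np.eye(n)

A, B = random_spd(4), random_spd(4)
C = np.linalg.solve(B, A)                    # B^{-1} A, not symmetric in general

eig = np.linalg.eigvals(C)
print(np.allclose(eig.imag, 0), (eig.real > 0).all())   # True True

print(np.linalg.cond(C, 2), eig.real.max() / eig.real.min())
# Euclidean condition number vs. eigenvalue ratio: these differ in general
```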


1 Answer


Someone already posted an answer but then deleted it because he believed it was wrong. I didn't see why it shouldn't work, so what is wrong with the following proof?

For every diagonalizable matrix $A$ it holds that $\kappa(A) = \max_i |\lambda_i(A)| / \min_i |\lambda_i(A)|$. To see this, take a basis of normalised eigenvectors $v_1,\dots,v_n$ of $A$ corresponding to the eigenvalues $\lambda_1,\dots,\lambda_n$. Then, for $x=\sum_{i=1}^n \alpha_i v_i$,

$$\|Ax\|^2 = \langle Ax, Ax\rangle = \sum_{i,j=1}^n \lambda_i \overline{\lambda_j} \, \langle \alpha_i v_i, \alpha_j v_j\rangle \leq \max_i |\lambda_i|^2 \sum_{i,j=1}^n \langle \alpha_i v_i, \alpha_j v_j\rangle = \max_i |\lambda_i|^2 \, \|x\|^2,$$

and so $\|A\| \leq \max_i |\lambda_i|$.

For $|\lambda_k| = \max_i |\lambda_i|$ we get $\|A v_k\| = \|\lambda_k v_k\| = |\lambda_k|$, and therefore $\|A\| = \max_i |\lambda_i|$.

Since $A^{-1}$ has the eigenvalues $1/\lambda_1,\dots,1/\lambda_n$, we get $\|A^{-1}\| = \max_i (1/|\lambda_i|) = 1/\min_i |\lambda_i|$.
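
A minimal numerical sketch of these two norm identities, assuming a symmetric test matrix (so that an orthonormal eigenbasis exists and the mixed inner products above vanish):

```python
import numpy as np

rng = np.random.default_rng(2)
M = rng.standard_normal((4, 4))
A = M + M.T                                  # symmetric: orthonormal eigenbasis exists

eig = np.linalg.eigvalsh(A)
print(np.linalg.norm(A, 2), np.abs(eig).max())        # ||A||_2 == max_i |lambda_i|
print(np.linalg.norm(np.linalg.inv(A), 2), 1 / np.abs(eig).min())
# ||A^{-1}||_2 == 1 / min_i |lambda_i|
```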

It is shown in "Why is this matrix product diagonalizable?" that $B^{-1}A$ is diagonalizable. That's it, right?
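
As a hedged numerical check of that similarity argument: with $S = B^{1/2}$, the matrix $S(B^{-1}A)S^{-1} = B^{-1/2} A B^{-1/2}$ is symmetric, so $B^{-1}A$ is similar to a symmetric matrix and hence diagonalizable with real eigenvalues. A sketch with arbitrary random SPD matrices:

```python
import numpy as np

rng = np.random.default_rng(3)

def random_spd(n):
    M = rng.standard_normal((n, n))
    return M @ M.T + n * np.eye(n)           # symmetric positive definite

A, B = random_spd(4), random_spd(4)

w, V = np.linalg.eigh(B)
S = V @ np.diag(np.sqrt(w)) @ V.T            # B^{1/2} via the spectral theorem

C = np.linalg.solve(B, A)                    # B^{-1} A
T = S @ C @ np.linalg.inv(S)                 # = B^{-1/2} A B^{-1/2}, similar to C
print(np.allclose(T, T.T))                   # True: symmetric, so C is diagonalizable
```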

  • If $A$ is a diagonalizable matrix, $\kappa(A) = \lambda_{\max}(A)/\lambda_{\min}(A)$ is not necessarily true. For example: http://www.wolframalpha.com/input/?i=%28%7B%7B2%2C1%7D%2C%7B1%2C1%7D%7D%29 – Thomas Edison Jan 27 '16 at 11:23