
I was solving some physics problems with linear algebra and found this:

Denote the basis for the vector space as $\mathbf e_i$, $i=1,\ldots,n$. Consider a change of basis $\mathbf e_i\rightarrow \mathbf e'_i$ under which the components of the velocity change via $\mathbf v' = \Lambda \mathbf v$. The kinetic energy (a physically observable quantity) cannot change under a change of basis. Then if $A$ is the matrix representing the kinetic energy, we have $A'=S^TAS$, where $S=\Lambda^{-1}$. This type of transformation of a matrix is called a congruence transformation... Now, since $A$ defines a positive definite quadratic form, we can use it to define a scalar product $(\mathbf v,\mathbf w)\equiv A(\mathbf v,\mathbf w)=A_{ij}v_iw_j$. We can always find an orthonormal basis $\mathbf e_i$ for a vector space with a scalar product such that $A(\mathbf e_i,\mathbf e_j)=\delta_{ij}$, the Kronecker delta. In this basis the kinetic energy is a sum of squares.

This means that $A$ is now $I$ (as seen from the new basis). My question is: how does one find such a basis?

If $B$ is the matrix that defines the quadratic form of the potential energy, is there any relation between the generalized eigenvalues of $B$ with respect to $A$ (the vectors $v$ with $Bv=\lambda Av$) and this new basis?

That is, can we simultaneously diagonalise two quadratic forms in such a way that one of them is seen from the new basis as $I$, and the other one as a diagonal matrix whose entries are the generalized eigenvalues?

More compactly:

Let $A$ and $B$ be real symmetric matrices, with $A$ positive definite. Is there a change of basis that simultaneously takes $A$ to $I$ and $B$ to $\Lambda=\operatorname{diag}(\lambda_1,\ldots,\lambda_n)$, where $Bv_i=\lambda_iAv_i$? How does one find it? Are the generalized eigenvectors $v_i$ always orthogonal?

For instance, if $a,b>0$ and $\displaystyle A = \begin{bmatrix}a & 0 \\ 0 & b\end{bmatrix}$, how would one make this congruence transformation to get a basis in which $A=I$? Thanks for your help!

  • I updated my answer in response to your change of question. – Oscar Cunningham Mar 29 '15 at 14:37
  • What's key in this question, and what makes the tag (linear-algebra) somewhat awkward, is that in this instance you are not thinking of $A$ as representing a linear operator. Rather, you are thinking of $A$ as representing a quadratic form (and hence this belongs really to the domain of multilinear algebra). Then a fundamental theorem of Jacobi states that one can choose a basis in which the matrix corresponding to the quadratic form is diagonal with entries $\pm 1,0$, and a famous theorem of Sylvester states that the numbers of +1s and -1s are the same independent of the choice of basis. – user225318 Apr 02 '15 at 19:56

1 Answer


For instance, if $a,b>0$ and $\displaystyle A = \begin{bmatrix}a & 0 \\ 0 & b\end{bmatrix}$, how would one make this congruence transformation to get a basis in which $A=I$?

In this case you can take $S$ to be $\begin{bmatrix}\frac 1{\sqrt{a}} & 0 \\ 0 &\frac 1{\sqrt{b}}\end{bmatrix}$, or, in other words, take $\mathbf e'_1=\frac 1{\sqrt{a}}\mathbf e_1$ and $\mathbf e'_2=\frac 1{\sqrt{b}}\mathbf e_2$.
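As a quick numeric sanity check (with made-up values $a=2$, $b=3$), one can verify in NumPy that this $S$ really satisfies $S^TAS=I$:

```python
import numpy as np

# Hypothetical values a = 2, b = 3, just to check the claim S^T A S = I.
a, b = 2.0, 3.0
A = np.diag([a, b])
S = np.diag([1 / np.sqrt(a), 1 / np.sqrt(b)])

print(np.allclose(S.T @ A @ S, np.eye(2)))  # True
```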

In general the question we want to answer is: given a basis and an inner product, find a new basis that is orthonormal with respect to that inner product. This is done by the Gram–Schmidt process.
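A minimal NumPy sketch of Gram–Schmidt with respect to the inner product $\langle v,w\rangle = v^TAw$ (the function name and the example matrix are illustrative, not canonical): the columns of the result form the new basis, i.e. the matrix $S$ with $S^TAS=I$.

```python
import numpy as np

def gram_schmidt(A, basis):
    """Orthonormalise the columns of `basis` w.r.t. the inner product <v, w> = v^T A w."""
    ip = lambda v, w: v @ A @ w
    out = []
    for v in basis.T:
        w = v.astype(float).copy()
        # Subtract the components along the already-orthonormalised vectors.
        for u in out:
            w -= ip(u, w) * u
        # Normalise w.r.t. the A-inner product.
        out.append(w / np.sqrt(ip(w, w)))
    return np.column_stack(out)  # this matrix is S

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])  # a positive definite example
S = gram_schmidt(A, np.eye(2))
print(np.allclose(S.T @ A @ S, np.eye(2)))  # True
```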

EDIT: The question has been updated to also ask the following:

That is, can we simultaneously diagonalise two quadratic forms in such a way that one of them is seen from the new basis as $I$, and the other one as a diagonal matrix whose entries are the generalized eigenvalues?

The answer is yes. Suppose our real symmetric matrices are $A$ and $B$, with $A$ positive definite. As above, find a basis in which $A\mapsto I$; in this basis $B$ becomes some new symmetric matrix $B'$. If we now change basis again by an orthogonal matrix $O$ (one such that $O^T=O^{-1}$), the matrix for $A$ is still $I$ (because it becomes $O^TIO=O^{-1}O=I$). So we are done by the fact that any real symmetric matrix can be diagonalised by an orthogonal matrix: simply choose $O$ to diagonalise $B'$.
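The two steps of this argument can be sketched numerically with NumPy (the matrices $A$ and $B$ below are made-up examples; here the first congruence is obtained from a Cholesky factorisation rather than Gram–Schmidt, which is an equivalent way to send $A$ to $I$):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])   # symmetric positive definite
B = np.array([[1.0, 2.0],
              [2.0, -1.0]])  # symmetric

# Step 1: a congruence sending A to I, via Cholesky A = L L^T and S0 = (L^T)^{-1},
# so that S0^T A S0 = L^{-1} L L^T L^{-T} = I.
L = np.linalg.cholesky(A)
S0 = np.linalg.inv(L.T)
Bp = S0.T @ B @ S0            # B in the new basis, still symmetric

# Step 2: orthogonally diagonalise the symmetric matrix B' = O diag(lams) O^T.
lams, O = np.linalg.eigh(Bp)

# Combined change of basis: A -> I stays intact, B -> diag(lams).
S = S0 @ O

print(np.allclose(S.T @ A @ S, np.eye(2)))      # True: A is seen as I
print(np.allclose(S.T @ B @ S, np.diag(lams)))  # True: B is seen as diag(lambda_i)
```

Note that the columns $s_i$ of $S$ then satisfy $Bs_i=\lambda_iAs_i$, i.e. the $\lambda_i$ are exactly the generalized eigenvalues asked about in the question.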