
I'm trying to understand how vectors, differential forms, and multilinear maps in general transform under a change of coordinates. So I start with the simplest case: vectors. Here's my own attempt; please bear with me:

Suppose that $V$ is a vector space and $\alpha=\{v_1,\cdots,v_n\}$ and $\beta = \{w_1,\cdots,w_n\}$ are two ordered bases for $V$. $\alpha$ and $\beta$ give rise to the dual bases $\alpha^*=\{v^1,\cdots,v^n\}$ and $\beta^*=\{w^1,\cdots,w^n\}$ for $V^*$ respectively.

If $[T]_{\beta}^{\alpha}=[\lambda_{i}^{j}]$ is the matrix of the change of basis from $\alpha$ to $\beta$, i.e. the matrix expressing the new basis vectors in terms of the old:

$$\begin{bmatrix} w_1 \\ \vdots \\ w_n \end{bmatrix}= \begin{bmatrix} \lambda_1^1 & \lambda_1^2 & \dots &\lambda_1^n \\ \vdots & \vdots & \ddots & \vdots \\ \lambda_n^1 & \lambda_n^2 & \cdots & \lambda_n^n\end{bmatrix} \begin{bmatrix} v_1 \\ \vdots \\ v_n \end{bmatrix}$$

What is the matrix of the change of basis from $\alpha^*$ to $\beta^*$?

We can write $w^j \in \beta^*$ as a linear combination of basis elements in $\alpha^*$:

$$w^j=\mu_{1}^{j}v^1+\cdots+\mu_n^{j}v^n$$

We obtain the matrix representation $[S]_{\beta^*}^{\alpha^*}=[\mu_{i}^{j}]$ as follows:

$$\begin{bmatrix} w^1 & \cdots & w^n \end{bmatrix}= \begin{bmatrix} v^1 & \cdots & v^n \end{bmatrix}\begin{bmatrix} \mu_1^1 & \mu_1^2 & \dots &\mu_1^n \\ \vdots & \vdots & \ddots & \vdots \\ \mu_n^1 & \mu_n^2 & \cdots & \mu_n^n\end{bmatrix} $$

We know that $w_i = \lambda_{i}^1v_1+\cdots+\lambda_{i}^nv_n$. Evaluating the functional $w^j$ at $w_i \in V$ we get:

$$w^j(w_i)=\mu_{1}^{j}v^1(w_i)+\cdots+\mu_n^{j}v^n(w_i)=\delta_{i}^j$$ $$w^j(w_i)=\mu_{1}^{j}v^1(\lambda_{i}^1v_1+\cdots+\lambda_{i}^nv_n)+\cdots+\mu_n^{j}v^n(\lambda_{i}^1v_1+\cdots+\lambda_{i}^nv_n)=\delta_{i}^j$$ $$w^j(w_i)=\mu_{1}^{j}\lambda_{i}^1+\cdots+\mu_n^{j}\lambda_{i}^n=\sum_{k=1}^n\mu_{k}^j \lambda_{i}^k=\delta_{i}^j$$

But $\sum_{k=1}^n\mu_{k}^j \lambda_{i}^k$ is the $(i,j)$ entry of the matrix product $TS$. Therefore $TS=I_n$ and $S=T^{-1}$.
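
As a sanity check, here is a toy example (my own numbers) with $n=2$: take $w_1=v_1+v_2$ and $w_2=v_2$, so that

$$T=\begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix}.$$

Solving $w^j(w_i)=\delta_i^j$ directly gives $w^1=v^1$ and $w^2=-v^1+v^2$, hence

$$S=\begin{bmatrix} 1 & -1 \\ 0 & 1 \end{bmatrix},\qquad TS=\begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix}\begin{bmatrix} 1 & -1 \\ 0 & 1 \end{bmatrix}=I_2,$$

confirming $S=T^{-1}$.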

If we instead want to write the transformation from $\alpha^*$ to $\beta^*$ using column vectors rather than row vectors, and we call the matrix representing this transformation $U$, we observe that $U=S^{t}$ and therefore $U=(T^{-1})^t$.

Therefore if $T$ represents the transformation from $\alpha$ to $\beta$ by the equation $\mathbf{w}=T\mathbf{v}$ (where $\mathbf{v}$, $\mathbf{w}$ are the columns of basis vectors and $\mathbf{v^*}$, $\mathbf{w^*}$ the columns of dual basis vectors), then $\mathbf{w^*}=U\mathbf{v^*}$.
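
Continuing the toy example above:

$$U=S^t=\begin{bmatrix} 1 & 0 \\ -1 & 1 \end{bmatrix},\qquad \begin{bmatrix} w^1 \\ w^2 \end{bmatrix}=\begin{bmatrix} 1 & 0 \\ -1 & 1 \end{bmatrix}\begin{bmatrix} v^1 \\ v^2 \end{bmatrix},$$

which reproduces $w^1=v^1$ and $w^2=-v^1+v^2$.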

The important special case is $T=(T^{-1})^t$, which happens if and only if $T$ is an orthogonal matrix.
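
For instance, the rotation

$$T=\begin{bmatrix} \cos\theta & \sin\theta \\ -\sin\theta & \cos\theta \end{bmatrix}$$

satisfies $T^tT=I_2$, so $U=(T^{-1})^t=T$: for such a change of basis the dual basis transforms with exactly the same matrix as the basis, and the two transformation laws coincide.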

What I don't understand is why the transformation from $\alpha$ to $\beta$ is called a contravariant transformation while the transformation from $\alpha^*$ to $\beta^*$ is called a covariant transformation. Would you please elaborate on this important point? It's been driving me crazy for the last two days.

Thanks in advance.

math.n00b
  • There's a good discussion of this in [this Wikipedia article](http://en.wikipedia.org/wiki/Covariance_and_contravariance_of_vectors). – Jack Lee Nov 04 '14 at 20:10
  • You don't understand why the transformation between vector bases is given (arbitrary term) while the transformation between covector bases is given (a different arbitrary term)? Why do these (arbitrary terms) confuse you, and what does this confusion have to do with your elaborate post on deriving these transformation laws (which have arbitrary terms used to describe them)? – Muphrid Nov 04 '14 at 21:27
  • Here is an elementary but generalizable explanation: http://juanmarqz.wordpress.com/2012/02/16/change-of-basis-and-change-of-components/ – janmarqz Nov 05 '14 at 00:57
  • @Muphrid: I don't understand why one is called contra-variant while the other one is called co-variant... and what kind of representation is standard for writing down the transformations? Do we work with column vectors or row vectors for both, or not? – math.n00b Nov 07 '14 at 05:51
  • Check these elementary notes: https://www.math.ethz.ch/education/bachelor/lectures/fs2014/other/mla/ma.pdf – janmarqz Nov 14 '14 at 13:45
  • @math.n00b, did you check the notes from ETH? – janmarqz Dec 24 '14 at 15:05
  • @math.n00b: tell me if http://math.stackexchange.com/questions/1068862/covariant-and-contravariant-components-and-change-of-basis/1082980#1082980 helps you grasp it – janmarqz Dec 28 '14 at 02:06

2 Answers


Let us use upper indices for rows (row-vectors) and lower indices for columns (column-vectors).

Let $x$ be an arbitrary vector, and $[x]$ its coordinates in the standard basis.

Let $V=\left([v_1]\mid [v_2]\mid\ldots\mid [v_n]\right)$ be the matrix whose columns are the coordinate column vectors of the first basis, and similarly $W$ for the second basis $\{w_i\}$.

If the coordinates of a vector in the two bases are related by $$[x]_w=T[x]_v,$$ then, comparing the left-most and right-most sides of $V[x]_v=[x]=W[x]_w=W(T[x]_v)=(WT)[x]_v$, we have $$V=WT$$

Let $V^{\prime}=\left(\begin{smallmatrix} [v^1]\\ [v^2] \\ \vdots\\ [v^n]\end{smallmatrix}\right)$ be the matrix of stacked row-vectors of the first dual basis, and similarly $W^{\prime}$ for the second dual basis $\{w^i\}$.

Using the dual basis we can write $[x]_v=\left(\begin{smallmatrix} v^1(x)\\ v^2(x) \\ \vdots\\ v^n(x)\end{smallmatrix}\right)=V^{\prime}[x]$, and similarly for $[x]_w$.

From $TV^{\prime}[x]=T[x]_v=[x]_w=W^{\prime}[x]$, valid for every $x$, we have $$W^{\prime}=TV^{\prime}$$

Finally, for any dual vector (co-vector) $\alpha$ we can write $[\alpha]=[\alpha]_{w^{\prime}}W^{\prime}=[\alpha]_{w^{\prime}}TV^{\prime}$; since also $[\alpha]=[\alpha]_{v^{\prime}}V^{\prime}$ and $V^{\prime}$ is invertible, $$[\alpha]_{v^{\prime}}=[\alpha]_{w^{\prime}}T$$
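
For a concrete check (a toy example of mine, taking the first basis to be the standard one): let $[w_1]=\left(\begin{smallmatrix}1\\1\end{smallmatrix}\right)$ and $[w_2]=\left(\begin{smallmatrix}0\\1\end{smallmatrix}\right)$. Then

$$W=\begin{pmatrix} 1 & 0 \\ 1 & 1 \end{pmatrix},\qquad V=I_2=WT\implies T=W^{-1}=\begin{pmatrix} 1 & 0 \\ -1 & 1 \end{pmatrix},$$

and $V^{\prime}=I_2$ gives $W^{\prime}=TV^{\prime}=\left(\begin{smallmatrix} 1 & 0 \\ -1 & 1 \end{smallmatrix}\right)$, whose rows are indeed the dual basis $w^1=v^1$, $w^2=-v^1+v^2$. For $\alpha=v^2$ one finds $[\alpha]_{w^{\prime}}=(1,1)$ and $[\alpha]_{w^{\prime}}T=(0,1)=[\alpha]_{v^{\prime}}$, as claimed. Note that the $T$ here acts on coordinates, so in the notation of the question it is $(T^{-1})^t$, not the question's $T$.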

rych

If we have some vector $u$, then its components $(u^j)^n_{j=1}$ in the basis $\alpha$ can be found by applying the dual basis $\alpha^*$ to $u$. $$u^j=v^j(u)$$

So when we change from $\alpha$ to $\beta$ the components change by $$u^j=v^j(u)\longmapsto w^j(u)=\sum_i\mu_{i}^{j}v^i(u)=\sum_i\mu_{i}^{j}u^i$$ So the components of vectors change by left multiplication by the matrix with $(j,i)$ entry $\mu_{i}^{j}$, that is, by $U=S^t=(T^{-1})^t$ in the question's notation.

Similarly, a covector $f$ has components $f_i=f(v_i)$, which transform by $$f_i=f(v_i)\longmapsto f(w_i)=f\left(\sum_j\lambda_{i}^{j}v_j\right)=\sum_j\lambda_{i}^{j}f_j$$ So the components of covectors change by left multiplication by the matrix $T$ (with entries $\lambda_{i}^{j}$), the same matrix that expresses the new basis vectors in terms of the old.

Components that transform according to $T$ are changing in the same way as the basis vectors themselves; hence they are co-variant ("varying with" the basis). As computed above, the components of covectors are covariant.

Components that transform according to the inverse coefficients $\mu_{i}^{j}$, as the components of vectors do, are transforming in the opposite way to the basis and are therefore contra-variant ("varying against" the basis).
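
To illustrate with a toy example (my own numbers): take $w_1=v_1+v_2$, $w_2=v_2$, so $T=\left(\begin{smallmatrix}1&1\\0&1\end{smallmatrix}\right)$ and $w^1=v^1$, $w^2=-v^1+v^2$. For the vector $u=v_1$ the components change as $(1,0)\mapsto(w^1(u),w^2(u))=(1,-1)$: against the basis, i.e. contravariantly. For the covector $f=v^2$ the components change as $(0,1)\mapsto(f(w_1),f(w_2))=(1,1)$: with the same coefficients $\lambda_{i}^{j}$ as the basis, i.e. covariantly.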