
I am having trouble understanding a proof given in Dummit and Foote relating to the basis of $\bigwedge^k(V)$, the $k$th exterior power of a vector space. Let $V$ be a vector space over the field $F$ with basis $B=\{v_1,\dots,v_n\}$. Then the vectors $v_{i_1}\wedge v_{i_2}\wedge\cdots\wedge v_{i_k}$ for $1\leq i_1<i_2<\cdots<i_k\leq n$ form a basis of $\bigwedge^k(V)$.

In particular, I'm having trouble with the part of the proof that shows these vectors are linearly independent. The proof says that to show these vectors are linearly independent it suffices to exhibit an alternating $k$-multilinear function from $V^k$ to $F$ which is $1$ on a given $v_{i_1}\wedge v_{i_2}\wedge\cdots\wedge v_{i_k}$ and zero on all other generators.

Why would this show that the vectors are linearly independent? And what does "1 on a given $v_{i_1}\wedge v_{i_2}\wedge\cdots\wedge v_{i_k}$" mean when this function is defined on $V^k$?

ponchan
  • 2,838

2 Answers


Write $I=(1 \le i_1 <\cdots < i_k \le n)$ and $J=(1 \le j_1 <\cdots < j_k \le n)$ for strictly increasing multi-indices of length $k$, and write $v_J$ for the $k$-tuple $(v_{j_1},\dots,v_{j_k})$. Also, write $\alpha^I$ for $\alpha^{i_1} \wedge\cdots \wedge\alpha^{i_k},$ where $(\alpha^j)_{j=1}^n$ is the basis of $V^*$ dual to $(v_j)_{j=1}^n$, so that $\alpha^i(v_j)=\delta^i_j$.

Then it follows from the definition of the wedge product that

$$\alpha^I(v_J) = \det\big[\alpha^i(v_j)\big]_{i\in I,\,j\in J}.$$

If $I=J$, the determinant is equal to $1$ because in this case, $[\alpha^i(v_j)]_{i\in I, j\in J}$ is the identity matrix.
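For concreteness, here is a small case (my own illustration, not part of the original argument): take $n=3$, $k=2$, and $I=J=(1,2)$. Then
$$[\alpha^i(v_j)]_{i\in I,\,j\in J}=\begin{pmatrix}\alpha^1(v_1)&\alpha^1(v_2)\\ \alpha^2(v_1)&\alpha^2(v_2)\end{pmatrix}=\begin{pmatrix}1&0\\0&1\end{pmatrix},$$
so $\alpha^I(v_J)=\det I_2=1$.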

If $I \neq J,$ then let $1\le l\le k$ be the least integer such that $i_l\neq j_l$. Without loss of generality, assume $i_l<j_l$. Then $i_l$ cannot equal any of $j_1, \dots, j_{l-1}$, because these are equal to $i_1, \dots, i_{l-1}$, respectively, and $I$ is a *strictly* increasing sequence. Similarly, $i_l$ is also different from each of $j_l, j_{l+1}, \dots , j_{k}$, because $J$ is a strictly increasing sequence and $i_l<j_l$. So $\alpha^{i_l}(v_{j_m})=0$ for every $m$, i.e. the $l^{\text{th}}$ row of the matrix is zero, and hence the determinant is zero.
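Continuing the same small case: take $I=(1,2)$ and $J=(1,3)$, so $l=2$ (since $i_1=j_1=1$ but $i_2=2\neq 3=j_2$). Then
$$[\alpha^i(v_j)]_{i\in I,\,j\in J}=\begin{pmatrix}\alpha^1(v_1)&\alpha^1(v_3)\\ \alpha^2(v_1)&\alpha^2(v_3)\end{pmatrix}=\begin{pmatrix}1&0\\0&0\end{pmatrix},$$
whose second row is zero, so the determinant is $0$.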

The upshot of this is that $\alpha^I(v_J) =\delta_J^I$, and now it is easy to prove linear independence: if $\sum_I c_I\alpha^I = 0$ for some scalars $c_I\in F,$ where $I$ runs over all strictly increasing multi-indices of length $k$, then for each such $J$, $\sum_I c_I\alpha^I(v_J) = \sum_I c_I\delta_J^I =c_J=0.$ Note also that each $\alpha^I$ is an alternating $k$-multilinear function on $V^k$ that is $1$ at $v_I$ and $0$ at $v_J$ for every other $J$, so these are exactly the functions the proof in Dummit and Foote asks you to exhibit.
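To see the coefficient extraction concretely (again my own small illustration): with $n=3$ and $k=2$, the strictly increasing multi-indices are $(1,2)$, $(1,3)$, $(2,3)$, and evaluating a relation at the tuple $(v_1,v_3)$ picks out exactly the middle coefficient:
$$0=\big(c_{(1,2)}\alpha^{(1,2)}+c_{(1,3)}\alpha^{(1,3)}+c_{(2,3)}\alpha^{(2,3)}\big)(v_1,v_3)=c_{(1,3)}.$$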

Matematleta
  • 30,081

Here is the reason. Suppose you have a set $B= \{ \omega_i\in\bigwedge^k(V) : 1\le i\leq n \} $ and suppose the hypothesis holds, i.e. for each $\omega_i$ there is an alternating $k$-multilinear function that is $1$ on $\omega_i$ and $0$ on the other generators. If $B$ were linearly dependent, then there would exist $\lambda_1,\ldots,\lambda_n$, not all $0$, so that $$ \sum_{i=1}^{n}\lambda_i\omega_i=0. \tag{$*$}$$

Suppose WLOG that $\lambda_1\neq0$ (otherwise, just relabel). By hypothesis there is an alternating $k$-multilinear function $f:V^k\to F$ that is $1$ on the $k$-tuple $(v_{i_1},\dots,v_{i_k})\in V^k$ whose wedge is $\omega_1$, and $0$ on the tuples for the other generators; this is what "$1$ on a given generator" means. By the universal property of the exterior power, $f$ induces a linear map $\bar f:\bigwedge^k(V)\to F$ with $\bar f(\omega_1)=1$ and $\bar f(\omega_j)=0$ for all $j\geq 2$.

Then, applying $\bar f$ to $(*)$, you get $\lambda_1=0$. A contradiction!
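Spelled out (just making the linearity of $\bar f$ explicit), that last step is
$$0=\bar f(0)=\bar f\Big(\sum_{i=1}^{n}\lambda_i\omega_i\Big)=\sum_{i=1}^{n}\lambda_i\,\bar f(\omega_i)=\lambda_1\cdot 1=\lambda_1,$$
which contradicts $\lambda_1\neq 0$.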

miraunpajaro
  • 1,590