
In Linear Algebra Done Right, the book proves that every subspace of $V$ is part of a direct sum equal to $V$. I generally follow the proof, but do not understand some points.

Suppose $V$ is finite-dimensional and $U$ is a subspace of $V$. Then there is a subspace $W$ of $V$ such that $V = U \oplus W$.

Proof. Because $V$ is finite-dimensional, so is $U$ (proved in 2.26; I am OK). Thus there is a basis $u_1,...,u_m$ of $U$. Of course $u_1,...,u_m$ is a linearly independent list of vectors in $V$ (I am OK). Hence this list can be extended to a basis $u_1,...,u_m,w_1,...,w_n$ of $V$ (I am not OK. Does it mean $w_1,...,w_n$ can only be vectors that extend this list to a basis of $V$? If so, why does the statement say EVERY subspace of $V$ is part of a direct sum equal to $V$?). Let $W = \operatorname{span}(w_1,...,w_n)$.
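To make the extension step concrete (a small example of my own, not from the book): take $V = \mathbb{R}^3$ and $U = \operatorname{span}\{(1,0,0)\}$, so $u_1 = (1,0,0)$. Then $$u_1,\ (0,1,0),\ (0,0,1) \quad\text{and}\quad u_1,\ (1,1,0),\ (1,0,1)$$ are both bases of $\mathbb{R}^3$ extending $u_1$, so the $w$'s are not unique; any vectors that complete the list to a basis will do.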

To prove $V = U \oplus W$, we need only show that $$V = U+W \text{ and } U \cap W = \{0\}$$ (I am OK).

To prove the first equation above, suppose $v \in V$. Then, because the list $u_1,...,u_m, w_1,...,w_n$ spans $V$, there exist $a_1,...,a_m, b_1,...,b_n \in \mathbb{F}$ such that $$v=a_1u_1+...+a_mu_m+b_1w_1+...+b_nw_n.$$

In other words, we have $v=u+w$, where $u = a_1u_1+...+a_mu_m \in U$ and $w = b_1w_1+...+b_nw_n \in W$. Thus $v \in U + W$, completing the proof that $V = U + W$. (I am also OK.)

To show that $U \cap W = \{0\}$, suppose $v \in U \cap W$. Then there exist scalars $a_1,...,a_m,b_1,...,b_n \in \mathbb{F}$ such that

$$v=a_1u_1+...+a_mu_m = b_1w_1+...+b_nw_n.$$ Thus $$a_1u_1 + ...+a_mu_m - b_1w_1-...-b_nw_n = 0.$$

Because $u_1,...,u_m,w_1,...,w_n$ is linearly independent, this implies that $a_1=...=a_m=b_1=...=b_n = 0$. Thus $v=0$, completing the proof that $U \cap W = \{0\}$.
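As a sanity check with a tiny example of my own: take $U = \operatorname{span}\{(1,0)\}$ and $W = \operatorname{span}\{(1,1)\}$ in $\mathbb{R}^2$. If $v \in U \cap W$, then $$v = a(1,0) = b(1,1)$$ for some scalars $a, b$, so $a = b$ and $0 = b$, which forces $a = b = 0$ and hence $v = 0$, exactly as in the proof.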

Does the proof mean that for every subspace $U$ of a finite-dimensional $V$, we can find a $W$ such that $V = U \oplus W$?

JOHN
  • It means that every subspace has a complementary subspace. – Bernard Jan 01 '19 at 23:09
  • I'm not entirely sure what you are confused about but it seems like you are under the impression that $V$ only has one basis. Consider for example $\mathbb{R}^2$. We have the standard basis $\{(1,0),(0,1)\}$ but we can replace $(1,0)$ with $(1,1)$ and it is still a basis. In fact replacing $(1,0)$ with anything in the form $(a,b)$ where $a\neq 0$ will result in a basis. $\{(1,2),(1,3)\}$ is another example of a basis. – Fortox Jan 01 '19 at 23:13

2 Answers


You have a subspace $U$ with basis $u_1,...,u_m$.

Extend this basis to a basis of $V$ by adding vectors $w_1,...,w_n$. There is some freedom in choosing the $w_k$, but the full collection must be linearly independent and must span $V$.

Let $W=\operatorname{sp} \{w_k\}$.

Since the whole collection spans $V$, we must have $V = U +W$. If $u\in U, w\in W$ and $u+w = 0$, we must have $u=w=0$ since the whole collection is linearly independent.

Note that $W$ is not unique.
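For instance (a small example to make the non-uniqueness concrete): with $U = \operatorname{span}\{(1,0)\}$ in $\mathbb{R}^2$, both $W_1 = \operatorname{span}\{(0,1)\}$ and $W_2 = \operatorname{span}\{(1,1)\}$ satisfy $$\mathbb{R}^2 = U \oplus W_1 = U \oplus W_2,$$ so the same $U$ has many different complements.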

copper.hat
  • 178,207

The point is that any set of linearly independent vectors can be extended to a basis. Now the $u_1,...,u_m$ you took was a basis of $U$, and hence it is a set of linearly independent vectors in $V$. Therefore we can extend it to a basis. Suppose we extended it and got $u_1,...,u_m,w_1,...,w_n$; then $w_i \notin U$ for all $i$, because if there were a $j$ such that $w_j \in U$, then $u_1,...,u_m,w_j$ would form a linearly dependent set.

So, in this proof they used the extension-of-basis theorem to extend $u_1,...,u_m$ to a basis of $V$, say $u_1,...,u_m,w_1,...,w_n$, and defined $W=\operatorname{span}\{w_i : 1\leq i\leq n\}$; after that it is just routine verification.

And answering your final question: yes, this proof means that given any subspace $U$ of a finite-dimensional vector space $V$, there exists a subspace $W$ of $V$ such that $V = U \oplus W$.