9

Given a set $S$ and a field $F$ we can construct the $F$-free vector space over $S$ in the following way.

Consider the set of formal sums $$FS:=\left\{\sum_{s\in S} \alpha_s s\,:\, \alpha_s=0\ \text{except for finitely many}\ s \in S\right\}.$$ The structure of an $F$-vector space is given to $FS$ by using the addition and multiplication of $F$, i.e. $$\sum_{s \in S} \alpha_s s+\sum_{s \in S} \beta_s s := \sum_{s \in S}(\alpha_s+\beta_s) s,$$$$\alpha\left(\sum_{s \in S} \alpha_s s\right) :=\sum_{s\in S}(\alpha \alpha_s)s.$$
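To make the definition concrete, here is a minimal sketch in Python that models a formal sum as a finite-support coefficient dictionary (keys are elements of $S$, values are their nonzero coefficients in $F$). The names `add_sums` and `scale_sum` are illustrative, not from the text:

```python
# A formal sum over S with coefficients in F, modeled as a dict mapping
# elements of S to their nonzero coefficients (so the support is finite).

def add_sums(a, b):
    """Pointwise addition: (a + b)_s = alpha_s + beta_s."""
    result = dict(a)
    for s, beta in b.items():
        coeff = result.get(s, 0) + beta
        if coeff == 0:
            result.pop(s, None)   # drop zero coefficients: keeps the support finite
        else:
            result[s] = coeff
    return result

def scale_sum(alpha, a):
    """Scalar multiplication: (alpha * a)_s = alpha * alpha_s."""
    if alpha == 0:
        return {}
    return {s: alpha * c for s, c in a.items()}
```

Dropping zero coefficients keeps each formal sum in a canonical form, which matters for the equality question discussed below.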

$FS$ is called the free $F$-vector space over $S$.

The element of $FS$ for which $\alpha_s=1$ and $\alpha_r=0$ if $r\neq s$ is identified with $s$. This identification embeds $S$ in $FS$ and allows us to consider $S$ as a set of generators for $FS$.

In fact, by definition, every element of $FS$ can be written as a linear combination of elements of $S$. My question is the following: how can I prove that $S$ is a basis? I mean, how can I prove linear independence?

I think we have to add the following condition on $FS$: given $a=\sum_{s \in S} \alpha_s s, b=\sum_{s \in S} \beta_s s$ in $FS$ then $$a=b\,\text{iff}\, \alpha_s=\beta_s \, \text{for all}\, s \in S.$$ (in this way, linear independence is trivial).

Is this condition necessary or not to prove linear independence for $S$?

Thanks a lot in advance.

eleguitar
  • 516

3 Answers

10

You are correct that the condition that $a=b$ if and only if $\alpha_s = \beta_s$ for all $s \in S$ is required. This is wrapped up in the definition of a 'formal sum', so I don't think there's any reason why you'd need to spell it out explicitly.

If you wanted to be extremely precise, you could say that the underlying set of $FS$ is the set of functions $a : S \to \mathbb{R}$ such that $a(s) = 0$ for all but finitely many $s \in S$. Writing $\alpha_s = a(s)$ and $\beta_s = b(s)$, the condition that $a=b$ if and only if $\alpha_s=\beta_s$ for all $s \in S$ now follows from the definition of a function, and then you can simply identify the formal sum $a = \sum_{s \in S} \alpha_s s$ with the corresponding function $a : S \to \mathbb{R}$.

(If your vector spaces are over a field $k \ne \mathbb{R}$, that's still fine: just replace $\mathbb{R}$ by $k$ in the previous paragraph.)
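The identification of formal sums with finitely-supported coefficient functions can be illustrated concretely: if a formal sum is stored as a dict of nonzero coefficients (a hypothetical representation, not from the answer), then equality in $FS$ is exactly coefficient-wise equality, which Python's dict equality already implements:

```python
# Formal sums as finite-support coefficient dicts (zeros omitted).
# Equality of formal sums is equality as functions S -> F.
a = {"x": 1, "y": 2}   # x + 2y
b = {"y": 2, "x": 1}   # same coefficients listed in another order
c = {"x": 1}           # here the y-coefficient is 0, but it is 2 in a

assert a == b          # equal as functions, hence equal in FS
assert a != c          # coefficients differ at y
```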

  • Thanks a lot, man! Your answer was illuminating to me. Then, if I understood correctly, I can use the following definition: "Let $F$ be a field and $S$ a set. The free $F$-vector space over $S$ is the vector space over $F$ whose underlying set is $$FS=\left\{\sum_{s\in S}\alpha_s s\,:\, \alpha_s=0\ \text{for all but finitely many}\ s\in S\right\},$$ with the following condition: given $a=\sum_{s\in S}\alpha_s s$, $b=\sum_{s\in S}\beta_s s$ in $FS$, then $a=b$ iff $\alpha_s=\beta_s$ for all $s \in S$. The structure of an $F$-vector space is given to $FS$ by using the addition and multiplication of $F$, i.e. – eleguitar Sep 15 '18 at 09:31
  • $$\sum_{s\in S}\alpha_s s+\sum_{s\in S}\beta_s s:=\sum_{s\in S}(\alpha_s+\beta_s) s,$$$$\alpha \sum_{s\in S}\alpha_s s:=\sum_{s\in S}(\alpha\alpha_s) s.$$" – eleguitar Sep 15 '18 at 09:32
3

A basis of a vector space $V$ is a set $B \subseteq V$ such that for any vector space $W$ and any function $f : B \to W$ there exists a unique linear map $\tilde{f} : V \to W$ which extends $f$.

In your case, let $W$ be a vector space and $f : S \to W$ a function. If a linear map $A : FS \to W$ extends $f$ then by linearity we have

$$A\left(\sum_{s \in S} \alpha_s s\right) = \sum_{s \in S}\alpha_s A(s) = \sum_{s \in S} \alpha_s f(s)$$

Hence, the only candidate for $\tilde{f} : FS \to W$ is given by $\tilde{f}\left(\sum_{s \in S} \alpha_s s\right) = \sum_{s \in S} \alpha_sf(s)$. This map is indeed linear:

\begin{align} \tilde{f}\left(\lambda\sum_{s \in S} \alpha_s s + \mu\sum_{s \in S} \beta_s s\right) &= \tilde{f}\left(\sum_{s\in S}(\lambda\alpha_s +\mu\beta_s)s \right)\\ &= \sum_{s\in S}(\lambda\alpha_s +\mu\beta_s)f(s)\\ &= \lambda \sum_{s \in S} \alpha_s f(s) + \mu\sum_{s\in S}\beta_s f(s)\\ &= \lambda\tilde{f}\left(\sum_{s \in S} \alpha_s s\right) + \mu\tilde{f}\left(\sum_{s \in S} \beta_s s\right) \end{align}

Therefore $S$ is a basis for $FS$.
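This unique linear extension can be sketched in code. Assuming formal sums are stored as finite-support coefficient dicts, and modeling $W$ by plain numbers for concreteness, the induced map sends $\sum_s \alpha_s s$ to $\sum_s \alpha_s f(s)$ (the name `extend` is illustrative):

```python
# Sketch of the universal property: given f : S -> W, build the unique
# linear map FS -> W that agrees with f on the basis elements of S.

def extend(f):
    """Return f_tilde, the linear extension of f to formal sums."""
    def f_tilde(a):
        # a is a dict {s: alpha_s} with finite support
        return sum(alpha * f(s) for s, alpha in a.items())
    return f_tilde
```

For example, with $f(x)=1$, $f(y)=10$, the extension sends $2x+3y$ to $2\cdot 1 + 3\cdot 10 = 32$, and the empty formal sum (the zero vector) to $0$.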

mechanodroid
  • 47,570
0

For clarity, I will put square-brackets around the elements of $FS$.

Suppose we are given a finite linear combination of elements of $S$:

$$ \sum_{s \in S} c_s [s] $$

(meaning that all but finitely many $c_s$ are zero). Then by inductively applying the definition of the vector space operations, we can show

$$ \sum_{s \in S} c_s [s] = \left[ \sum_{s \in S} c_s s \right]$$

If we have a linear dependence

$$ \sum_{s \in S} c_s [s] = 0 $$

then we must have

$$ \left[\sum_{s \in S} c_s s \right] = \left[\sum_{s \in S} 0 s\right]$$ and thus $c_s = 0$ for all $s$.