3

For context, I am reading the section on linear dependence in P. R. Halmos's Finite-Dimensional Vector Spaces. Around the definition of linear dependence, the book gives a lengthy explanation of why the empty set is linearly independent.

Here's the definition provided in the text:

Definition. A finite set $\{x_i\}$ of vectors is linearly dependent if there exists a corresponding set $\{a_i\}$ of scalars, not all zero, such that $$\sum_i a_i x_i = 0.$$ If, on the other hand, $\sum_i a_i x_i = 0$ implies that $a_i = 0$ for each $i$, the set $\{x_i\}$ is linearly independent.

And the explanation for why the empty set is linearly independent, as I've understood it, is as follows: since there are no indices $i$ at all for an empty set, you cannot assign a non-zero scalar to any of them; thus the set is not linearly dependent.

But what I'm confused about is that the negation of "some scalars are non-zero" is "all scalars are zero". Then I can use the same argument to say that since there are no indices $i$ at all for an empty set, you cannot assign a zero scalar to all the vectors; thus the set is not linearly independent.

This stands out especially when the text, for the sake of intuition, rephrases the definition of linear independence as "If $\sum_i a_i x_i = 0$, then there is no index $i$ for which $a_i \neq 0$". Here, equivalently, we can say "If $\sum_i a_i x_i = 0$, then for all indices $i$, $a_i = 0$". I feel like this is just playing with words and does not address the problem.
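To make the two phrasings concrete, here is how Python's built-in quantifiers (`all` for "for all", `any` for "there exists") behave on an empty list of coefficients; this is just an illustrative sketch, not something from the book:

```python
coeffs = []  # the empty set of vectors has an empty set of coefficients

# "all scalars are zero" -- a universal statement: vacuously True
print(all(a == 0 for a in coeffs))  # True

# "some scalar is non-zero" -- an existential statement: False,
# since there is nothing in coeffs to serve as a witness
print(any(a != 0 for a in coeffs))  # False
```

The two statements are still exact negations of each other; over the empty set they simply land on opposite truth values.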

  • 2
    Perhaps a better wording is that the set is linearly dependent if there exists a non-trivial linear combination of the vectors in the set, whose sum is the zero vector. If the set is empty, then there does not exist any linear combination of the vectors, because there are no vectors. – user2661923 Sep 05 '22 at 23:51
  • 1
    I wonder if it might be more clear to use the following definition of linear independence. Let $V$ be a vector space over a field $F$ and let $S$ be a subset of $V$. To say that $S$ is linearly independent means that if $x_1, \ldots, x_n$ are distinct vectors in $S$ and $a_1, \ldots, a_n \in F$ and $\sum_{i=1}^n a_i x_i = 0$ then $a_i = 0$ for $i = 1, \ldots, n$. If $S$ is the empty set, then this definition is satisfied, because you can't find vectors $x_1, \ldots, x_n \in S$ which violate the condition (indeed, you can't find any vectors in $S$ at all). – littleO Sep 05 '22 at 23:52
  • 5
    It's always a little awkward trying to explain when something is vacuously true and I agree that they didn't really address it properly here. Basically because there are no linearly dependent vectors, the empty set is linearly independent. – CyclotomicField Sep 05 '22 at 23:56
  • 2
    For intuition: A set of vectors is linearly dependent iff there exists one that is a linear combination of the others. Surely in an empty set there doesn't exist such a vector. –  Sep 06 '22 at 00:08

3 Answers

3

Vacuously. The condition for linear independence is phrased as an "if-then" statement. In the case of the empty set, the "if" part is never met; that is, it's false. When the "if" part of an if-then statement is false, the whole statement is true.

Basic logic: $a\implies b$ is true whenever $a$ is false. (It's only false when $a$ is true and $b$ is false.)

Thus the empty set is "vacuously" linearly independent.
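The truth table behind this can be checked directly; a small sketch, not part of the original answer:

```python
# Material implication: "a implies b" is equivalent to (not a) or b
def implies(a: bool, b: bool) -> bool:
    return (not a) or b

print(implies(True,  True))   # True
print(implies(True,  False))  # False -- the only falsifying row
print(implies(False, True))   # True  -- vacuously true
print(implies(False, False))  # True  -- vacuously true
```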

  • However, I do not believe that the condition is false. the text also previously defined that "when there are no indices $ i $ to be summed over ... The value of such an 'empty sum' [ $ \sum_i x_i $ ] is defined, naturally enough, to be the vector $ 0 $ " – Combinatora Sep 06 '22 at 00:33
  • I guess this gets you into the foundations of logic and whatnot. You may end up giving Gödel a run for his money. Perhaps you would like to challenge the law of the excluded middle as well. Still, it's kind of hard to say that you have a linear combination equal to zero when there are no vectors. – suckling pig Sep 06 '22 at 00:55
  • @Combinatora, in an empty sum, all coefficients are zero. They are also all non-zero, but that is not important here. – Carsten S Sep 06 '22 at 08:12
2

Let's phrase things differently:

Let $V$ be a vector space over the field $F$, and $S$ a subset of $V$.

$S$ is linearly independent if, $\forall \{a_i\}_{i=1}^n \subseteq F$ and $\forall \{v_i\}_{i=1}^n \subseteq S$ (the $v_i$ distinct), we have

$$\sum_i a_i v_i = 0 \implies a_i = 0 \; \forall i$$

Consequently, the negation: $S$ is linearly dependent if $\exists \{a_i\}_{i=1}^n \subseteq F$ and $\exists \{v_i\}_{i=1}^n \subseteq S$ (each distinct) such that

$$\sum_i a_i v_i = 0 \text{ and } \exists i \text{ such that } a_i \ne 0$$

Notice what's going on here: to have linear dependence, we need to be able to find

  • a specific vector, or set thereof
  • corresponding scalar(s)

such that $\sum a_i v_i = 0$ and the $a_i$ are not all zero.

But there's a problem with that if $S$ is the empty set -- you can't find any vectors in there!

So you can't conclude linear dependence. Hence the result follows.
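The quantifier structure above can be mirrored with Python's `all`/`any`. Here `families` stands in for every admissible choice of distinct vectors from $S$ with corresponding scalars; for $S = \varnothing$ that collection is itself empty. A sketch under that modelling assumption:

```python
# For S = the empty set, no tuple of distinct vectors can be drawn from S,
# so the collection of candidate (scalar, vector) families is empty.
families = []

# Independence is a universal claim: vacuously True over no families.
independent = all(
    sum(a * v for a, v in fam) != 0 or all(a == 0 for a, _ in fam)
    for fam in families
)

# Dependence is the existential negation: False, since no witness exists.
dependent = any(
    sum(a * v for a, v in fam) == 0 and any(a != 0 for a, _ in fam)
    for fam in families
)

print(independent, dependent)  # True False
```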

PrincessEev
  • In your definition of linear independence for the set $S$, you need to assume that the $v_i$ are distinct vectors in $S$. Otherwise if $S$ is nonempty, I can take $v_1=v_2$ in $S$ and $1v_1+(-1)v_2=0$, so $S$ is not linearly independent. – blargoner Sep 06 '22 at 00:56
  • Your answer gave me some ideas, and I reviewed more about quantifiers and the empty set; I think I understand what you mean and have figured it out. Apparently, it doesn't matter what the statement says: as long as it has a universal quantifier over an empty set, it's true, and if it has an existential quantifier over an empty set, it's false. – Combinatora Sep 06 '22 at 01:47
  • Some links that I found helpful:
    1. https://math.stackexchange.com/questions/281735/quantification-over-the-empty-set
    2. https://www.youtube.com/watch?v=WLI1yzvK_5w
    – Combinatora Sep 06 '22 at 02:52
  • 1
    Yeah, essentially. "For all" can be taken even as meaning "for all zero elements"; "existence" strictly implies the existence of something in there. I like to think of vacuous logic in the sense of "you can't prove me wrong". If I made the assertion "all unicorns are blue", restricting our concerns to reality, my statement could be seen as true -- you can't find me a unicorn which *is not* blue. (The set of unicorns, in reality, is empty.) It's not the most rigorous analogy or explanation and doesn't mesh well with everyone though, so your mileage may vary. – PrincessEev Sep 06 '22 at 03:16
  • I am a bit confused by your reasoning. Here is the definition of linearly dependent vectors: let $h_1,\dots,h_m\in V$. Then $\{h_1,\dots,h_m\}$ is linearly dependent $\Leftrightarrow \exists \ell_1,\dots,\ell_m\in \mathbb{R} \ ((\exists i\in [m] \ \ell_i\neq 0) \ \land \ (\ell_1\cdot h_1+\dots+\ell_m\cdot h_m=0))$, which is correct I believe. I don't see any contradiction. I mean, $\varnothing$ is linearly dependent due to this definition. – RFZ Oct 31 '23 at 15:47
  • How so? You can't find an $i \in [0]$ to begin with, much less one with $a_i \ne 0$. – PrincessEev Oct 31 '23 at 15:57
1

Say that a "dependence relation" for a finite set of vectors is a linear combination of elements from the set that is equal to zero. As you say, the empty set of vectors satisfies a unique dependence relation, namely, the empty sum is equal to zero. In this dependence relation, there are no coefficients. Therefore, all the coefficients are equal to zero. In other words: in every dependence relation satisfied by the empty set, all coefficients are equal to 0. This is the definition of linear independence.
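The "no coefficients" point can be sanity-checked with Python's `all`/`any`; an illustrative sketch, where the value 17 is arbitrary:

```python
coeffs = []  # the empty dependence relation has no coefficients

# Every universal claim about the (zero) coefficients holds vacuously,
# and the two claims below do not contradict each other:
print(all(c == 0 for c in coeffs))   # True
print(all(c == 17 for c in coeffs))  # True

# What linear dependence actually needs is an existential witness,
# which the empty relation cannot supply:
print(any(c != 0 for c in coeffs))   # False
```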

JBL
  • But by your argument, since there are no coefficients, I can also set the coefficients equal to numbers other than zero, and that is just as correct as setting all the coefficients equal to zero. – Combinatora Sep 06 '22 at 02:00
  • @Combinatora Yes, it is true (but irrelevant) that all the coefficients (all 0 of them) are equal to 17. But linear dependence requires the existence of a set of coefficients not all of which are equal to 0; the fact that all 0 coefficients are equal to 17 has no bearing on the fact that all 0 coefficients are equal to 0. – JBL Sep 06 '22 at 11:18
  • In other words: when you have 3 things, the fact that one of them is equal to 17 means that not all of them are equal to 0. But when you have 0 things, the fact that all of them are equal to 17 doesn't preclude the fact that all of them are also equal to 0 (because there aren't any of them). – JBL Sep 06 '22 at 11:21
  • Oh, I think I now understand. You are trying to explain it more intuitively, right? What's really happening here is that we have a "for all" quantifier over an empty set, so whatever statement follows it must be true. Thus, we don't even care whether the coefficients are equal to zero or not. – Combinatora Sep 06 '22 at 23:34