7

My professor claims that there is no such thing as a vector without a basis, but I claim that there exist "raw" vectors.

For example, let's say you have a basis in $\mathbb{R}^2$:

$$\beta = \{\langle 1, 0 \rangle, \langle 0, 1 \rangle\}$$

Then the vectors in $\beta$, are they with respect to a basis, or are they "raw" vectors?

Perhaps my terminology is not correct, but I hope my meaning is clear, that vectors can exist without a basis, for example $m \in M_{2 \times 2}(\mathbb{F})$. I think you would write it as a coordinate vector if it were in terms of a basis, and as a matrix in "raw" format.

  • 17
    Absolutely, the definition of a vector doesn't rely on a basis – David Raveh Jun 09 '23 at 22:38
  • 7
    Vectors can exist regardless of whether a basis is specified, but how you represent them definitely can be dependent on the basis chosen. Kind of like how you know the number $10$ exists, but you can write it as $9.999\cdots$, or can write it in base $2$ as $1010_2$, or so on. – PrincessEev Jun 09 '23 at 22:40
  • 3
    A vector is an element of a vector space, with properties of closure under addition and scalar multiplication, for example. Considering (1, 0) as some object on its own does not look terribly useful – Paul Jun 09 '23 at 22:45
  • It is true that there is no such thing as a vector space without a basis. Could this be what your professor actually meant? – Dan Asimov Jun 10 '23 at 00:53
  • 9
    Basis is an attribute not of vector, but of vector space. Phrases like “a vector without basis” or “a vector with a basis” are meaningless. – user1551 Jun 10 '23 at 01:29
  • You may want to watch Edward Frenkel's take on it. – Rodrigo de Azevedo Jun 10 '23 at 06:58
  • 2
    A basis is made of vectors. So we would be in egg-and-chicken territory if a vector required a basis. – Stef Jun 10 '23 at 14:32
  • So when we present a basis, we do so in terms of the "vectors", without any notion of basis, correct? – user129393192 Jun 11 '23 at 18:25
  • @user129393192: Yes, that is correct. – Joe Jun 11 '23 at 21:37

8 Answers

4

In Euclidean (e.g. plane) geometry: consider pairs of points $(A,B)$ in a plane and define the pairs $(A,B)$ and $(C,D)$ to be equivalent if there exists a translation which maps $A$ to $C$ and $B$ to $D$. (A translation is a composition of two reflections across two parallel, not necessarily distinct, lines.) Being equivalent is an equivalence relation on all pairs of points, so we can consider equivalence classes. Label the equivalence class of the pair $(A,B)$ as $\vec{AB}$, and there you go: you've defined vectors in geometry without ever mentioning coordinates!

  • This feels like cheating, because coordinates are indirectly baked in when you write it as a pair of points – Clemens Bartholdy Jun 10 '23 at 09:41
  • 4
    @HopefulWhitepiller Okay, I should've said which geometry I am talking about. Say, Hilbert's axiomatization, or any other equivalent Euclidean-style axiomatization. I am not sure how coordinates would be baked in there. (You are probably aware that axiomatizations of geometry have been around since ancient Greek times, while the link to numbers via coordinates was only established by Descartes.) –  Jun 10 '23 at 10:15
3

You can certainly present vector spaces, and vectors in them, without any basis.

For instance, take a vector $\newcommand{\x}{\mathbf{x}}\newcommand{\y}{\mathbf{y}} \x \in \newcommand{\R}{\mathbb{R}}\R^n$, and take its orthogonal complement, $V_x := \{ \y \in \R^n \mid \x \cdot \y = 0 \}$. Now $V_x$ is a subspace of $\R^n$, so it’s a vector space, but I haven’t given you a basis for $V_x$, and finding one is non-trivial — but it’s also unnecessary for working in $V_x$ and reasoning about it.
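As an aside, here is a numerical sketch of this example (my own illustration, not part of the original answer; it assumes numpy is available): one common way to actually compute a basis of $V_x$ is to take the null space of the $1 \times n$ matrix whose single row is $x$, e.g. via the SVD.

```python
import numpy as np

# Hypothetical example: find an (orthonormal) basis for
# V_x = { y in R^4 : x . y = 0 }, the orthogonal complement of x.
x = np.array([1.0, 2.0, 3.0, 4.0])

# The null space of the 1 x n row matrix [x] is exactly V_x.
# In the SVD, the right-singular vectors beyond the rank span it.
_, _, Vt = np.linalg.svd(x.reshape(1, -1))
basis = Vt[1:]  # rows: n - 1 = 3 orthonormal vectors spanning V_x

print(basis.shape)                 # (3, 4)
print(np.allclose(basis @ x, 0))   # True: each row is orthogonal to x
```

Note that nothing about $V_x$ itself depended on this computation; the basis is an artifact of how we chose to represent the subspace.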

But some basis will always exist — this is an important theorem of classical algebra — and once you choose a basis, every vector can be expressed using it.

So you’re certainly right that you can give a vector space, and vectors in it, without reference to any basis, and you can do lots of work with vector spaces just fine without using a basis. Working this way is crucial for lots of applications in modern algebra, especially in differential geometry/topology. It’s good to learn how to work in a basis-free way.

On the other hand, I guess your professor may have meant “There is no vector space that doesn’t have a basis” — in which case they’re also right. Bases will always exist, even if you’re not using them or looking for them, and vectors will always have basis representations.

  • 2
    This answer is only true for finite-dimensional vector spaces. If the vector space has infinite dimension then whether it has a basis depends on the axiomatic foundation you choose for doing mathematics. – David A. Craven Jun 10 '23 at 16:08
  • 2
    @DavidA.Craven I work a lot in constructive maths myself, so I appreciate that subtlety and considered mentioning it — but I think it’s a bit of a side point, since taking the axiom of choice as given is pretty much the mainstream standard among mathematicians. – Peter LeFanu Lumsdaine Jun 10 '23 at 21:11
1

A vector $x$, by definition, is an element of a vector space $V$ (together with a definition of addition and multiplication by scalar), which is a set of elements that satisfy the basic properties that if $x$ and $y$ are elements, then so is $x+y$, as well as commutative and associative properties etc. The trivial vector space $\{0\}$ satisfies all of these properties. So does the real line.

A basis, once chosen, allows us to represent our vectors uniquely and neatly: if $x$ and $y$ have representations $(1,0)$ and $(0,1)$, then the vector $x+y$ has the representation $(1,1)$.

A basis is also useful for linear transformations. A linear transformation maps element $x$ in $V_1$ to element $y$ in $V_2$. But if we have bases for $V_1$ and $V_2$, then we may represent the linear transformation using a matrix, which describes the transformations of the basis elements. So a basis is a convenient tool that allows us to perform computation in a neat and simple fashion.
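The recipe in the last paragraph can be sketched in a few lines (a toy example of mine, using numpy and the standard bases): the matrix of a linear map has the images of the basis vectors as its columns.

```python
import numpy as np

# A toy linear map T : R^2 -> R^2, T(x, y) = (x + y, 2x).
def T(v):
    x, y = v
    return np.array([x + y, 2 * x])

# Build its matrix w.r.t. the standard bases: column j is T(e_j).
e1, e2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
M = np.column_stack([T(e1), T(e2)])

# Applying the matrix to coordinates agrees with applying the map.
v = np.array([3.0, 4.0])
print(M @ v)   # [7. 6.]
print(T(v))    # [7. 6.]
```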

David Raveh
  • 1,876
1

The notion of a vector arises when the set in question is itself a vector space. Each element of a vector space $V$ (on satisfaction of certain axioms) is called a vector. The set $\beta=\{(1,0),(0,1)\}$, being a subset of the vector space $\mathbb R^2$, serves as a basis of it, so that each vector $v\in\mathbb R^2$ can be expressed as a linear combination of the elements of the basis $\beta$.

Moreover, contrary to your claim, the vector $(1,0)$ too is expressible as $(1,0)={\color{red}1}(1,0)+{\color{red}0}(0,1)$, i.e. as a linear combination of the vectors in the basis $\beta$.

Nitin Uniyal
  • 8,108
1

In this answer, for the sake of simplicity I restrict my attention to finite-dimensional real vector spaces. However, all of the ideas generalise straightforwardly to a finite-dimensional vector space over an arbitrary field.

In linear algebra, the term "vector" is an informal term used to refer to an element of a given vector space. For instance, if our vector space is $\mathbb R^2$ together with the usual vector addition and scalar multiplication, then a "vector" is simply an ordered pair of real numbers $(a,b)$. These basic notions do not involve bases at all.

However, given an $n$-dimensional vector space $V$ (where $n\in\mathbb N$), it is possible to represent the elements of $V$ with $n$-tuples of real numbers. For example, consider the set $V$ of real polynomials taking the form $ax^2+bx+c$. There are natural notions of vector addition and scalar multiplication that allow us to regard $V$ as a vector space. Moreover, $V$ has the (ordered) basis $\{x^2,x,1\}$. With this basis, we can represent polynomial $ax^2+bx+c$ with the ordered triple $(a,b,c)$.

In the general case, if $\{v_1,\dots,v_n\}$ is an (ordered) basis of $V$, then every $v\in V$ can be written as $v=a_1v_1+\dots+a_nv_n$ where $a_1,\dots,a_n\in\mathbb R$. The coefficients $a_1,\dots,a_n$ are uniquely determined, meaning that there is a well-defined map $\varphi:V\to \mathbb R^n,v\mapsto (a_1,\dots, a_n)$ (note however that the coefficients $a_1,\dots,a_n$ are dependent on $v$, so we should write something like $a_{v,1},\dots,a_{v,n}$ if we were to use more pedantic notation). It is customary to call $(a_1,\dots,a_n)$ the coordinates of $v$ with respect to the basis $\{v_1,\dots,v_n\}$. For example, if $V$ is the set of polynomials as above, then the coordinates of $ax^2+bx+c$ with respect to the basis $\{2x^2,-x,5\}$ are $(a/2,-b,c/5)$. This example also serves to illustrate that, in general, there is not always a "natural" or "obvious" correspondence between a vector $v$ and the coordinates of $v$ with respect to a certain basis.

If $V=\mathbb R^n$, then we have the slightly confusing situation where the coordinates of a vector $v\in\mathbb R^n$ form another element of $\mathbb R^n$. This means that, in contrast to the general case, the vectors and the coordinates of vectors "live in the same space". For instance, if $\mathbb R^2$ is our vector space and $\{(1,2),(2,5)\}$ is our basis, then the coordinates of $(4,10)$ are $(0,2)$. However, if our basis is $\{(1,0),(0,1)\}$, then the coordinates of every vector are the vector itself. This is why the basis $\{(1,0),(0,1)\}$ is called the "standard", "canonical", or "natural" basis. A priori, it seems unclear why we would want to represent vectors in $\mathbb R^n$ in anything other than the standard basis. However, it turns out that in certain situations, changing the basis makes computations like matrix multiplication much easier.
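As a quick numerical check of the $\mathbb R^2$ example above (the code itself is my own sketch, assuming numpy), finding the coordinates of $v$ with respect to a basis amounts to solving a linear system whose coefficient matrix has the basis vectors as columns:

```python
import numpy as np

# Basis {(1, 2), (2, 5)} of R^2, stored as the columns of B.
B = np.array([[1.0, 2.0],
              [2.0, 5.0]])
v = np.array([4.0, 10.0])

# Coordinates a satisfy B @ a = v, i.e. a1*(1,2) + a2*(2,5) = v.
coords = np.linalg.solve(B, v)
print(coords)      # [0. 2.], matching the example in the text
print(B @ coords)  # [ 4. 10.], reconstructing v
```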

Joe
  • 22,603
  • Thanks for the very clear response. My only question: you mention ordered and unordered bases. When you say the standard basis $\{(0,1), (1,0)\}$, is this equivalent to the basis $\{(1,0),(0,1)\}$ (which I am used to calling the standard basis)? In class, we learned a basis is an ordered set, so would these be different bases? It seems that in practice, yes, because they create very different coordinate vectors (which I now understand are just other vectors in $\mathbb{R}^2$). – user129393192 Jun 11 '23 at 18:55
  • @user129393192: The standard basis is $\{(1,0),(0,1)\}$, and in context this should be considered different to $\{(0,1),(1,0)\}$; my answer contained a mistake that I have now edited out. In linear algebra, there tends to be lots of ambiguity about what the formal definition of a basis "is": is it an (unordered) set, an indexed family, a multiset, or a "list" of vectors? Different authors use different conventions. – Joe Jun 11 '23 at 19:56
  • The problem with simply saying that a basis is a set of vectors is that, as sets, $\{(1,0),(1,0),(0,1)\}=\{(1,0),(0,1)\}$, so $\{(1,0),(1,0),(0,1)\}$ ought to be a basis of $\mathbb R^2$, but few people want to say that. Note that saying "$\{(1,0),(0,1)\}$ is an ordered basis" is clearly an abuse of notation/language, since the notation $\{\dots\}$ ought to be reserved for (unordered) sets. However, one learns to tolerate such abuses. – Joe Jun 11 '23 at 19:56
  • I see. I prefer the pedantic way of things. How would you define a basis pedantically? Ordered, unordered? A set, not a set? – user129393192 Jun 11 '23 at 19:59
  • @user129393192: It's actually quite hard to do that in a way that conforms with how people like to think of "linearly independent" and "spanning" collections of vectors. Probably the best way is to use "multisets" – these are like sets in that they are unordered, but unlike sets in that duplicates are accounted for. This allows us to make statements such as "$\{(1,0),(1,0),(0,1)\}$ is linearly dependent", which is what we want. However, multisets do not see much usage in mathematics, largely for historical reasons. Now, by default, the term "basis" does not involve any ordering. – Joe Jun 11 '23 at 20:34
  • (At least, I think that's the most common convention.) If you want to speak of ordered bases, then you can define "linear independence" and "span" for functions $f:I\to V$, where $I$ is a set of the form $\{1,2,\dots,n\}$. Then, $\{(1,0),(0,1)\}$ actually refers to the function where $f(1)=(1,0)$ and $f(2)=(0,1)$, meaning that it is distinct from $\{(0,1),(1,0)\}$. It's pretty hard to write anything in linear algebra without at least a little abuse of notation! – Joe Jun 11 '23 at 20:34
  • I think you meant for your example that it would be $f(1) = (1, 0, ..., 0)$ correct? Unless I am misunderstanding. I've found this abuse of notation often and it makes me uncomfortable, because personally, I am learning more maths because I love the notion of being precise. – user129393192 Jun 11 '23 at 21:14
  • @user129393192: I meant to write $f(1)=(1,0)$, since I was writing specifically about the standard basis of $\mathbb R^2$. Of course, in the general case of $\mathbb R^n$, we have $f(1)=(1,0,\dots,0)$. I understand your concerns about abuse of notation; I too also found it difficult to stomach the imprecisions that mathematicians indulge in. – Joe Jun 11 '23 at 21:29
  • However, the alternative is much worse. Note that even saying something like "let $V$ be a vector space" is an abuse of language, for two reasons: strictly speaking, there is no such thing as a vector space – instead, for every field $K$, there is such a thing as a "vector space over $K$"; second, a vector space is a not a set $V$: it is a set together with maps $V\times V\to V$, $K\times V\to V$ that satisfy various axioms. So really, a vector space over $K$ is a triple $(V,+,\cdot)$, where $+:V\times V\to V$ and $\cdot:K\times V\to V$ satisfy various axioms. – Joe Jun 11 '23 at 21:29
  • But, now, there are even more abuses here: why are we using the notation $+$ when this already has a meaning in the context of addition of real numbers? Also, it is standard in set theory to define a triple $(a,b,c)$ as the ordered pair $(a,(b,c))$, and an ordered pair $(x,y)$ as the set $\{\{x\},\{x,y\}\}$. So maybe a vector space over $K$ is a set after all! As you can see, you will go insane if you don't learn to tolerate at least a little abuse of notation. – Joe Jun 11 '23 at 21:29
  • For the first thing you stated: you said you're mapping from $\{1, ..., n\}$ and defined $f(1)$ and $f(2)$ to map to ordered pairs, which was why I thought you had meant to map to $\mathbb{R}^n$, since the set you were mapping from was not just $\{1, 2\}$. – user129393192 Jun 11 '23 at 21:48
  • Also, thank you for your responses. Yours was what I was looking for, and I will likely end up accepting. I wonder if you might also have something to add here – user129393192 Jun 11 '23 at 21:50
  • @user129393192: For the first thing, what I meant was: for every $n\in\mathbb N$ and vector space $V$, we can define what it means for a function $f:\{1,\dots,n\}\to V$ to be "linearly independent" and what it means to be "spanning". Example: to say that $\{(1,0),(0,1)\}$ is a basis of $\mathbb R^2$ really means that the function $f:\{1,2\}\to \mathbb R^2$ given by $f(1)=(1,0)$ and $f(2)=(0,1)$ is linearly independent and spans $\mathbb R^2$. So, in my second sentence, I was giving a specific example of a function $f$ which can be said to be "linearly independent" and "spanning". – Joe Jun 11 '23 at 21:58
  • @user129393192: I'm glad I could help. Unfortunately, I don't think I can help you with your second question; it's been a while since I've studied inner product spaces. – Joe Jun 11 '23 at 22:00
  • Thanks again. I just accepted your answer. To reference your point about functions, I checked my linear algebra book (Linear Algebra Done Right), and it seems to define $\mathbb{R}^n$ as a map from the set $\{1, ..., n\}$ to the real numbers, which was why I was asking about your specific map, since I would've thought you would just map from $\{1, 2\}$ – user129393192 Jun 12 '23 at 07:19
  • @user129393192: That's not the only way to formally define $\mathbb R^n$, but it's a valid way. Anyway, I should have mentioned that the idea I had in mind was to define linear independence and span for indexed families of vectors. Although indexed families are technically the same thing as functions in set theory, in practice we think of them as sets where the elements have been "tagged" with an index. – Joe Jun 12 '23 at 16:11
  • For instance, we intuitively think of the family $(v_i)_{i\in\{1,\dots,n\}}$ as meaning the collection of vectors $v_1,v_2,v_3,\dots,v_n$, where we allow for the possibility that $v_i=v_j$ even if $i\neq j$. Each vector $v_i$ has an associated index $i$. This allows us to define ordered bases, since the order that the vectors are intended to be written in can be retrieved by looking at the ordering of the indices. Formally, the family which I have used as an example is simply the map $\{1,\dots, n\}\to V$, $i\mapsto v_i$. – Joe Jun 12 '23 at 16:11
0

Whenever you write down numbers to represent a vector, you are inherently using a basis; you should think of a basis as nothing more than a way of fixing units.

Let's take a simple example in $\mathbb{R}$. After fixing a point to call the origin on the real line, you next want to describe where a point $p$ sits relative to the origin. Colloquially, you might say $p$ is either

  1. 5 kilometers
  2. 5000 meters

from the origin. In the first statement is equivalent to picking the basis for $\mathbb{R}$ as the point which is 1 kilometer away from the origin; the second equivalent to picking the basis for $\mathbb{R}$ as the point which is 1 meter from the origin.

Of course, the point $p$ exists outside of the choice of units, so indeed it exists as a "raw vector". However, if you want to assign any numerical value to it, this is equivalent to choosing a basis.
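The units analogy can be made concrete with a toy calculation (my own sketch; the "basis vector" here is just the length of the chosen unit, measured in meters):

```python
# The same "raw" point p, 5 km from the origin.
p_in_meters = 5000.0

# Two choices of basis vector for R^1: the point 1 km away, or 1 m away.
basis_km = 1000.0
basis_m = 1.0

# The coordinate of p is how many copies of the basis vector reach p.
coord_km = p_in_meters / basis_km
coord_m = p_in_meters / basis_m

print(coord_km)  # 5.0    ("5 kilometers")
print(coord_m)   # 5000.0 ("5000 meters")
```

The point $p$ is unchanged throughout; only its numerical description depends on the choice of basis.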

Mr. Brown
  • 1,867
-1

Yes, but it doesn't matter

Every finite-dimensional vector space (which is what you encounter in basic LA courses) admits a basis. Hence thinking of vectors as defined through a finite spanning set, or thinking of them as their own thing, doesn't make much of a difference. Heck, I'd argue that the prof's viewpoint may even be more practical (albeit wrong), since some proofs are easier if you assume a basis.

Disregarding that: in the formal definition of a vector space, we have two sets and two binary operations. The two sets are the vectors and the scalars; the binary operations are how vectors combine with each other, and how scalars combine with vectors.

So the vectors in this definition are the elements of the set itself. For example, you could think of $\mathbb{R}^2$ as everything that can be spanned by the basis $\{ (0,1), (1,0) \}$, or as the set of pairs of real numbers.

-1

I agree. It is like how, in order to take a measurement, there needs to be a unit against which to compare the measurement.