Span Is a Subspace
proof, span, subspace, column space
1 Claim
The first durable theorem after the definition of span is:
once you allow all linear combinations, you automatically get a subspace.
Theorem 1. Let \(v_1,\dots,v_k \in \mathbb{R}^n\). Then
\[ \operatorname{span}\{v_1,\dots,v_k\} \]
is a subspace of \(\mathbb{R}^n\).
2 Why This Proof Matters
This proof is short, but it teaches a pattern that keeps returning:
- column space is a subspace because it is a span
- model classes in regression are subspaces because they are spans of feature columns
- later basis arguments start by showing a set is already closed under linear combinations
So this is really the first proof of closure under the operations we care about.
3 Dependencies
- the span of \(v_1,\dots,v_k\) is the set of all vectors of the form \(a_1 v_1 + \cdots + a_k v_k\)
- a subspace must contain \(0\) and be closed under addition and scalar multiplication
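The three subspace conditions above can be phrased as executable checks. This is a minimal sketch, not a proof: it only tests membership of particular sample elements, and the candidate set, the `contains` predicate, and all helper names are invented for illustration (here the set is \(\operatorname{span}\{e_1\}\) in \(\mathbb{R}^2\), the vectors whose second coordinate is zero).

```python
def contains(x):
    """Membership test for span{(1, 0)} in R^2: second coordinate must be zero."""
    return x[1] == 0.0

def zero_in(contains, n):
    # Condition 1: the zero vector of R^n belongs to the set.
    return contains([0.0] * n)

def closed_under_add(contains, u, w):
    # Condition 2: the componentwise sum of two members stays in the set.
    return contains([ui + wi for ui, wi in zip(u, w)])

def closed_under_scale(contains, c, u):
    # Condition 3: scaling a member by any real c stays in the set.
    return contains([c * ui for ui in u])

assert zero_in(contains, 2)
assert closed_under_add(contains, [1.0, 0.0], [-2.0, 0.0])
assert closed_under_scale(contains, 7.0, [3.0, 0.0])
```

Passing these spot checks does not establish closure in general; the proof below does that by arguing with generic coefficients.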
4 Strategy Before Details
Use the definition of span directly.
Take two generic vectors already in the span, write each one with coefficients, and then show their sum and any scalar multiple still have the same form.
5 Full Proof
Proof. Let
\[ W = \operatorname{span}\{v_1,\dots,v_k\}. \]
We check the three subspace conditions.
First, \(0 \in W\) because
\[ 0 = 0v_1 + \cdots + 0v_k. \]
Now take \(u,w \in W\). By the definition of span, there exist scalars \(a_1,\dots,a_k\) and \(b_1,\dots,b_k\) such that
\[ u = a_1 v_1 + \cdots + a_k v_k, \qquad w = b_1 v_1 + \cdots + b_k v_k. \]
Then
\[ u+w = (a_1+b_1)v_1 + \cdots + (a_k+b_k)v_k, \]
which is again a linear combination of \(v_1,\dots,v_k\). So \(u+w \in W\).
Finally, let \(c \in \mathbb{R}\) and let
\[ u = a_1 v_1 + \cdots + a_k v_k \in W. \]
Then
\[ cu = (ca_1)v_1 + \cdots + (ca_k)v_k, \]
which is also a linear combination of \(v_1,\dots,v_k\). Hence \(cu \in W\).
So \(W\) contains \(0\) and is closed under addition and scalar multiplication. Therefore \(W\) is a subspace of \(\mathbb{R}^n\).
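The coefficient bookkeeping in the proof can be sanity-checked numerically. This is a hedged sketch with arbitrary example vectors \(v_1, v_2 \in \mathbb{R}^3\) and made-up coefficient lists; it illustrates, on one instance, that sums and scalar multiples of span elements are again linear combinations with the predicted coefficients.

```python
def lin_comb(coeffs, vecs):
    """Return sum_i coeffs[i] * vecs[i], computed componentwise."""
    n = len(vecs[0])
    return [sum(c * v[j] for c, v in zip(coeffs, vecs)) for j in range(n)]

v1, v2 = [1.0, 0.0, 2.0], [0.0, 1.0, -1.0]
a, b = [2.0, 3.0], [-1.0, 4.0]

u = lin_comb(a, [v1, v2])  # u = a1*v1 + a2*v2
w = lin_comb(b, [v1, v2])  # w = b1*v1 + b2*v2

# Closure under addition: the coefficients simply add, (a_i + b_i).
sum_coeffs = [ai + bi for ai, bi in zip(a, b)]
assert lin_comb(sum_coeffs, [v1, v2]) == [ui + wi for ui, wi in zip(u, w)]

# Closure under scaling: the scalar is absorbed into each coefficient, (c * a_i).
c = 5.0
scaled_coeffs = [c * ai for ai in a]
assert lin_comb(scaled_coeffs, [v1, v2]) == [c * ui for ui in u]
```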
6 Step Annotations
- The zero vector appears because span allows all scalar choices, including zero coefficients.
- The sum of two linear combinations is still a linear combination because the coefficients simply add.
- Scalar multiplication stays inside the span because the scalar can be absorbed into each coefficient.
7 Why The Assumptions Matter
- The theorem is stated in \(\mathbb{R}^n\), but the same proof works in any vector space.
- The span uses *all* linear combinations. If you restricted the coefficients, for example to be nonnegative or to sum to \(1\), you would usually get a cone or an affine set instead of a subspace.
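To make the restricted-coefficients point concrete, here is a hedged illustration (the set and the predicate name are invented for the example): allowing only nonnegative coefficients of a single vector \(v_1 = (1, 0)\) gives a ray, a cone that is closed under addition but not under scaling by \(-1\), so it fails the subspace test.

```python
def in_nonneg_span_of_v1(x):
    # Membership in {a * (1, 0) : a >= 0}: second coordinate zero, first nonnegative.
    return x[1] == 0.0 and x[0] >= 0.0

u = [3.0, 0.0]  # u = 3 * v1, with coefficient 3 >= 0

# The cone contains u and is closed under addition with itself ...
assert in_nonneg_span_of_v1(u)
assert in_nonneg_span_of_v1([u[0] + u[0], u[1] + u[1]])

# ... but scaling by -1 leaves the cone, so it is not a subspace.
assert not in_nonneg_span_of_v1([-1.0 * u[0], -1.0 * u[1]])
```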
8 Common Failure Modes
- proving closure under addition but forgetting scalar multiplication
- saying “it is obvious” without writing the generic coefficient form
- confusing span with convex combinations
9 Reusable Pattern
When a set is defined by all linear combinations of some generators, the first proof move is almost always:
- write two generic elements with coefficients
- add them or scale them
- show the result is still the same kind of expression
10 Where This Shows Up Again
- Subspaces, Basis, and Dimension
- Orthogonality and Least Squares
- any proof that column space, null space, or solution sets of homogeneous systems are subspaces
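The column-space case reduces directly to this theorem: every product \(Ax\) is a linear combination of the columns of \(A\), so the column space is a span. A small sketch (the matrix and helper functions are made up for illustration):

```python
# A 3x2 matrix stored as a list of rows.
A = [[1.0, 0.0],
     [2.0, 1.0],
     [0.0, 3.0]]

def matvec(A, x):
    """Compute A @ x componentwise."""
    return [sum(row[j] * x[j] for j in range(len(x))) for row in A]

def columns(A):
    """Extract the columns of A as vectors."""
    return [[row[j] for row in A] for j in range(len(A[0]))]

x = [2.0, -1.0]
c0, c1 = columns(A)

# A @ x equals x[0]*c0 + x[1]*c1, a linear combination of the columns,
# so the column space is span{c0, c1} and hence a subspace by Theorem 1.
combo = [x[0] * a + x[1] * b for a, b in zip(c0, c1)]
assert matvec(A, x) == combo
```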
11 Exercises
- Adapt the proof to show that the column space of a matrix is a subspace.
- Give an example of a set built from vectors that is closed under addition but not scalar multiplication.
- Explain why affine combinations do not usually form a subspace.
12 Sources and Further Reading
- MIT 18.06SC: Basis and Dimension - First pass: official source for span, basis, and dimension as one storyline. Checked 2026-04-24.
- Hefferon, Linear Algebra - Second pass: stronger exercise depth for span and subspace arguments. Checked 2026-04-24.
- Deep learning, transformers and graph neural networks: a linear algebra perspective - Paper bridge: modern reminder that many learned representation spaces are still organized around spans and subspaces. Checked 2026-04-24.