Linear Algebra
linear algebra, least squares, spectral methods
1 Why This Module Matters
Linear algebra is one of the most important modules on the whole site. Modern ML, optimization, signal processing, control, and statistics all keep returning to vectors, operators, geometry, and spectra.
2 First Pass Through This Module
- Vectors and Linear Combinations
- Matrices and Linear Maps
- Subspaces, Basis, and Dimension
- Orthogonality and Least Squares
- Eigenvalues and Diagonalization
- SVD and Low-Rank Approximation
On a first pass, stay on the concept pages. You should not need proof, application, lab, paper-lab, research, or source-guide pages just to understand the main story of the module.
4 Go Deeper By Topic
4.1 Vectors and Linear Combinations
Start with Vectors and Linear Combinations.
If you want more after the main page:
- Proof: Span Is a Subspace
- Application: Vector Mixtures in Embeddings and Attention
- Visual intuition: Computation Lab: Linear Combinations and Span Geometry
- Practice: Exercises: Vectors and Linear Combinations
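For a quick, concrete taste before the lab, here is a minimal NumPy sketch of span membership. The vectors and values are made up for illustration and are not taken from the lab pages.

```python
import numpy as np

# Two spanning vectors in R^3 and a candidate target (toy values).
u = np.array([1.0, 0.0, 2.0])
v = np.array([0.0, 1.0, -1.0])
target = np.array([3.0, 2.0, 4.0])   # equals 3u + 2v

A = np.column_stack([u, v])           # columns are the spanning vectors
coeffs, _, _, _ = np.linalg.lstsq(A, target, rcond=None)

# target lies in span{u, v} exactly when it is reproduced with zero residual.
print(coeffs)                           # approximately [3., 2.]
print(np.allclose(A @ coeffs, target))  # True: target is a linear combination
```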
4.2 Matrices and Linear Maps
Start with Matrices and Linear Maps.
If you want more after the main page:
- Proof: Basis Images Determine a Linear Map
- Application: Learned Linear Projections in Transformers
- Visual intuition: Computation Lab: Matrix Composition and Basis Action
- Practice: Exercises: Matrices and Linear Maps
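As a small illustration of the operator view (assuming NumPy; the matrices below are toy examples, not taken from the lab), a linear map is pinned down by what it does to the standard basis, and composition of maps is matrix multiplication.

```python
import numpy as np

theta = np.pi / 2
rotate = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])   # 90-degree rotation
scale = np.diag([2.0, 0.5])                            # axis-aligned scaling

# Columns of a matrix are the images of the standard basis vectors.
e1, e2 = np.eye(2)
print(np.allclose(rotate @ e1, rotate[:, 0]))   # True

# Applying scale, then rotate, equals applying the product matrix once.
x = np.array([1.0, 1.0])
print(np.allclose(rotate @ (scale @ x), (rotate @ scale) @ x))  # True
```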
4.3 Subspaces, Basis, and Dimension
Start with Subspaces, Basis, and Dimension.
If you want more after the main page:
- Proof: Exchange Argument and Why Dimension Is Well Defined
- Application: Low-Dimensional Subspace Models
- Visual intuition: Computation Lab: Basis and Column Space Geometry
- Practice: Exercises: Subspaces, Basis, and Dimension
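A toy NumPy check of the main point, with made-up vectors: the dimension of the column space is the matrix rank, no matter how many columns you list.

```python
import numpy as np

a = np.array([1.0, 0.0, 1.0])
b = np.array([0.0, 1.0, 1.0])
c = a + b                                                  # dependent on a and b

print(np.linalg.matrix_rank(np.column_stack([a, b, c])))   # 2
print(np.linalg.matrix_rank(np.column_stack([a, b])))      # still 2
```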
4.4 Orthogonality and Least Squares
Start with Orthogonality and Least Squares.
If you want more after the main page:
- Proof: Projection Theorem and Normal Equations
- Application: Linear Regression Through Projection
- Visual intuition: Computation Lab: Projection Geometry and Regression Residuals
- Practice: Exercises: Orthogonality and Least Squares
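Here is a minimal sketch of the projection story, assuming only NumPy and a synthetic regression problem: solve the normal equations and confirm the residual is orthogonal to the column space of the design matrix.

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.column_stack([np.ones(20), rng.normal(size=20)])   # intercept + one feature
y = 1.0 + 2.0 * X[:, 1] + 0.1 * rng.normal(size=20)

# Normal equations: X^T X beta = X^T y.
beta = np.linalg.solve(X.T @ X, X.T @ y)
residual = y - X @ beta

print(beta)                               # close to [1.0, 2.0]
print(np.allclose(X.T @ residual, 0.0))   # the residual is orthogonal to col(X)
```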
4.5 Eigenvalues and Diagonalization
Start with Eigenvalues and Diagonalization.
If you want more after the main page:
- Proof: Distinct Eigenvalues Give Independent Eigenvectors
- Application: Spectral Modes in Consensus and Graphs
- Visual intuition: Computation Lab: Matrix Powers and Spectral Modes
- Practice: Exercises: Eigenvalues and Diagonalization
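A small NumPy sketch of why diagonalization matters, using a toy 2x2 matrix: once A = V D V^{-1}, matrix powers become powers of eigenvalues.

```python
import numpy as np

A = np.array([[0.9, 0.2],
              [0.1, 0.8]])

eigvals, V = np.linalg.eig(A)        # columns of V are eigenvectors
D = np.diag(eigvals)

print(np.allclose(A, V @ D @ np.linalg.inv(V)))   # A = V D V^{-1}

# A^10 two ways: repeated multiplication vs. eigenvalue powers.
A10_direct = np.linalg.matrix_power(A, 10)
A10_spectral = V @ np.diag(eigvals**10) @ np.linalg.inv(V)
print(np.allclose(A10_direct, A10_spectral))      # True
```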
4.6 SVD and Low-Rank Approximation
Start with SVD and Low-Rank Approximation.
If you want more after the main page:
- Proof: Eckart-Young and Truncated SVD
- Application: PCA Through SVD
- Visual intuition: Computation Lab: Rank-1 Approximation and PCA Geometry
- Practice: Exercises: SVD and Low-Rank Approximation
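A minimal sketch of truncated SVD on a random matrix, assuming only NumPy: keeping the top singular values gives the best low-rank approximation in the Frobenius norm, and the error is exactly the energy of the discarded singular values.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(6, 4))

U, s, Vt = np.linalg.svd(A, full_matrices=False)

k = 1
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]     # best rank-1 approximation

print(np.linalg.matrix_rank(A_k))                          # 1
err = np.linalg.norm(A - A_k, 'fro')
print(np.isclose(err, np.sqrt(np.sum(s[k:]**2))))          # True (Eckart-Young)
```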
5 Research Bridge
After the first pass, use the Optional Paper Bridge and Go Deeper sections inside each concept page to open the relevant:
- paper lab
- research direction
- source guide
That keeps the module landing page clean while still making every topic package discoverable from its own home page.
6 What You Want By The End
- geometric intuition
- symbolic fluency
- operator viewpoint
- ability to recognize where a paper is really using spectral structure
7 Application Door
If you want a first application target, start with least squares -> SVD -> PCA -> low-rank approximation. That chain alone will pay off again and again.
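To make that chain concrete, here is a minimal sketch assuming only NumPy and synthetic data (the variable names and values are illustrative, not from any lab): least squares via the SVD pseudoinverse, PCA as the SVD of centered data, then a low-rank reconstruction.

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(100, 5))
y = X @ np.array([1.0, -2.0, 0.5, 0.0, 3.0]) + 0.05 * rng.normal(size=100)

# 1. Least squares through the SVD pseudoinverse.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
beta = Vt.T @ np.diag(1.0 / s) @ U.T @ y
print(np.allclose(beta, np.linalg.lstsq(X, y, rcond=None)[0]))   # True

# 2. PCA: principal directions are right singular vectors of centered data.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T             # project onto the top two components

# 3. Low-rank reconstruction from those two components.
X_approx = X.mean(axis=0) + scores @ Vt[:2]
print(np.linalg.norm(Xc, 'fro'), np.linalg.norm(X - X_approx, 'fro'))
```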
8 Sources and Further Reading
- MIT 18.06SC Linear Algebra resource index - First pass: strong official module spine. Checked 2026-04-24.
- Stanford Math 51 - First pass: current applied framing for the linear algebra side of the module. Checked 2026-04-24.
- Hefferon, Linear Algebra - Second pass: proof-and-exercise depth for self-study. Checked 2026-04-24.