Math for Research

Library

What The Library Is For

The library is a support layer.

It helps readers answer three questions:

  • which resource is good for a first pass
  • which one is better for a second pass
  • which one is best used while reading papers

The site itself stays topic-first. The library helps readers choose references once they know what they are trying to learn.

Library Guides

Books

Best when you still need the big picture and a slower first or second pass.

Open Books

Courses

Best when you want pacing, lecture order, and worked examples from a coherent sequence.

Open Courses

Notes

Best when you already know the topic name and want a compact technical refresher.

Open Library Notes

Papers

Best when you want the highest-resolution current form of a result, method, or theorem family.

Open Papers

A Good First Route

If you want the shortest route through this section, use this order:

  1. Books
  2. Courses
  3. Notes
  4. Papers

This order usually works because:

  • books help when you still need the big picture
  • courses help when you need pacing and worked examples
  • notes help when you already know the topic and want a compact refresher
  • papers help when you want the most current or highest-resolution version of a claim

One Important Distinction

The Notes page under Library is a guide to external reference notes.

The top-level Notes section is where the site's own essays and reflective commentary live.
