
Advanced -> Gradient Descent

A theory-breadth introduction to smooth optimization, taught first through fixed-step batch gradient descent on one-feature linear regression with squared loss.

  • Topic slug: advanced/gradient-descent
  • Tutorial page: Open tutorial
  • Ladder page: Open ladder
  • Repo problems currently tagged here: 1
  • Repo companion pages: 4
  • Curated external problems: 2

Microtopics

  • gradient-descent
  • batch-gradient
  • squared-loss
  • linear-regression
  • learning-rate
  • epochs
  • smooth-optimization
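
The microtopics above all meet in one update rule: for a single affine model y_hat = w*x + b with mean squared loss, batch gradient descent takes a fixed step against the full-batch gradient once per epoch. A minimal sketch (illustrative only, not the repo's benchmark code; the function name and defaults are made up):

```python
# Minimal sketch: fixed-step batch gradient descent for one-feature
# linear regression with squared loss.
# Model: y_hat = w * x + b;  loss = (1/n) * sum((y_hat - y)^2).

def batch_gradient_descent(xs, ys, lr=0.05, epochs=500):
    """Return (w, b) after `epochs` full-batch updates at step size `lr`."""
    n = len(xs)
    w, b = 0.0, 0.0
    for _ in range(epochs):
        # Gradients of the mean squared loss with respect to w and b.
        grad_w = (2 / n) * sum((w * x + b - y) * x for x, y in zip(xs, ys))
        grad_b = (2 / n) * sum((w * x + b - y) for x, y in zip(xs, ys))
        # Deterministic fixed-step update: one step per epoch.
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b
```

On data generated from y = 2x + 1, the loop recovers w close to 2 and b close to 1; because both the model and the loss are smooth and convex, the update rule is completely explicit and converges for any sufficiently small learning rate.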

Learning Sources

| Source | Type |
| --- | --- |
| Stanford CS229 notes on linear regression | Course |
| MIT Open Learning Library gradient descent chapter | Course |
| MIT nonlinear optimization notes on gradient descent | Course |

Repo Companion Material

| Material | Type |
| --- | --- |
| Gradient Descent hot sheet | quick reference |
| Linear Regression Gradient Descent Benchmark | flagship note |
| Machine Learning Algorithms tutorial | compare point |
| Template Library exact starter route | starter route |

Curated External Problems

Core

| Problem | Source | Difficulty | Context | Style | Prerequisites | Tags | Why it fits |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Linear Regression Gradient Descent Benchmark | Stanford CS229 | Theory | Smooth Optimization | Theory Benchmark; Deterministic Update Rule | Affine Models; Derivatives; Basic Algebra | Linear Regression; Squared Loss; Learning Rate | The cleanest first gradient-descent benchmark for this repo because one affine model and one smooth loss make the update rule completely explicit without pretending to cover modern optimizer families. |

Stretch

| Problem | Source | Difficulty | Context | Style | Prerequisites | Tags | Why it fits |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Gradient descent | MIT Open Learning Library | Theory | Further Theory | Course Notes; Theory Breadth | Gradient; Learning Rate | Convexity; Step Size; Convergence; Optimization | A good follow-up once the repo benchmark is clear and you want broader smooth-optimization intuition without widening the lane into full numerical optimization coverage. |

Repo Problems

| Code | Title | Fit | Difficulty | Pattern | Note | Solution |
| --- | --- | --- | --- | --- | --- | --- |
| LINEARREGRESSIONGD | Linear Regression Gradient Descent Benchmark | primary | medium | - | Note | Code |

Regeneration

```
python3 scripts/generate_problem_catalog.py
```