Optimization and Inference
inference, optimization, latent-variables, posterior, applications
1 Why This Section Exists
Many readers can follow the individual math topics but still lack a clean picture of what an inference problem actually is.
This hub is for the moment when you want to answer questions like:
- what exactly is hidden, observed, or only partially measured?
- where do likelihoods, priors, regularizers, and posteriors all come from?
- when should a problem be solved by optimization, sequential estimation, variational approximation, or sampling?
The rule for this section is simple:
every inference page should point back to the exact observed data, hidden variables, and approximation choices it uses.
2 What Optimization And Inference Keeps Reusing
Across inverse problems, Bayesian estimation, latent-variable modeling, filtering, and active sensing, the same mathematical objects keep returning:
- observed data or measurements
- hidden states, parameters, or latent variables
- forward models and likelihoods
- priors, regularizers, or structural assumptions
- posterior questions, uncertainty summaries, or approximate surrogates
If you can identify those objects quickly, papers about inference stop feeling like separate dialects.
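As a concrete sketch, a toy linear-Gaussian inverse problem contains all five objects at once. Every name here (`A`, `x_true`, `lam`, the noise level) is invented for the illustration; this is a minimal example, not a canonical recipe:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hidden variables: the true signal we never observe directly.
x_true = np.array([2.0, -1.0])

# Forward model: a known linear map from hidden state to measurements.
A = np.array([[1.0, 0.5],
              [0.0, 1.0],
              [1.0, 1.0]])

# Observed data: forward-model output corrupted by Gaussian noise,
# which also fixes the likelihood.
noise_std = 0.1
y = A @ x_true + noise_std * rng.standard_normal(3)

# Prior / regularizer: a zero-mean Gaussian on x with strength lam,
# i.e. ridge regularization.
lam = 0.5

# Posterior question: here, just the MAP estimate, which for this
# model is the ridge solution of a small linear system.
x_map = np.linalg.solve(A.T @ A / noise_std**2 + lam * np.eye(2),
                        A.T @ y / noise_std**2)
print(x_map)  # close to x_true because the noise is small
```

Swapping any one ingredient (a nonlinear forward model, a heavier-tailed prior, a full posterior instead of a point estimate) changes which computational strategy is appropriate, which is exactly the question the pages in this section keep asking.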
3 Start Here By Interest
3.1 If You Want The Shortest Math-to-Inference Entry
Start in this order:
3.3 If You Care Most About Approximation And Uncertainty
Start with:
4 First-Pass Route
The cleanest live first-pass route in this section right now is:
- Measurements, Models, and Hidden Variables
- Likelihoods, Priors, and MAP Estimation
- Filtering, Smoothing, and Hidden-State Inference
- Variational Inference, ELBO, and Tractable Approximation
- Sampling, Mixing, and MCMC for Inference
- Bayesian Optimization, Active Sensing, and Information Gathering
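The hidden-state stop on this route can be previewed with a minimal one-dimensional Kalman filter: a Gaussian belief over a hidden random walk, updated one noisy observation at a time. All constants and variable names below are invented for the sketch:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hidden state: a 1-D random walk observed through noise.
n_steps = 100
process_std, obs_std = 0.1, 0.5
x = np.cumsum(process_std * rng.standard_normal(n_steps))
y = x + obs_std * rng.standard_normal(n_steps)

# Kalman filter: keep a Gaussian belief (mean m, variance v) over
# the hidden state and fold in one observation per step.
m, v = 0.0, 1.0
estimates = []
for obs in y:
    # Predict: the random walk adds process noise to the belief.
    v += process_std**2
    # Update: blend prediction and observation by their precisions.
    k = v / (v + obs_std**2)   # Kalman gain
    m += k * (obs - m)
    v *= (1 - k)
    estimates.append(m)
estimates = np.array(estimates)

# The filtered track should beat the raw observations.
print(np.mean((estimates - x)**2), np.mean((y - x)**2))
```

The point of placing this mid-route is that the same five objects reappear: hidden state, forward model, likelihood, prior (the initial belief), and a posterior summary (mean and variance) maintained sequentially.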
Use this hub when you want the shortest translation from pure math into the recurring question:
what hidden quantity am I trying to infer from incomplete or noisy observations, and what computational strategy should I trust?
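That recurring question can be made concrete with the smallest possible case, inferring a hidden scalar mean from noisy observations under a conjugate Gaussian prior, where the trustworthy "computational strategy" is a closed-form posterior. The numbers here (`mu_true`, the variances) are illustrative only:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hidden quantity: an unknown scalar mu, seen only through noise.
mu_true = 3.0
noise_var = 1.0
y = mu_true + np.sqrt(noise_var) * rng.standard_normal(50)

# Prior: mu ~ N(0, prior_var). With a Gaussian likelihood this is
# conjugate, so the posterior is Gaussian in closed form.
prior_var = 10.0
post_var = 1.0 / (1.0 / prior_var + len(y) / noise_var)
post_mean = post_var * (y.sum() / noise_var)

# Posterior mean shrinks toward the sample mean as data accumulates,
# and posterior variance shrinks roughly like noise_var / n.
print(post_mean, post_var)
```

When conjugacy fails, this is exactly the point where the route above branches into variational approximation or sampling.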
5 How To Use This Section
- Use Topics when you want the math itself.
- Use Applications > Optimization and Inference when you want the translation layer from models and objectives into real inference tasks.
- Use Paper Lab when the inference objects feel clear and you want paper-reading practice.
6 Sources and Further Reading
- 6.011 / Signals, Systems and Inference - First pass - official MIT course showing how observation models and inference questions arise together. Checked 2026-04-26.
- 16.322 / Stochastic Estimation and Control - First pass - official MIT anchor for hidden-state estimation and model-based inference. Checked 2026-04-26.
- EE278 / Introduction to Statistical Signal Processing - Second pass - official Stanford course anchor for estimation, filtering, and inference from noisy observations. Checked 2026-04-26.
- STATS 305B / Applied Statistics II - Second pass - official Stanford anchor for likelihoods, regularization, and modern statistical estimation. Checked 2026-04-26.
- CS238 / Decision Making Under Uncertainty - Bridge to sequential inference - useful once inference and planning begin to overlap. Checked 2026-04-26.