Recurrences and Asymptotics

How recursively defined quantities, growth rates, and asymptotic notation organize discrete structure and algorithmic scaling.
Modified: April 26, 2026

Keywords

recurrence, asymptotics, big-O, master theorem, induction

1 Role

This page is the second step of the discrete-math module.

It takes the finite-structure viewpoint from counting and turns it into a language for growth:

  • sequences defined from earlier terms
  • recursive counting patterns
  • runtime-style recurrences
  • asymptotic comparison between functions

2 First-Pass Promise

Read this page after Counting and Combinatorics.

If you stop here, you should still understand:

  • what a recurrence is
  • why base cases matter
  • what \(O\), \(\Theta\), and \(\Omega\) are trying to summarize
  • how simple recurrences translate into growth rates
  • why this page is a bridge toward algorithms and later numerical or probabilistic scaling arguments

3 Why It Matters

Many discrete objects are hard to describe directly.

Instead, they are defined by:

  • how the next value depends on earlier ones
  • how a problem splits into smaller subproblems
  • how a counting process grows as size increases

That is the world of recurrences.

Asymptotics then answers a different but closely related question:

once a formula or recurrence is known, which growth feature actually matters at large scale?

Together these ideas show up in:

  • divide-and-conquer algorithms
  • graph and tree recurrences
  • dynamic programming states
  • combinatorial sequences
  • complexity comparisons

4 Prerequisite Recall

  • Counting and Combinatorics already trained the habit of describing objects by structured choices
  • Induction is the proof pattern that naturally matches recursively defined sequences
  • growth comparison is often about dominant terms, not exact constants

5 Intuition

A recurrence is a rule that says:

to know the value at step \(n\), look at smaller steps first

For example:

\[ a_n = a_{n-1} + 3 \]

means each term is obtained by adding the same amount to the previous one; starting from 2, for instance, the terms run 2, 5, 8, 11, and so on.

More complicated recurrences encode:

  • branching processes
  • divide-and-conquer structure
  • tree growth
  • repeated halving or splitting

Asymptotic notation is what we use after that, when exact formulas become less important than growth type.

For example, these functions do not grow in the same way:

  • \(n\)
  • \(n \log n\)
  • \(n^2\)
  • \(2^n\)

Even if constants or lower-order terms differ, the dominant growth pattern controls the large-scale story.
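
To see the separation concretely, here is a minimal Python sketch (the sample sizes 10, 20, and 40 are arbitrary illustration choices) that evaluates all four functions side by side:

    # Evaluate the four growth classes at a few sizes;
    # the gaps widen quickly even at modest n.
    import math

    for n in (10, 20, 40):
        print(f"n={n:>2}  n*log2(n)={n * math.log2(n):8.1f}  "
              f"n^2={n**2:5}  2^n={2**n:14}")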

6 Formal Core

Definition 1 (Definition: Recurrence) A recurrence relation defines a sequence by expressing each term in terms of earlier terms.

A simple example is

\[ a_n = a_{n-1} + 3 \quad \text{for } n \ge 1, \]

together with a base case such as

\[ a_0 = 2. \]

Without the base case, the rule alone does not determine the sequence.
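
As a quick illustration, here is a minimal Python sketch of this definition (the function name a and its default arguments are ours, chosen to match the example above):

    # a_n = a_{n-1} + 3 with base case a_0 = 2.
    # The base case pins down the whole sequence; the update
    # rule alone could start anywhere.
    def a(n, base=2, step=3):
        value = base             # a_0
        for _ in range(n):       # apply the recurrence n times
            value += step        # a_k = a_{k-1} + 3
        return value

    print([a(n) for n in range(5)])   # [2, 5, 8, 11, 14]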

Definition 2 (Definition: Asymptotic Notation) For functions \(f\) and \(g\) on the positive integers:

  • \(f(n) = O(g(n))\) means \(f\) is eventually bounded above by a constant multiple of \(g\)
  • \(f(n) = \Omega(g(n))\) means \(f\) is eventually bounded below by a constant multiple of \(g\)
  • \(f(n) = \Theta(g(n))\) means both bounds hold, so \(f\) and \(g\) have the same asymptotic growth rate up to constants

The point is to ignore constant factors and small-\(n\) behavior and keep the large-\(n\) growth pattern.
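
Spelled out with quantifiers, the upper-bound version reads:

\[ f(n) = O(g(n)) \iff \exists\, C > 0 \ \exists\, n_0 \ \forall\, n \ge n_0: \ f(n) \le C\, g(n). \]

Reversing the inequality gives \(\Omega\), and requiring both bounds (possibly with different constants) gives \(\Theta\).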

Theorem 1 (Theorem Idea: Repeated Expansion For Simple Recurrences) If

\[ a_n = a_{n-1} + c \]

with base case \(a_0 = b\), then repeated substitution gives

\[ a_n = b + cn. \]

So the recurrence defines a linearly growing sequence, which means

\[ a_n = \Theta(n) \]

when \(c \neq 0\).
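
Unrolling the recurrence one step at a time makes the substitution pattern explicit:

\[ a_n = a_{n-1} + c = a_{n-2} + 2c = \cdots = a_0 + nc = b + cn, \]

and a short induction on \(n\) confirms the closed form.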

Theorem 2 (Theorem Idea: Divide-and-Conquer Growth) Recurrences of the form

\[ T(n) = aT(n/b) + f(n) \]

encode a common algorithmic pattern:

  • split into \(a\) subproblems
  • each subproblem has size about \(n/b\)
  • pay additional nonrecursive work \(f(n)\)

The large-scale growth depends on the balance between:

  • the branching term \(aT(n/b)\)
  • the extra work \(f(n)\)

This is the intuition behind master-theorem style reasoning.
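
For reference, one standard statement of the master theorem compares \(f(n)\) with the critical rate \(n^{\log_b a}\):

  • if \(f(n) = O(n^{\log_b a - \varepsilon})\) for some \(\varepsilon > 0\), the branching dominates and \(T(n) = \Theta(n^{\log_b a})\)
  • if \(f(n) = \Theta(n^{\log_b a})\), the two balance and \(T(n) = \Theta(n^{\log_b a} \log n)\)
  • if \(f(n) = \Omega(n^{\log_b a + \varepsilon})\) for some \(\varepsilon > 0\), and \(a f(n/b) \le c f(n)\) eventually for some constant \(c < 1\), the extra work dominates and \(T(n) = \Theta(f(n))\)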

Theorem 3 (Theorem Idea: Dominant-Term Comparison) For asymptotic growth, the largest-order term eventually dominates lower-order terms and constant factors.

For example,

\[ 7n^2 + 3n + 10 = \Theta(n^2). \]

This is why asymptotic notation compresses formulas without erasing the growth behavior that matters most.
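
Explicit constants witness the claim: for all \(n \ge 1\),

\[ 7n^2 \le 7n^2 + 3n + 10 \le 20n^2, \]

since \(3n \le 3n^2\) and \(10 \le 10n^2\) once \(n \ge 1\).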

7 Worked Example

Consider the recurrence

\[ T(n) = 2T(n/2) + n \]

for powers of two, with base case

\[ T(1) = 1. \]

This is the basic divide-and-conquer pattern:

  • split into two subproblems of half size
  • do additional linear work

7.1 Step 1: Expand once

\[ T(n) = 2\left(2T(n/4) + n/2\right) + n = 4T(n/4) + 2n. \]

7.2 Step 2: Expand again

\[ T(n) = 4\left(2T(n/8) + n/4\right) + 2n = 8T(n/8) + 3n. \]

A pattern appears: after \(k\) expansions,

\[ T(n) = 2^k T(n/2^k) + kn. \]

7.3 Step 3: Stop at the base case

Set \(n/2^k = 1\), so \(2^k = n\) and \(k = \log_2 n\).

Then

\[ T(n) = nT(1) + n\log_2 n = n + n\log_2 n. \]

So

\[ T(n) = \Theta(n \log n). \]
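
As a numerical sanity check, here is a minimal Python sketch (restricted to powers of two, as above) comparing the recurrence against the closed form \(n + n\log_2 n\):

    import math

    def T(n):
        if n == 1:
            return 1                 # base case T(1) = 1
        return 2 * T(n // 2) + n     # T(n) = 2T(n/2) + n

    for k in range(1, 6):
        n = 2 ** k
        print(n, T(n), n + n * math.log2(n))   # the two columns agree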

This example matters because it shows both ideas at once:

  • a recurrence captures recursive structure
  • asymptotic notation extracts the real growth story

8 Computation Lens

For first-pass recurrence work, the main tactics are (see the sketch after this list):

  1. compute the first few terms
  2. expand the recurrence repeatedly
  3. look for a stable pattern
  4. guess the closed form or asymptotic rate
  5. check it, often by induction
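
Here is a minimal Python sketch of these tactics on the running example \(a_n = a_{n-1} + 3\) with \(a_0 = 2\) (the guessed closed form \(3n + 2\) is the one repeated expansion suggests in Theorem 1):

    def first_terms(k):
        terms = [2]                       # tactic 1: start from a_0 = 2
        for _ in range(k - 1):
            terms.append(terms[-1] + 3)   # tactics 2-3: expand, watch for a pattern
        return terms

    def guess(n):
        return 3 * n + 2                  # tactic 4: guessed closed form

    # tactic 5: check the guess against the computed terms
    assert all(guess(n) == t for n, t in enumerate(first_terms(10)))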

For asymptotics, the main habit is:

  1. identify the dominant term
  2. ignore lower-order terms and constant factors only after the dominant structure is clear
  3. keep track of whether you need an upper bound, lower bound, or tight bound
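
For instance, in

\[ f(n) = 5n^2 + n\log_2 n + 40, \]

the dominant term is \(n^2\): the tight statement is \(f(n) = \Theta(n^2)\), while quoting only \(f(n) = O(n^2)\) records just the upper bound.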

9 Application Lens

This page is one of the main bridges from foundation math into algorithms.

A recurrence often appears when an algorithm:

  • splits a problem recursively
  • scans data at each level
  • recombines partial answers

Then asymptotic notation tells you whether the growth is:

  • linear
  • near-linear
  • quadratic
  • exponential

The exact same habits later reappear in:

  • graph algorithms
  • dynamic programming analyses
  • numerical iteration counts
  • probabilistic scaling arguments

10 Stop Here For First Pass

If you can now explain:

  • why recurrences need base cases
  • how repeated expansion works
  • what \(O\), \(\Omega\), and \(\Theta\) are tracking
  • why \(n \log n\) and \(n^2\) are genuinely different growth classes

then this page has done its job.

11 Sources and Further Reading

Sources checked online on 2026-04-25:

  • MIT 6.1200J lecture 06
  • MIT 6.1200J lecture 07
  • MIT 6.1200J course page
  • MIT 6.042J course page
  • CMU 15-151 course page