Master Theorem

Ashish Pratap Singh

When you study recursive algorithms like Merge Sort, Binary Search, or Quick Sort, you’ll often end up writing a recurrence relation like:

T(n) = 2T(n/2) + n
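For instance, here is a minimal Merge Sort sketch in Python, with comments marking where the 2T(n/2) and the + n terms of that recurrence come from:

```python
def merge_sort(arr):
    # Base case: lists of size 0 or 1 are already sorted -> constant work
    if len(arr) <= 1:
        return arr

    mid = len(arr) // 2
    left = merge_sort(arr[:mid])    # T(n/2): sort the left half
    right = merge_sort(arr[mid:])   # T(n/2): sort the right half

    # Merge the two halves: one pass over all n elements -> the "+ n" term
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 9, 1, 7, 3]))  # [1, 2, 3, 5, 7, 9]
```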

At this point, most beginners get stuck: “How do I find the time complexity from this?”

You could expand it manually… or use recursion trees…

But there’s a faster way.

That’s where the Master Theorem comes in — a direct formula to solve most divide-and-conquer recurrences in seconds.

What is the Master Theorem?

The Master Theorem applies to recurrences of the form:

T(n) = aT(n/b) + f(n)

Where:

  • a: Number of subproblems the problem is divided into
  • n/b: Size of each subproblem
  • f(n): Cost of dividing, merging, or combining the results

It helps us find the asymptotic time complexity (Big O) directly without manually expanding the recurrence.

The Three Cases

We analyze recurrences of the form:

T(n) = aT(n/b) + f(n), with constants a ≥ 1 and b > 1

Define the benchmark work from the recursion tree (ignoring f):

n^(log_b a)

Now compare f(n) with this benchmark, n^(log_b a):

Case 1 — Subproblem Work Dominates

Condition:

f(n) = O(n^(log_b a − ε)) for some constant ε > 0

This means the non-recursive work f(n) grows polynomially slower than the work contributed by the subproblems.

Result:

T(n) = Θ(n^(log_b a))

Intuition: Most of the time is spent inside recursion; the combine work is negligible by comparison.

Example: T(n) = 8T(n/2) + n²

Here: a = 8, b = 2 → log₂8 = 3, so the benchmark is n³. Since f(n) = n² grows polynomially slower than n³, Case 1 gives T(n) = Θ(n³).
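A quick way to convince yourself is to expand the recurrence numerically. The sketch below assumes the recurrence above, T(n) = 8T(n/2) + n², with T(1) = 1 as the base case, and prints T(n)/n³ for powers of two; the ratio settles toward a constant, which is exactly what Θ(n³) means.

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # T(n) = 8·T(n/2) + n², with T(1) = 1 (assumed base case)
    if n <= 1:
        return 1
    return 8 * T(n // 2) + n * n

for k in range(2, 12):
    n = 2 ** k
    print(n, T(n) / n**3)  # ratio approaches a constant (~2) → T(n) = Θ(n³)
```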

Case 2 — Both Parts Balance

Condition:

f(n) = Θ(n^(log_b a))

That means the recursive work and the combining work grow equally fast.

Result:

T(n) = Θ(n^(log_b a) · log n)

Intuition: Each level of recursion does the same amount of work → multiply by log n levels.

Example: T(n) = 2T(n/2) + n

Here:

  • a = 2, b = 2 → log₂2 = 1
  • f(n) = n = Θ(n¹) → same order as the benchmark: T(n) = Θ(n log n)
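In fact, for powers of two this recurrence can be checked exactly. A small sketch (assuming T(1) = 1 as the base case): the expansion works out to n·log₂n + n on the nose.

```python
def T(n):
    # T(n) = 2·T(n/2) + n, with T(1) = 1 (assumed base case)
    if n <= 1:
        return 1
    return 2 * T(n // 2) + n

for k in range(1, 11):
    n = 2 ** k
    assert T(n) == n * k + n  # exactly n·log₂(n) + n → Θ(n log n)
    print(n, T(n))
```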

Case 3 — Outside Work Dominates

If

f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0,

and if a × f(n/b) ≤ c × f(n) for some constant c < 1 and sufficiently large n (called the regularity condition),

then:

T(n) = Θ(f(n))

Intuition: The combine step f(n) is doing so much work that the recursive calls become insignificant.

Example: T(n) = 2T(n/2) + n²

Here:

  • a = 2, b = 2 → log₂2 = 1, so the benchmark is n¹
  • f(n) = n² = Ω(n^(1+ε)), and the regularity condition holds (2 × (n/2)² = n²/2 ≤ ½ × n²) → Case 3: T(n) = Θ(n²)
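When f(n) is a plain polynomial n^d, which is the common case, the whole theorem reduces to comparing d with log_b a. Here is a minimal Python sketch of that comparison; master_case is just an illustrative name, and it deliberately ignores non-polynomial f(n):

```python
import math

def master_case(a, b, d):
    """Classify T(n) = a·T(n/b) + Θ(n^d) using the Master Theorem.
    Illustrative sketch: handles only polynomial f(n) = n^d."""
    c = math.log(a, b)  # the benchmark exponent log_b(a)
    if math.isclose(d, c):
        return f"Case 2: Θ(n^{c:g} · log n)"
    if d < c:
        return f"Case 1: Θ(n^{c:g})"
    # d > c: for f(n) = n^d the regularity condition holds automatically,
    # because a · (n/b)^d = (a / b^d) · n^d and a / b^d < 1 when d > log_b(a).
    return f"Case 3: Θ(n^{d:g})"

print(master_case(8, 2, 2))  # Case 1: Θ(n^3)
print(master_case(2, 2, 1))  # Case 2: Θ(n^1 · log n)
print(master_case(2, 2, 2))  # Case 3: Θ(n^2)
```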

Step-by-Step Examples

Example 1: Binary Search

T(n) = T(n/2) + 1

a = 1, b = 2, f(n) = 1

Compute:

n^(log₂1) = n⁰ = 1

→ f(n) = Θ(n⁰) → Case 2: T(n) = Θ(log n)
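Where the recurrence comes from: a minimal recursive Binary Search sketch makes a single halving call and does constant work around it.

```python
def binary_search(arr, target, lo=0, hi=None):
    if hi is None:
        hi = len(arr) - 1
    if lo > hi:
        return -1                  # not found
    mid = (lo + hi) // 2           # constant work per call → f(n) = 1
    if arr[mid] == target:
        return mid
    if arr[mid] < target:
        return binary_search(arr, target, mid + 1, hi)  # one call on half the range
    return binary_search(arr, target, lo, mid - 1)      # only one branch runs → a = 1, b = 2

print(binary_search([1, 3, 5, 7, 9, 11], 7))  # 3
```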

Example 2: Merge Sort

T(n) = 2T(n/2) + n

a = 2, b = 2, f(n) = n

Compute:

n^(log₂2) = n¹ = n

→ f(n) = Θ(n¹) → Case 2: T(n) = Θ(n log n)

Example 3: Tower of Hanoi

T(n) = 2T(n−1) + 1

Not in Master Theorem form, because the subproblem size is (n−1), not (n/b).

Master Theorem not applicable

We’d need expansion or the recursion tree method; either way, the answer works out to Θ(2ⁿ).
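A quick numeric expansion (a sketch, assuming T(1) = 1 as the base case) makes the exponential growth obvious:

```python
def T(n):
    # Tower of Hanoi: T(n) = 2·T(n−1) + 1, with T(1) = 1
    if n == 1:
        return 1
    return 2 * T(n - 1) + 1

for n in range(1, 11):
    print(n, T(n))  # 1, 3, 7, 15, … = 2ⁿ − 1, i.e. Θ(2ⁿ)
```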

Example 4: Strassen’s Matrix Multiplication

T(n) = 7T(n/2) + n²

Here:

  • a = 7, b = 2 → log₂7 ≈ 2.81
  • f(n) = n² grows polynomially slower than the benchmark n^2.81 → Case 1: T(n) = Θ(n^(log₂7)) ≈ Θ(n^2.81)

This is how Strassen’s algorithm beats the classic O(n³) approach.

Example 5: Weird Recurrence

T(n) = 9T(n/3) + n

  • a = 9, b = 3 → log₃9 = 2
  • f(n) = n → smaller power than the benchmark n² → Case 1: T(n) = Θ(n²)
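A quick numerical check of that answer (a sketch, assuming T(1) = 1 as the base case): expanding the recurrence for n = 3ᵏ, the ratio T(n)/n² settles to a constant.

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # T(n) = 9·T(n/3) + n, with T(1) = 1 (assumed base case)
    if n <= 1:
        return 1
    return 9 * T(n // 3) + n

for k in range(1, 12):
    n = 3 ** k
    print(n, T(n) / n**2)  # ratio settles near 1.5 → T(n) = Θ(n²)
```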

When Master Theorem Fails

The Master Theorem is powerful but not universal.

You cannot use it when:

  • Uneven subproblem sizes: T(n) = T(n/2) + T(n/3) + n
  • Additive constants instead of fractions: T(n) = T(n−1) + n
  • f(n) not positive / irregular: T(n) = 2T(n/2) − n
  • f(n) not polynomially comparable to the benchmark: T(n) = 2T(n/2) + n log n (n log n is larger than n, but not by a factor of n^ε)

In such cases, use the recursion tree or substitution method.
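For instance, the uneven split T(n) = T(n/2) + T(n/3) + n above is outside the theorem, but a recursion-tree argument (the per-level work shrinks geometrically because 1/2 + 1/3 < 1) gives Θ(n), and a memoized expansion (a sketch, assuming T(1) = 1 and integer division) agrees:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # Uneven split: T(n) = T(n/2) + T(n/3) + n, with T(1) = 1 (assumed)
    if n <= 1:
        return 1
    return T(n // 2) + T(n // 3) + n

for k in range(4, 24, 4):
    n = 2 ** k
    print(n, T(n) / n)  # ratio stays bounded → T(n) = Θ(n)
```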