When analyzing algorithms, recall that we only care about asymptotic behavior. Recursive algorithms are no different: rather than solving exactly the recurrence relation associated with the cost of an algorithm, it is enough to give an asymptotic characterization. The master method is the most useful and straightforward way to determine the time complexity of such recurrences. The approach was first presented by Jon Bentley, Dorothea Haken, and James B. Saxe in 1980, where it was described as a "unifying method" for solving such recurrences. The name "master theorem" was popularized by the widely used algorithms textbook Introduction to Algorithms by Cormen, Leiserson, Rivest, and Stein.
Algorithms are usually grouped into different types; some examples include greedy algorithms, recursive algorithms, dynamic programming, divide and conquer, etc. The master theorem is a method used to provide asymptotic analysis of recurrence relations that occur in many divide and conquer algorithms. A divide and conquer algorithm solves a problem by first breaking it into smaller subproblems, solving each subproblem individually, and then combining the results into a solution for the larger original problem.
There are three common methods used to analyze recurrence relations: the substitution method, the recursion-tree method, and the master method.
A recurrence relation is an equation that defines a sequence based on a rule that gives the next term as a function of the previous term(s). It’s a lot like recursive code:
- At least one base case and at least one recursive case
- Each case should include the values for n to which it corresponds
- The recursive case should reduce the input size in a way that eventually triggers the base case
- The cases of your recurrence usually correspond exactly to the cases of the code
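As an illustrative sketch (the function and example are our own, not from any particular source), the correspondence between recursive code and its recurrence can be seen in a simple linear-recursive sum:

```python
def total(values):
    """Sum a list recursively.

    Recurrence for the running time:
      T(0) = c            (base case: empty list)
      T(n) = T(n - 1) + c (recursive case: one element plus a smaller call)
    which solves to T(n) = O(n).
    """
    if not values:          # base case: n = 0
        return 0
    # recursive case: reduce the input size by one, guaranteeing
    # that the base case is eventually reached
    return values[0] + total(values[1:])

print(total([1, 2, 3, 4]))  # → 10
```

Note how each bullet above maps onto a line of the code: the `if not values` branch is the base case, and the recursive call on `values[1:]` shrinks the input toward it.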
• The time complexity of a recursive algorithm is represented in the form of a recurrence relation.
Big-Oh, Big-Omega, and Big-Theta
• Big-Oh notation (O) is used to describe an asymptotic upper bound.
- My code takes at most this long to run
• Big-Omega (Ω) is an asymptotic lower bound.
- My code takes at least this long to run
• Big Theta (Θ) is “equal to”
- My code takes “exactly”* this long to run
- *Except for constant factors and lower order terms
- Only exists when Big-Oh == Big-Omega!
• In industry, people often use Big-Oh to mean “Tight Big-Oh” and use it even when a Big-Theta exists.
Name | Big O |
---|---|
Constant | O(1) |
Linear | O(n) |
Quadratic | O(n²) |
Cubic | O(n³) |
Exponential | O(2ⁿ) |
Logarithmic | O(log n) |
Log Linear | O(n log n) |
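To get a rough feel for how far apart these classes are, a throwaway sketch (our own; n = 16 and log base 2 are arbitrary illustration choices) can evaluate each growth function from the table:

```python
import math

# Evaluate each growth class from the table above at a sample size.
# The choice n = 16 and log base 2 are arbitrary illustration choices.
n = 16
rows = [
    ("Constant",    1),
    ("Logarithmic", math.log2(n)),
    ("Linear",      n),
    ("Log Linear",  n * math.log2(n)),
    ("Quadratic",   n ** 2),
    ("Cubic",       n ** 3),
    ("Exponential", 2 ** n),
]
for name, value in rows:
    print(f"{name:<12} {value:>8g}")
```

Even at n = 16, the exponential row (65536) dwarfs the logarithmic row (4), which is why the asymptotic class matters far more than constant factors.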
Master's Method for Dividing Functions
The master theorem yields asymptotically tight bounds for recurrences that arise from divide and conquer algorithms which partition an input into smaller subproblems of equal size, solve the subproblems recursively, and then combine the subproblem solutions into a solution for the original problem. The running time of such an algorithm is the work performed at the top level of the recursion (dividing the problem into subproblems and combining the subproblem solutions) plus the time spent in the recursive calls. If T(n) denotes the total time for the algorithm on an input of size n, and f(n) denotes the amount of time taken at the top level of the recurrence, then the time can be expressed by a recurrence relation of the following form.
It is used to directly calculate the time complexity function of 'dividing' recurrence relations of the form:
• T(n) = aT(n/b) + f(n)
• where f(n) = Θ(n^k · log^p n)
• Compare log_b a with k to decide the final time complexity function.
• a represents the number of subproblems in the recursion, and it must be greater than or equal to one (a ≥ 1).
• n/b is the size of each subproblem (each subproblem is assumed to have the same size), and b must be greater than one (b > 1) to ensure a proper divide and conquer recurrence analysis.
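The comparison of log_b a with k (and the log exponent p) is mechanical enough to automate. The helper below is our own illustration of the extended master theorem cases, not a standard library routine, and returns the bound as a string:

```python
import math

def master_theorem(a, b, k, p):
    """Asymptotic solution of T(n) = a*T(n/b) + Θ(n^k * log^p n)
    by the (extended) master theorem.  Assumes a >= 1 and b > 1.
    Illustrative helper; returns the bound as a string."""
    c = math.log(a, b)             # critical exponent log_b(a)
    eps = 1e-9                     # tolerance for float comparison
    if c > k + eps:                # Case 1: the recursion dominates
        return f"Θ(n^{c:g})"
    if abs(c - k) <= eps:          # Case 2: both sides are balanced
        if p > -1:
            return f"Θ(n^{k:g} log^{p + 1:g} n)"
        if p == -1:
            return f"Θ(n^{k:g} log log n)"
        return f"Θ(n^{k:g})"
    # Case 3: the work outside the recursion dominates
    return f"Θ(n^{k:g} log^{p:g} n)" if p >= 0 else f"O(n^{k:g})"

# Merge sort: T(n) = 2T(n/2) + Θ(n)  →  Θ(n log n)
print(master_theorem(2, 2, 1, 0))   # Θ(n^1 log^1 n)
# Binary search: T(n) = T(n/2) + Θ(1)  →  Θ(log n)
print(master_theorem(1, 2, 0, 0))   # Θ(n^0 log^1 n)
```

The floating-point tolerance `eps` is needed because `math.log(a, b)` is computed numerically; exact equality tests on it would be fragile.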
The master theorem states that to solve the recurrence T(n), we compute n^(log_b a) and compare it with f(n). The final time complexity comes out to be whichever of the two dominates the other.
Master's Method for Decreasing Functions
The master theorem also has a form for 'decreasing' recurrences: if T(n) = aT(n − b) + f(n),
where:
n = input size (or the size of the problem)
a = count of subproblems in the decreasing recursive function
n − b = size of each subproblem (assuming each subproblem has the same size)
f(n) is asymptotically positive (asymptotically positive means that the function is positive for all sufficiently large n)
and a > 0, b > 0, f(n) = O(n^k) with k ≥ 0, then:
Case 1: if a = 1,
T(n) = O(n · f(n)) or O(n^(k+1))
1) T(n) = T(n − 1) + 1 → O(n)
2) T(n) = T(n − 1) + n → O(n²)
3) T(n) = T(n − 1) + log n → O(n log n)
Case 2: if a > 1,
T(n) = O(a^(n/b) · f(n)) or O(a^(n/b) · n^k)
1) T(n) = 2T(n − 1) + 1 → O(2ⁿ)
2) T(n) = 3T(n − 1) + 1 → O(3ⁿ)
3) T(n) = 2T(n − 1) + n → O(n · 2ⁿ)
Case 3: if a < 1,
T(n) = O(f(n)) or O(n^k)
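The three cases above can be sketched as a small helper. This is our own illustration of the decreasing-function variant (not a library routine); it returns the bound as a string:

```python
def master_decreasing(a, b, k):
    """Asymptotic solution of T(n) = a*T(n - b) + O(n^k)
    by the master theorem for decreasing functions
    (a > 0, b > 0, k >= 0).  Illustrative helper."""
    if a == 1:                      # Case 1: ~n/b levels, each doing O(n^k) work
        return f"O(n^{k + 1})"
    if a > 1:                       # Case 2: the branching factor dominates
        return f"O(n^{k} * {a}^(n/{b}))"
    return f"O(n^{k})"              # Case 3: geometric decay, f(n) dominates

print(master_decreasing(1, 1, 1))   # T(n) = T(n-1) + n   → O(n^2)
print(master_decreasing(2, 1, 0))   # T(n) = 2T(n-1) + 1  → O(n^0 * 2^(n/1))
```

The two calls reproduce examples from Case 1 and Case 2 above.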
We cannot apply the master theorem in the following cases:
• T(n) is not monotone, e.g. T(n) = sin(n)
• f(n) is not a polynomial, e.g. f(n) = 2ⁿ
• a is not a constant, e.g. a = 2n
Merge Sort: T(n) = 2T(n/2) + Θ(n). It falls in case 2 of the dividing form, as k = 1 and log_b a = log₂ 2 = 1.
So the solution is Θ(n log n).
Binary Search: T(n) = T(n/2) + Θ(1). It also falls in case 2, as k = 0 and log_b a = log₂ 1 = 0.
So the solution is Θ(log n).
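To make the merge sort recurrence concrete, here is a minimal sketch of the algorithm with the recurrence annotated at each step (the implementation details are our own):

```python
def merge_sort(arr):
    """Sort by divide and conquer.  T(n) = 2T(n/2) + Θ(n):
    two recursive calls on halves, plus a linear-time merge."""
    if len(arr) <= 1:                     # base case: T(1) = Θ(1)
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])          # T(n/2)
    right = merge_sort(arr[mid:])         # T(n/2)
    # merge step: Θ(n) work at the top level of the recursion
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 4, 6, 1, 3]))  # → [1, 2, 3, 4, 5, 6]
```

The comments mark exactly where the recurrence's terms come from: the two `T(n/2)` calls and the Θ(n) merge loop.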
Akra–Bazzi Theorem
The Akra–Bazzi method, or Akra–Bazzi theorem, is used to analyze the asymptotic behavior of the mathematical recurrences that appear in the analysis of divide and conquer algorithms where the sub-problems have substantially different sizes. It is a generalization of the master theorem for divide-and-conquer recurrences, which assumes that the sub-problems have equal size.
The restricted form of the recurrence limits the usability of the master theorem; for example, recurrences whose subproblems have unequal sizes cannot be solved directly with it.
The Akra-Bazzi method can be applied to recurrences of the following form:

T(n) = g(n) + Σ_{i=1..k} a_i · T(b_i · n + h_i(n)), for n ≥ n₀

where a_i, b_i, and k are constants such that:

• k ≥ 1 is an integer constant and a_i > 0 for all i
• 0 < b_i < 1 for all i
• g(n) = O(n^c) for some constant c, and |h_i(n)| = O(n / (log n)²) for all i

Next, find p such that

Σ_{i=1..k} a_i · b_i^p = 1

Then

T(n) = Θ( n^p ( 1 + ∫₁ⁿ g(u) / u^(p+1) du ) )
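Finding the exponent p generally requires numerical root-finding. Because every b_i lies in (0, 1), the sum Σ a_i·b_i^p is strictly decreasing in p, so a simple bisection works. The helper below is our own sketch (the coefficient lists in the demo calls are illustrative), assuming the root lies in the bracketing interval:

```python
def akra_bazzi_p(a, b, lo=-10.0, hi=10.0, tol=1e-12):
    """Numerically find p with sum(a_i * b_i**p) == 1 by bisection.
    Since each b_i is in (0, 1), the sum is strictly decreasing in p.
    Illustrative helper; assumes the root lies in [lo, hi]."""
    def excess(p):
        return sum(ai * bi ** p for ai, bi in zip(a, b)) - 1.0
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if excess(mid) > 0:   # sum still above 1: p must be larger
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

# 3*(1/4)^p + 2*(1/8)^p = 1  →  p = 1
print(round(akra_bazzi_p([3, 2], [0.25, 0.125]), 6))  # 1.0
# 9*(1/3)^p = 1  →  p = 2
print(round(akra_bazzi_p([9], [1 / 3]), 6))           # 2.0
```

Once p is known, the remaining work is evaluating (or bounding) the integral ∫₁ⁿ g(u)/u^(p+1) du.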
Example 1. T(n) = 3T(n/4) + 2T(n/8) + n
a1 = 3, b1 = 1/4
a2 = 2, b2 = 1/8
b1 and b2 are in the range (0, 1)
g(n) = Θ(n), which is O(n^c); here c can be 1.
In this problem h1(n) and h2(n) are not present.
Here p = 1 satisfies 3·(1/4)^p + 2·(1/8)^p = 3/4 + 1/4 = 1.
Finally, T(n) = Θ(n^p(1 + ∫₁ⁿ g(u)/u^(p+1) du)) = Θ(n(1 + ∫₁ⁿ (1/u) du)) = Θ(n(1 + ln n)) = Θ(n log n).
Example 2. T(n) = (1/2)T(n/2) + n²
a = 1/2
b = 1/2
g(n) = n²
b is in the range (0, 1)
g(n) = Θ(n²), which is O(n^c); here c can be 2.
In this problem h(n) is not present.
Here p = −1 satisfies a·b^p = (1/2)·(1/2)^(−1) = 1.
Finally, T(n) = Θ(n^p(1 + ∫₁ⁿ g(u)/u^(p+1) du)) = Θ(n^(−1)(1 + ∫₁ⁿ u² du)) = Θ(n^(−1) · n³) = Θ(n²).
Example 3. T(n) = 9T(n/3 + h(n)) + n
a = 9
b = 1/3
g(n) = Θ(n)
b is in the range (0, 1)
g(n) = Θ(n), which is O(n^c); here c can be 1.
h(n) is a small perturbation term with |h(n)| = O(n/(log n)²), so it does not affect the result.
Here p = 2 satisfies a·b^p = 9·(1/3)² = 1.
Finally, T(n) = Θ(n^p(1 + ∫₁ⁿ g(u)/u^(p+1) du)) = Θ(n²(1 + ∫₁ⁿ (1/u²) du)) = Θ(n²(2 − 1/n)) = Θ(n²).
Advantages:
• Handles recurrences the master theorem cannot: any number of recursive terms, subproblems of different sizes, and small perturbation terms h_i(n) (e.g. floors and ceilings).
Disadvantages:
• Requires solving Σ a_i·b_i^p = 1 for p, which often has no closed form, and then evaluating an integral, so it takes more work than applying the master theorem.