Sunday, August 4, 2013

Master Method for Solving Recurrences

Why the master method? The master method helps us analyze the running time of recursive algorithms.

So, let's consider recursive integer multiplication. Split each n-digit number into halves, writing x = 10^(n/2) a + b and y = 10^(n/2) c + d. Then we get equation 1:

x * y = 10^n (ac) + 10^(n/2) (ad + bc) + bd    ... (1)

The way we reason about recursive algorithms like these is with the help of a recurrence.

Recurrences
What is a recurrence?
Let T(n) be the maximum number of operations this algorithm needs to multiply two n-digit numbers. A recurrence expresses T(n) in terms of the work done by the recursive calls.

Ingredients of a recurrence
There are 2 ingredients of a recurrence: the base case and the general case.
So, for integer multiplication, we have following:

  1. Base case: T(1) <= constant.
  2. General case (for all n > 1): T(n) <= 4T(n/2) + O(n)

Here we have 4 times T(n/2) because 4 is the number of products we have to calculate recursively, and n/2 is the size of each subproblem.
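To make the ingredients concrete, here is a minimal Python sketch of the naive recursive algorithm, assuming for simplicity that n is a power of 2 (recursive_multiply is a name made up for this note):

```python
# Naive recursive multiplication: four recursive products give the
# 4*T(n/2) term; the shifts and additions are the O(n) combine work.
# Assumes n is a power of 2.
def recursive_multiply(x, y, n):
    if n == 1:                           # base case: T(1) <= constant
        return x * y
    half = n // 2
    a, b = divmod(x, 10 ** half)         # x = a * 10^(n/2) + b
    c, d = divmod(y, 10 ** half)         # y = c * 10^(n/2) + d
    ac = recursive_multiply(a, c, half)  # product 1
    ad = recursive_multiply(a, d, half)  # product 2
    bc = recursive_multiply(b, c, half)  # product 3
    bd = recursive_multiply(b, d, half)  # product 4
    # combine with O(n) shifts and additions (equation 1)
    return ac * 10 ** n + (ad + bc) * 10 ** half + bd

print(recursive_multiply(1234, 5678, 4))  # 7006652
```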

Algorithm 2 : 3 products rather than 4 for integer multiplication
Gauss observed that ad + bc = (a + b)(c + d) - ac - bd, so only 3 recursive products are needed. New recurrence:
Base case: T(1) <= constant.
General case: T(n) <= 3T(n/2) + O(n)
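A matching sketch of the 3-product version, under the same power-of-2 assumption (gauss_multiply is again a made-up name):

```python
# Gauss's trick: only three recursive products, so 3*T(n/2) + O(n).
def gauss_multiply(x, y, n):
    if n == 1:
        return x * y  # Python's * also absorbs any carry in a+b, c+d
    half = n // 2
    a, b = divmod(x, 10 ** half)
    c, d = divmod(y, 10 ** half)
    ac = gauss_multiply(a, c, half)            # product 1
    bd = gauss_multiply(b, d, half)            # product 2
    abcd = gauss_multiply(a + b, c + d, half)  # product 3
    # (a+b)(c+d) - ac - bd = ad + bc, recovering the middle term
    return ac * 10 ** n + (abcd - ac - bd) * 10 ** half + bd

print(gauss_multiply(1234, 5678, 4))  # 7006652
```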

Recurrence of MergeSort: T(n) <= 2T(n/2) + O(n)

Master method

Formal definition of master method

What makes the master method so awesome is that it can be thought of as a "black box" for solving recurrences. This means that you give it a few inputs and it tells you the running time of the algorithm.

Assumption – the master method assumes that all subproblems have the same size. For example, in MergeSort each recursive call works on a subproblem of size n/2. There are other, more general versions of the master method that drop this assumption.

Base Case:
T(n) ≤ a constant c, for all sufficiently small n
For larger n i.e. General case:
T(n) ≤ aT(n/b) + O(n^d)
  1. a is the number of recursive calls
  2. b is the factor by which the problem size shrinks. It must be greater than one or the problem will never get any smaller
  3. d is the exponent of the combining time. It can be as small as 0, which indicates that the combine step takes constant time (since n^0 = 1).
a, b, d must all be constants and independent of the input size.

So, given such a recurrence, the master method gives an upper bound on T(n):

  1. If a = b^d: T(n) = O(n^d log n)
  2. If a < b^d: T(n) = O(n^d)
  3. If a > b^d: T(n) = O(n^(log_b a))
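Because the master method really is a black box, it is easy to mechanize. Here is a small Python sketch (master_bound is a name invented for this note) that takes a, b, d and reports the applicable case and bound; the calls below preview the examples worked out later:

```python
import math

# Given T(n) <= a*T(n/b) + O(n^d), report the master-method case and bound.
def master_bound(a, b, d):
    if a == b ** d:
        return f"case 1: O(n^{d} log n)"
    elif a < b ** d:
        return f"case 2: O(n^{d})"
    else:
        return f"case 3: O(n^{math.log(a, b):.2f})"  # exponent is log_b(a)

print(master_bound(2, 2, 1))  # MergeSort            -> case 1: O(n^1 log n)
print(master_bound(1, 2, 0))  # binary search        -> case 1: O(n^0 log n) = O(log n)
print(master_bound(4, 2, 1))  # naive multiplication -> case 3: O(n^2.00)
print(master_bound(3, 2, 1))  # Gauss's method       -> case 3: O(n^1.58)
print(master_bound(7, 2, 2))  # Strassen             -> case 3: O(n^2.81)
print(master_bound(2, 2, 2))  # fictitious example   -> case 2: O(n^2)
```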

Notice the logarithm
In the first case, the base of the logarithm doesn't matter, because changing the base only changes the value by a constant factor: log_2 a = constant * log_e a (by the change-of-base formula).
However, in the third case the base matters a lot, because the logarithm appears in the exponent of n, where a constant-factor difference could mean the difference between linear and quadratic time.

Examples
Example 1 - Merge Sort
Let's look at MergeSort as an example. We know T(n) = 2T(n/2) + O(n).
We have a = 2, b = 2, and d = 1. Since a = b^d, we are in case 1. The running time will be O(n^d log n), which is just O(n log n).
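As a concrete reminder of where a = 2, b = 2, d = 1 come from, a minimal MergeSort sketch (a hypothetical implementation, not taken from the course):

```python
# Two recursive calls on halves (a=2, b=2) plus a linear merge (d=1):
# exactly T(n) = 2T(n/2) + O(n).
def merge_sort(xs):
    if len(xs) <= 1:              # base case
        return xs
    mid = len(xs) // 2
    left = merge_sort(xs[:mid])   # recursive call 1
    right = merge_sort(xs[mid:])  # recursive call 2
    merged, i, j = [], 0, 0       # merge: O(n) work
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]

print(merge_sort([5, 2, 4, 7, 1, 3, 2, 6]))  # [1, 2, 2, 3, 4, 5, 6, 7]
```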

Example 2 - Binary Search
For binary search, we split the problem into two halves of n/2 input size each BUT we recurse on only one of them, after comparing the element we're searching for with the center element, which takes constant time.
T(n) = T(n/2) + O(1)
That is, we have 1 recursive call on a problem of size n/2, plus O(1) time for the comparison.
Thus, we have a = 1, b = 2, d = 0 and a = b^d (since 1 = 2^0), so it becomes case 1, and the overall runtime is O(n^0 log n) = O(log n).
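A minimal recursive sketch showing that only one half is ever recursed on (again hypothetical code):

```python
# One recursive call on half the range (a=1, b=2) plus an O(1)
# comparison (d=0): T(n) = T(n/2) + O(1).
def binary_search(xs, target, lo=0, hi=None):
    if hi is None:
        hi = len(xs)
    if lo >= hi:                 # base case: empty range
        return -1
    mid = (lo + hi) // 2
    if xs[mid] == target:        # O(1) comparison
        return mid
    if xs[mid] < target:
        return binary_search(xs, target, mid + 1, hi)  # right half only
    return binary_search(xs, target, lo, mid)          # left half only

print(binary_search([1, 3, 5, 7, 9, 11], 7))  # 3
```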

Example 3 - Integer multiplication
1st recurrence (naive algorithm):
a = 4, b = 2, d = 1
T(n) = 4T(n/2) + O(n)
a > b^d (since 4 > 2), so it is case 3. Hence we have O(n^(log_2 4)), i.e. O(n^2).

2nd recurrence (Gauss's method):
T(n) = 3T(n/2) + O(n)
a = 3, b = 2, d = 1
a > b^d, i.e. case 3 again. Hence we have O(n^(log_2 3)), i.e. roughly O(n^1.59).

Example 4 - Matrix multiplication using Strassen's method
T(n) = 7T(n/2) + O(n^2)
a = 7
b = 2
d = 2
a > b^d (since 7 > 4), which implies case 3, so T(n) = O(n^(log_2 7)) = O(n^2.81)
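For completeness, a compact NumPy sketch of where the seven products come from, assuming square matrices whose size is a power of 2 (the block names A..H are made up for this note):

```python
import numpy as np

# Strassen: seven recursive products (7*T(n/2)) plus O(n^2) block
# additions/subtractions. Assumes n x n matrices with n a power of 2.
def strassen(X, Y):
    n = X.shape[0]
    if n == 1:
        return X * Y
    h = n // 2
    A, B, C, D = X[:h, :h], X[:h, h:], X[h:, :h], X[h:, h:]
    E, F, G, H = Y[:h, :h], Y[:h, h:], Y[h:, :h], Y[h:, h:]
    p1 = strassen(A, F - H)          # the seven products
    p2 = strassen(A + B, H)
    p3 = strassen(C + D, E)
    p4 = strassen(D, G - E)
    p5 = strassen(A + D, E + H)
    p6 = strassen(B - D, G + H)
    p7 = strassen(A - C, E + F)
    top = np.hstack([p5 + p4 - p2 + p6, p1 + p2])   # O(n^2) combine
    bot = np.hstack([p3 + p4, p1 + p5 - p3 - p7])
    return np.vstack([top, bot])

X = np.array([[1, 2], [3, 4]])
Y = np.array([[5, 6], [7, 8]])
print(strassen(X, Y))  # matches X @ Y: [[19 22] [43 50]]
```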

Example 5 - Fictitious recurrence (just to cover case 2)
T(n) = 2T(n/2) + O(n^2)
a = 2, b = 2, d = 2
a < b^d (since 2 < 4), so it is case 2. The time complexity is O(n^d) = O(n^2).
