8 pts for correctly deriving the recurrence relation to be O(n·log(n)):

T(m) = 2T(m/2) + m
T(m/2) = 2T(m/4) + m/2
...
T(2) = 2T(1) + 2

Multiplying the i-th equation by 2^i, you get:

T(m) = 2T(m/2) + m
2·T(m/2) = (2T(m/4) + m/2)·2
...
2^(log(m)-1)·T(2) = (2T(1) + m/2^(log(m)-1))·2^(log(m)-1)

Summing both sides up and canceling the terms that appear on both sides, you get T(m) = m + m + m + ... + m, for a total of log(m) times, since there are log(m) equations (the remaining 2^(log m)·T(1) = m·T(1) term is only O(m)). Therefore, you end up with T(m) = m·log(m).

2 pts if you show that you understand the concept of "adding" up these parts: O(1) + O(n^2) + O(1) + O(n·log(n)).

3 pts if you get = O(n^2).

Extra Credit 1: 15 points

Consider a function T(n) that maps nonnegative real numbers to positive real numbers and satisfies the recurrence relation

T(n) = T(√n) + 1 for all n ≥ 2, and T(2) = 1.

Find an elementary function f(n) such that T(n) is Θ(f(n)). (For our purposes, an "elementary" function is a function of one real variable built from a finite number of exponentials, logarithms, constants, and roots through composition and combinations using the four elementary operations: addition, multiplication, subtraction, and division.)

Answer

As we have already done with such recurrences, we will assume that n is of a special form, n = 2^(2^k). (Below we justify this and other such assumptions and discuss a bit when we can make them.) We use this particular form because for such n we have √n = 2^(2^(k-1)). This allows us to consider another function, S(k) = T(2^(2^k)), and use the recurrence relation for T to state an easier-to-solve recurrence relation for S:

S(k) = S(k-1) + 1 for all k ≥ 1, and S(0) = 1

(we are only using the recurrence relation for T for n ≥ 4). This is easy to solve, with the same technique we have been using several times.
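The effect of the substitution can be illustrated numerically before solving the recurrence: for n = 2^(2^k), repeatedly taking square roots reaches 2 in exactly k steps, and each step contributes one "+1". A minimal sketch (the function name `t_steps` is ours, chosen for illustration):

```python
def t_steps(k):
    # n = 2^(2^k); track the exponent e with n = 2^e,
    # so taking sqrt(n) halves e exactly (no floating point needed)
    e = 2 ** k
    count = 0
    while e > 1:        # stop at n = 2, where T(2) = 1
        e //= 2         # sqrt(2^e) = 2^(e/2)
        count += 1
    return count + 1    # one unit per sqrt step, plus the base case T(2) = 1

for k in range(8):
    # log2(log2(n)) = k when n = 2^(2^k), so T(n) should equal k + 1
    assert t_steps(k) == k + 1
```

Tracking the exponent rather than n itself keeps the arithmetic exact even for values of n far too large to represent as floats.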
Add up the sides of the following equalities:

S(k) = S(k-1) + 1
S(k-1) = S(k-2) + 1
...
S(1) = S(0) + 1

After crossing out the terms that appear on both sides we get S(k) = S(0) + k, therefore S(k) = k + 1. So now T(2^(2^k)) = S(k) = k + 1; therefore, for n a double power of 2 we have T(n) = log(log(n)) + 1. From this we can conclude that T(n) is Θ(log log n).

Why can we conclude this? Well, for any real number n ≥ 2, let k = ⌊log log n⌋; then 2^(2^k) ≤ n < 2^(2^(k+1)). If T is a monotonically increasing (not necessarily strictly so) function, then S(k) = T(2^(2^k)) ≤ T(n) ≤ T(2^(2^(k+1))) = S(k+1), thus k + 1 ≤ T(n) ≤ k + 2, therefore ⌊log log n⌋ + 1 ≤ T(n) ≤ ⌊log log n⌋ + 2, hence T(n) is Θ(log log n).

If T is not monotonically increasing, then some other reasonable conditions can be put on T to get this behavior, like a bound on how much the function varies between certain bounds. But, warning: there are pathological functions that, for example, oscillate wildly on small intervals, for which this kind of reasoning is unjustified. Such functions do not arise naturally in algorithmic complexity.
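Both closed forms above can also be sanity-checked by evaluating the first recurrence directly. The sketch below assumes the base case T(1) = 0 (the document leaves T(1) unspecified; any constant base case only adds an O(m) term, which does not change the m·log(m) conclusion):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(m):
    # T(m) = 2*T(m/2) + m, with the assumed base case T(1) = 0
    if m == 1:
        return 0
    return 2 * T(m // 2) + m

# For m = 2^k the closed form is exact: T(m) = m * log2(m) = m * k
for k in range(1, 16):
    m = 2 ** k
    assert T(m) == m * k
```

Checking only powers of two mirrors the derivation, where m is implicitly assumed to halve evenly at every level until T(2) is reached.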
 Spring '09
 TANNEN
