
# CSE 548 – Analysis of Algorithms, Fall 2010 (Michael A. Bender)


Michael A. Bender
CSE 548 – Analysis of Algorithms, Fall 2010
Assignment #1
Tuesday, October 5, 2010

## Problem 1

For the following C codes, find the time complexity by first finding the recurrence relation and then simplifying the recurrence.

### (1)

```c
int blah(int n) {
    int sum = 0;
    int i, j;
    if (n == 0)
        return 1;
    for (i = 0; i <= n - 1; i++)
        for (j = 0; j <= log(n); j++) {
            /* ... loop body ... */
        }
    sum = blah(n / 2);
    sum += blah(n / 2);
    return sum;
}
```

**Ans:** Let T(n) be the time complexity of blah(n). The outer for loop iterates n times and the inner one log(n) times, so the cost of the nested loops is n log(n). The two calls to blah(n/2) cost 2T(n/2). The recurrence relation is therefore

    T(n) = 2T(n/2) + n log n

Using the recursion-tree method to solve this recurrence: the cost at level 1 is n log n; at level 2 it is 2 · (n/2) · log(n/2) = n(log n − 1); at level 3 it is n(log n − 2); and so on, for log n levels. Summing the levels,

    T(n) = n log n + n(log n − 1) + ... + n(log n − log n)
         = n [log n + (log n − 1) + ... + 0]
         = n · (log n)(log n + 1) / 2

so T(n) = Θ(n log² n).

### (2)

```c
int blah(int n) {
    int sum = 0;
    int i, j;
    if (n == 0)
        return 7;
    for (i = 0; i <= n - 1; i++)
        sum += blah(i);
    return sum;
}
```

**Ans:** Let T(n) be the time complexity of blah(n). The loop overhead costs c₁n, and the remaining statements take constant time, so we ignore them. Each iteration of the loop makes a recursive call blah(i), so

    T(n) = c₁n + Σ_{i=0}^{n−1} T(i)

Subtracting the same expression written for T(n − 1) gives, for n ≥ 2,

    T(n) − T(n−1) = c₁ + T(n−1),   i.e.   T(n) = 2T(n−1) + c₁

which unrolls to T(n) = Θ(2ⁿ). Intuitively, blah(n) recurses on every smaller input, and each of those recurses on all inputs smaller still, so the total work roughly doubles each time n grows by one.

## Problem 2

Prove or find a counterexample for the following. Assume that f(n) and g(n) are greater than 1 and monotonically increasing functions.

### (1) f(n) = o(g(n)) implies log(f(n)) = o(log(g(n)))

**Ans:** False; here is a counterexample. Recall that f(n) = o(g(n)) means that for every constant c > 0 there is an n₀ such that f(n) < c · g(n) for all n ≥ n₀. Take f(n) = n and g(n) = n². Both are greater than 1 (for n ≥ 2) and monotonically increasing, and f(n) = o(g(n)) since n / n² → 0. However, log f(n) = log n and log g(n) = 2 log n, so

    log f(n) = (1/2) · log g(n)   for all n ≥ 2

For log f(n) = o(log g(n)) we would need log f(n) < c · log g(n) for every c > 0, which fails for any c ≤ 1/2. Hence the implication does not hold.

### (2) f(n) = O(g(n)) implies log(f(n)) = O(log(g(n)))

**Ans:**
True. f(n) = O(g(n)) means there exist constants c > 0 and n₀ such that f(n) ≤ c · g(n) for all n ≥ n₀. Since f(n) and g(n) are greater than 1 and monotonically increasing, we can take logs on both sides without changing the direction of the inequality:

    log f(n) ≤ log c + log g(n)   for all n ≥ n₀

Because g(n) > 1 and g is monotonically increasing, log g(n) ≥ log g(1) > 0 for all n, so the additive constant can be absorbed: if c ≥ 1 then log c ≤ (log c / log g(1)) · log g(n), and if c < 1 the term log c is negative and can simply be dropped. Therefore

    log f(n) ≤ (1 + log c / log g(1)) · log g(n) = c′ · log g(n)   for all n ≥ n₀

which is exactly log f(n) = O(log g(n)), so the implication holds.

(3)


## This note was uploaded on 10/18/2010 for the course CSE 548 taught by Professor K. Ko during the Spring '08 term at SUNY Stony Brook.
