# CS300-03_Algorithm_Analysis - 3. Asymptotic Growth Rate

## 3. Asymptotic Growth Rate

How good is our measure of work for comparing algorithms? How precisely can we compare two algorithms using it?

Measure of work:
- number of passes of a loop
- number of basic operations

Time complexity = c · (measure of work)

For solving a problem P, suppose that two algorithms A1 and A2 need 10^6 n and 5n basic operations, respectively:

| Algorithm | Basic operations | Time complexity |
|-----------|------------------|-----------------|
| A1        | 10^6 n           | O(n)            |
| A2        | 5n               | O(n)            |

Which one is better? What are their time complexities? Both are O(n). Does this mean that A1 is as good as A2, and vice versa? Well, not quite: both the number of basic operations (loop passes) and the time complexity are imprecise!
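The 200,000-fold gap hidden inside "both are O(n)" can be made concrete; a minimal sketch, where `ops_a1` and `ops_a2` are hypothetical names for the slide's two operation counts:

```python
# Operation counts from the slide: A1 needs 10^6 * n, A2 needs 5 * n.
# Both are O(n), yet A2 performs 200,000 times fewer basic operations.

def ops_a1(n):
    # hypothetical count for algorithm A1
    return 10**6 * n

def ops_a2(n):
    # hypothetical count for algorithm A2
    return 5 * n

for n in (10, 10**3, 10**6):
    print(n, ops_a1(n), ops_a2(n), ops_a1(n) // ops_a2(n))
```

The ratio column is constant at 200,000 for every n, which is exactly what big-O notation hides.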
Now suppose that algorithms A1 and A2 need the following amounts of time:

| Algorithm | Time   | Time complexity |
|-----------|--------|-----------------|
| A1        | 10^6 n | O(n)            |
| A2        | n^2    | O(n^2)          |

Which one is better?
- A1 is better if n > 10^6.
- A2 is better if n < 10^6.

Then why use time complexity? Suppose that n → ∞. Then n^2 grows much faster than 10^6 n:

lim_{n→∞} n^2 / (10^6 n) = ∞

That is, under the assumption that n → ∞, A1 is better than A2. This is the asymptotic growth rate! Time complexity (the measure of work) compares and classifies algorithms by their asymptotic growth rate.

[Figure: plot of T(n) for 10^6 n and n^2, crossing at n = 10^6.]
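The crossover at n = 10^6 can be checked directly; a minimal sketch, where `t1` and `t2` stand for the two running times in the table above:

```python
def t1(n):
    # running time of A1: 10^6 * n
    return 10**6 * n

def t2(n):
    # running time of A2: n^2
    return n * n

# A2 is better below the crossover, they tie at n = 10^6,
# and A1 is better everywhere above it.
assert t2(10**6 - 1) < t1(10**6 - 1)
assert t1(10**6) == t2(10**6)
assert t1(10**6 + 1) < t2(10**6 + 1)
```

Past the crossover the gap only widens: at n = 10^7, A2 already takes ten times longer than A1.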

Notation:
- N = {0, 1, 2, …}
- N+ = {1, 2, 3, …}
- R = the set of real numbers
- R+ = the set of positive real numbers
- R* = R+ ∪ {0}

For f : N → R* and g : N → R*, we say that g is:
- Ω(f): g grows at least as fast as f.
- θ(f): g grows at the same rate as f.
- O(f): g grows no faster than f.
Definition: Let f : N → R*. O(f) is the set of functions g : N → R* such that for some c ∈ R+ and some n0 ∈ N, g(n) ≤ c·f(n) for all n ≥ n0. O(f) is usually called "big oh of f", "oh of f", or "order of f".

Note: in other books, g(n) = O(f(n)) if and only if there exist two positive constants c and n0 such that |g(n)| ≤ c·|f(n)| for all n ≥ n0. Under the assumption that f : N → R* and g : N → R*, the two definitions differ only slightly.

How do we check that n^2, 10^5 n^2 − n, n^2 + 10^10, and 10^3 n^2 + n − 1 are all in O(n^2)? Is 10^10 n ∈ O(n^2)? A limit test:

lim_{n→∞} g(n) / f(n) = c, with c ∈ R*  ⟹  g ∈ O(f)

Note: by L'Hôpital's rule, lim_{n→∞} g(n) / f(n) = lim_{n→∞} g'(n) / f'(n).
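The limit test can be explored numerically; a sketch (the helper `ratio` is a hypothetical name, and sampling large n only suggests the limit, it does not prove it):

```python
def ratio(g, f, n):
    # g(n)/f(n); if this tends to a finite constant c in R*, then g is in O(f)
    return g(n) / f(n)

f = lambda n: n**2
g = lambda n: 10**3 * n**2 + n - 1   # one of the slide's examples

# The ratio approaches 10^3, a finite positive constant, so g is in O(n^2).
print(ratio(g, f, 10**6))

h = lambda n: 10**10 * n
# ratio(h, f, n) = 10^10 / n tends to 0, and 0 is in R* = R+ ∪ {0},
# so 10^10 n is also in O(n^2) under this slide's definition.
print(ratio(h, f, 10**12))
```

Note the answer to the slide's question: yes, 10^10 n ∈ O(n^2), because the ratio tends to 0, which lies in R*.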

Definition: Let f : N → R*. Ω(f) is the set of functions g : N → R* such that for some c ∈ R+ and some n0 ∈ N, g(n) ≥ c·f(n) for all n ≥ n0. Ω(f) is usually called "big omega of f" or "omega of f".

Note: in other books, g(n) = Ω(f(n)) if and only if there exist two positive constants c and n0 such that |g(n)| ≥ c·|f(n)| for all n ≥ n0.

How do we check that 10^5 n, n^2, and n^2 + 10n + 10^6 are all in Ω(n)? Are they also Ω(log n)?
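One way to sanity-check an Ω claim is to exhibit a witness pair (c, n0) and test the defining inequality over a range of n; a sketch with a hypothetical helper (a sampled check, not a proof):

```python
import math

def holds_omega(g, f, c, n0, n_max=10**4):
    # Sampled check of g(n) >= c * f(n) for n0 <= n <= n_max.
    # Passing only suggests g is in Omega(f); it is not a proof.
    return all(g(n) >= c * f(n) for n in range(n0, n_max + 1))

g = lambda n: n**2 + 10*n + 10**6   # one of the slide's examples

# Witness c = 1, n0 = 1 supports g in Omega(n).
assert holds_omega(g, lambda n: n, c=1, n0=1)

# Since log n grows slower than n, the same witness supports Omega(log n).
assert holds_omega(g, lambda n: math.log(n), c=1, n0=1)
```

Because Ω(n) ⊆ Ω(log n) (anything growing at least as fast as n also grows at least as fast as log n), a yes to the first question implies a yes to the second.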