# Lec04_05_GrowthOrder - Asymptotic Growth Rate


## Asymptotic Running Time

The running time of an algorithm as the input size approaches infinity is called the asymptotic running time. We study different notations for asymptotic efficiency; in particular, we study tight bounds, upper bounds, and lower bounds.
## Outline

- Why do we need the different sets?
- Definitions of the sets O (big-oh), Ω (big-omega), Θ (theta), o (little-oh), and ω (little-omega)
- Classifying examples:
  - Using the original definition
  - Using limits

## The functions

Let f(n) and g(n) be asymptotically nonnegative functions whose domain is the set of natural numbers N = {0, 1, 2, ...}. A function g(n) is asymptotically nonnegative if g(n) ≥ 0 for all n ≥ n₀, where n₀ ∈ N.
## The "sets" and their use: big-oh

Big-oh gives an asymptotic upper bound on the growth of an algorithm's running time. When do we use big-oh?

1. In the theory of NP-completeness.
2. To provide information on the maximum number of operations that an algorithm performs. For example, insertion sort is O(n²) in the worst case; this means that in the worst case it performs at most cn² operations, for some constant c.
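The insertion-sort claim can be spot-checked empirically. The sketch below (my own instrumentation, not from the lecture) counts the key comparisons insertion sort performs and checks that the count stays below c·n² with c = 1 on random inputs:

```python
import random

# Sketch (not from the lecture): instrument insertion sort to count key
# comparisons, then check the count never exceeds c*n^2 with c = 1.
def insertion_sort_comparisons(a):
    """Return (sorted copy of a, number of key comparisons performed)."""
    a = list(a)
    comparisons = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0:
            comparisons += 1            # compare key against a[j]
            if a[j] > key:
                a[j + 1] = a[j]         # shift the larger element right
                j -= 1
            else:
                break
        a[j + 1] = key
    return a, comparisons

for n in [10, 100, 1000]:
    data = random.sample(range(10 * n), n)
    result, comps = insertion_sort_comparisons(data)
    assert result == sorted(data)
    assert comps <= n * n               # worst-case bound: c*n^2 with c = 1
```

In the worst case (reversed input) the inner loop runs i times for each i, giving n(n−1)/2 comparisons, which is indeed at most 1·n².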

## Definition of big-oh

O(f(n)) is the set of functions g(n) for which there exist positive constants c and N such that 0 ≤ g(n) ≤ c·f(n) for all n ≥ N. Here f(n) is called an asymptotic upper bound for g(n).
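The definition can be spot-checked numerically. The helper below is a hypothetical one of my own (not from the slides): it verifies the witness condition 0 ≤ g(n) ≤ c·f(n) over a finite range. Passing is only evidence, not a proof, since the definition quantifies over all n ≥ N:

```python
# check_witnesses is a hypothetical helper, not part of the lecture:
# it tests 0 <= g(n) <= c*f(n) for n = N .. limit only.
def check_witnesses(g, f, c, N, limit=10_000):
    return all(0 <= g(n) <= c * f(n) for n in range(N, limit + 1))

# 5n + 2 is in O(n), witnessed by c = 7 and N = 1:
print(check_witnesses(lambda n: 5 * n + 2, lambda n: n, c=7, N=1))    # True
# n^2 is not O(n): no finite c works; c = 100 already fails by n = 101.
print(check_witnesses(lambda n: n * n, lambda n: n, c=100, N=1))      # False
```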
[Figure: for n ≥ N, the curve c·f(n) lies above g(n), illustrating g(n) ∈ O(f(n)): "I grow at most as fast as f."]

## n² + 10n ∈ O(n²). Why?

[Figure: plot of n² + 10n against 2n² for n from 0 to 30; the 2n² curve is above from n = 10 on.]

Take c = 2 and N = 10: n² + 10n ≤ 2n² for all n ≥ 10.
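A quick numerical spot-check of this witness pair (my own check, not part of the slides): the inequality n² + 10n ≤ 2n² simplifies to 10n ≤ n², i.e. n ≥ 10, so it holds from N = 10 on and fails just below:

```python
# Spot-check the slide's witnesses c = 2, N = 10 over a finite range.
assert all(n * n + 10 * n <= 2 * n * n for n in range(10, 100_000))
# The bound fails below N: at n = 9, 81 + 90 = 171 > 162 = 2 * 81.
assert 9 * 9 + 10 * 9 > 2 * 9 * 9
```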
## Does 5n + 2 ∈ O(n)?

Proof: From the definition of big-oh, there must exist c > 0 and an integer N > 0 such that 0 ≤ 5n + 2 ≤ cn for all n ≥ N. Dividing both sides of the inequality by n > 0, we get 0 ≤ 5 + 2/n ≤ c. Since 0 < 2/n ≤ 2 and 2/n becomes smaller as n increases, there are many choices here for c and N.

If we choose N = 1, then 5 + 2/n ≤ 5 + 2/1 = 7, so any c ≥ 7 works; choose c = 7. If instead we choose c = 6, then 0 ≤ 5 + 2/n ≤ 6 requires 2/n ≤ 1, so any N ≥ 2 works; choose N = 2. In either case (we only need one!) we have c > 0 and N > 0 such that 0 ≤ 5n + 2 ≤ cn for all n ≥ N. So the definition is satisfied and 5n + 2 ∈ O(n).
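Both witness pairs can be spot-checked over a finite range (my own check; a finite scan is evidence, not a proof over all n ≥ N):

```python
# Witness pair 1 from the slide: c = 7, N = 1.
assert all(0 <= 5 * n + 2 <= 7 * n for n in range(1, 10_000))
# Witness pair 2 from the slide: c = 6, N = 2.
assert all(0 <= 5 * n + 2 <= 6 * n for n in range(2, 10_000))
# With c = 6, the bound genuinely needs N >= 2: at n = 1, 7 > 6.
assert not (5 * 1 + 2 <= 6 * 1)
```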
## Does n² ∈ O(n)?

No. We prove by contradiction that the definition cannot be satisfied. Assume that n² ∈ O(n). From the definition of big-oh, there must exist c > 0 and an integer N > 0 such that 0 ≤ n² ≤ cn for all n ≥ N. Dividing the inequality by n > 0, we get 0 ≤ n ≤ c for all n ≥ N. But n ≤ c cannot hold for any n > max{c, N}, contradicting our assumption. So there is no constant c > 0 such that n ≤ c holds for all n ≥ N, and therefore n² ∉ O(n).
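The contradiction step can be illustrated concretely (my own illustration, not from the slides): for any proposed constant c, the inequality n² ≤ cn fails as soon as n exceeds c, so no witness pair (c, N) can work:

```python
# For any candidate c, picking n = c + 1 > max(c, N) breaks n^2 <= c*n,
# since n^2 <= c*n is equivalent to n <= c for n > 0.
for c in [10, 1_000, 1_000_000]:
    n = c + 1
    assert n * n > c * n    # the big-oh bound is violated at this n
```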
