CS161  Big O
David Kauchak

We need a way to talk about the computational cost of an algorithm that focuses on the essential parts, ignores details that are not relevant, and is somewhat agnostic to the underlying hardware. How would you answer the question: what is the running time of algorithm x? We saw some of this last time in our examination of InsertionSort and MergeSort.

• Asymptotic notation:
  – Precisely calculating the actual steps is tedious and not generally useful
  – Different operations take different amounts of time; even from run to run, things such as caching will complicate matters
  – Want to identify categories of algorithmic runtimes
  – Compare different algorithms

  f1(n) takes n^2 steps
  f2(n) takes 3n + 100 steps
  f3(n) takes 2n + 1 steps

  Which algorithm is better? Is the difference between f2 and f3 important/significant?

• Runtime examples

                1        n        n log n  n^2      n^3       2^n          n!
  n = 10        < 1 sec  < 1 sec  < 1 sec  < 1 sec  < 1 sec   < 1 sec      4 sec
  n = 30        < 1 sec  < 1 sec  < 1 sec  < 1 sec  < 1 sec   18 min       10^25 years
  n = 100       < 1 sec  < 1 sec  < 1 sec  < 1 sec  1 sec     10^17 years  very long
  n = 1000      < 1 sec  < 1 sec  < 1 sec  1 sec    18 min    very long    very long
  n = 10,000    < 1 sec  < 1 sec  < 1 sec  2 min    12 days   very long    very long
  n = 100,000   < 1 sec  < 1 sec  2 sec    3 hours  32 years  very long    very long
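The comparison between f1, f2, and f3 can be sketched numerically. This is a hypothetical illustration (the function names mirror the step counts above; they return step counts, they are not actual algorithms): it shows that f2 and f3 differ only by a constant factor, since their ratio approaches 3/2 as n grows, while the quadratic f1 quickly dominates both.

```python
# Step-count functions from the notes, treated as plain Python
# functions that return the number of steps for input size n:
def f1(n):
    return n * n          # n^2 steps

def f2(n):
    return 3 * n + 100    # 3n + 100 steps

def f3(n):
    return 2 * n + 1      # 2n + 1 steps

if __name__ == "__main__":
    # As n grows, f2(n)/f3(n) approaches 3/2 -- a constant factor,
    # so both are linear-time and belong in the same runtime category.
    # f1 overtakes f2 at n = 12: f1(11) = 121 < f2(11) = 133,
    # but f1(12) = 144 > f2(12) = 136, and the gap grows from there.
    for n in [10, 100, 1000, 10000]:
        print(f"n={n:>6}: f1={f1(n):>9} f2={f2(n):>6} f3={f3(n):>6} "
              f"f2/f3={f2(n) / f3(n):.3f}")
```

This is the intuition behind ignoring constant factors: the difference between f2 and f3 never exceeds a fixed multiplicative constant, whereas the difference between either of them and f1 grows without bound.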
Summer '09
Topics: Algorithms, Big O notation. Reference: Thomas H. Cormen and Charles E. Leiserson, Introduction to Algorithms.
