# 02analysis - CS180 Fall 2009: Basics of Running-Time Analysis

CS180, Fall 2009. Basics of Running-Time Analysis. Topics:

- Asymptotic Notation
- Polynomial Running Time
- Solving Recurrence Relations
- Heaps and Priority Queues

## 2. Computational Tractability

> As soon as an Analytic Engine exists, it will necessarily guide the future course of the science. Whenever any result is sought by its aid, the question will arise - By what course of calculation can these results be arrived at by the machine in the shortest time? (Charles Babbage, 1864)

(Slide shows a schematic of the Analytic Engine.)
## 3. Measuring the Running Time

Suppose we are given an algorithm. We would like to determine whether this algorithm is "fast". In general it is possible to do many things via exhaustive search, but that is not really a good approach. The obvious idea is to try our algorithm on examples and see how long it takes. There are several problems with this:

1. It is very sensitive to the time per operation (computer speed).
2. Maybe the algorithm only performs well on small examples.

We want a notion of speed that is independent of the computer we run on and of the examples we select.

## 4. Running Time by Counting Operations

An algorithm is written as a sequence of simple operations, so one way to measure running time is simply to count the number of operations performed. This is not quite as simple as counting the number of lines of code:

1. Algorithms frequently include loops. The operation count for a loop can be obtained by multiplying the number of operations inside the loop by the number of times through the loop.
2. Algorithms frequently contain "if...then" statements. These may take different amounts of time depending on which branch is taken, and bounding them can be tricky.
3. Some algorithms use recursion (call themselves). Determining the number of operations may then involve solving a recurrence relation.
4. Note that the number of operations will often depend on the size of the input.
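The loop-counting rule above can be made concrete by instrumenting a small algorithm. This is a hypothetical sketch (the function name and counter are ours, not from the slides): the loop body costs a constant number of operations and runs n times, so the total count is linear in n.

```python
def sum_of_squares(n):
    """Sum 1^2 + ... + n^2, counting simple operations as we go."""
    ops = 0
    total = 0
    ops += 1                  # one operation: the assignment total = 0
    for i in range(1, n + 1):
        total += i * i
        ops += 2              # one multiply and one add per iteration
    return total, ops

total, ops = sum_of_squares(10)
print(total, ops)  # 385 21: constant work per iteration, n iterations,
                   # so the count is roughly 2n + 1 for input size n
```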
## 5. Inherent Assumptions

Not all operations are created equal. In general we will make a few assumptions about basic operations:

1. Jumping to some specific line of the algorithm takes one operation.
2. Performing a math operation (add, divide, log) takes one operation.
3. Assigning a value to a specific array element takes one operation.

Keep in mind that if you are dealing with very large integers (or very high-precision real numbers, or very large arrays), these assumptions may not hold!
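As a quick illustration of why the unit-cost assumption can fail, Python's built-in integers are arbitrary-precision: once a value no longer fits in a machine word, a single `+` actually touches every word of the number, so its cost grows with the number of bits rather than being one operation.

```python
# A 100,001-bit integer: far larger than any machine word.
x = 2 ** 100000

# This single "+" is not one hardware operation; its cost is
# proportional to the bit length of x.
y = x + x

print(x.bit_length())  # 100001
print(y == 2 ** 100001)  # True: doubling shifts the value up one bit
```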

## 6. Which Running Times are "Fast"?

Suppose we have determined running times for two different algorithms. For example, suppose one requires 20n operations and the other requires n² operations, where n is the size of the input. Which is "faster"? Of course, we can answer this question by plotting the two functions.
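Instead of plotting, we can also compute where the two cost functions cross. A minimal sketch (function names are ours): 20n and n² meet at n = 20, so the quadratic algorithm is actually cheaper on small inputs and only loses beyond that point.

```python
def f(n):
    """Operation count of the first algorithm: 20n."""
    return 20 * n

def g(n):
    """Operation count of the second algorithm: n^2."""
    return n * n

# Smallest input size at which the quadratic algorithm becomes slower.
crossover = next(n for n in range(1, 1000) if g(n) > f(n))
print(crossover)  # 21: for n <= 20 we have n^2 <= 20n, so the
                  # quadratic algorithm wins on small inputs
```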