# comp17 - Computational Complexity: Algorithmic Cost and Complexity


## Algorithmic Cost and Complexity

Algorithms can be analyzed from two different points of view: time or space. For the most part we are more interested in time than in space.

- **Time:** each instruction takes time. How many instructions are executed by the algorithm? How fast does the algorithm run? What affects the run time of the algorithm?
- **Space:** data structures require space. What kinds of data structures are used? How does the choice of data structure affect the run time?

Measures of work: worst-case performance, best-case performance, and average performance.

- What situations will produce worst-case performance? What situations will produce best-case performance? What is average performance, and how is it determined?

## Algorithm Analysis

In some of the labs this term, and in class, we have looked at techniques for determining, essentially, the number of times a statement in a loop is executed. Remember that, barring some type of looping statement, the code in a program (algorithm) is executed sequentially. This means that without loops, each statement is executed exactly once, and the running time of the algorithm is very easy to establish. Loops cause iteration, and iteration increases the running time in proportion to how much iteration occurs. Therefore, for the statements inside a loop, we need to know how many times they are executed.

Consider the two code segments shown below. What is the number of addition operations performed in each?

Segment #1:

```
grandtotal = 0;
for (k = 0; k < n; ++k) {
    rows[k] = 0;
    for (j = 0; j < n; ++j) {
        rows[k] = rows[k] + matrix[k][j];
        grandtotal = grandtotal + matrix[k][j];
    }
}
```

For segment #1 the number of addition operations is 2n², while for segment #2 it is n² + n. Assume that we are working with a hypothetical computer that requires 1 microsecond (10⁻⁶ seconds) to perform an addition. If n = 1000, segment #1 would require 2 seconds to execute: 2 × 1000² additions × 10⁻⁶ sec/addition = 2 seconds. Segment #2, on the other hand, would require just over 1 second: (1000² + 1000) additions × 10⁻⁶ sec/addition = 1.001 seconds. If n is increased to 100,000, code segment #1 would require about 6 hours and code segment #2 about 3 hours. The table shown below gives the Big-Oh complexity

*This document was uploaded on 06/12/2011.*



