
# L8_print - CSE 12: Basic data structures and...


CSE 12: Basic data structures and object-oriented design
Jacob Whitehill (jake@mplab.ucsd.edu)
Lecture 8, Thursday, 11 Aug 2011

## More on performance analysis

## Asymptotic performance analysis

Asymptotic performance analysis is a coarse but useful means of describing and comparing the performance of algorithms as a function of the input size n, when n gets large. Asymptotic analysis applies to both time cost and space cost.

Asymptotic analysis hides details of timing (that we don't care about) due to:

- Speed of the computer.
- Slight differences in implementation.
- Programming language.

## O, Ω, and θ

In order to justify describing the time cost T(n) = 3n + 4 as just "linear" (n), we first need some mathematical machinery:

- We define a lower bound on T with Ω.
- We define an upper bound on T with O.
- We define a tight bound (bounded above and below) on T with θ.

θ is important because it is more specific than O. (For example, technically, 3n + 4 = O(2^n).)

## Abuse of notation

When we say that 3n + 5 is "linear in n", what we really mean (mathematically) is that 3n + 5 is θ(n).

Note: In computer science, we often say O where we really mean θ. This is a slight abuse of notation. We will use O in this course to mean θ.

## Complexity classes

Asymptotic analysis assigns algorithms to different "complexity classes":

- O(1): constant. Performance of the algorithm does not depend on the input size.
- O(n): linear. Doubling n will double the time cost.
- O(log n): logarithmic.
- O(n^2): quadratic.
- O(2^n): exponential.

Algorithms that differ in complexity class can have vastly different run-time performance (for large n).

## Analysis of data structures

Let's put these ideas into practice and analyze the performance of algorithms related to ArrayList: add(o), get(index), find(o), and remove(index).
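The three bounds can be made precise. A standard formulation (added here for reference; not verbatim from the slides) is:

$$
f(n) = O(g(n)) \iff \exists\, c > 0,\ n_0 : 0 \le f(n) \le c\, g(n) \ \text{for all } n \ge n_0
$$

$$
f(n) = \Omega(g(n)) \iff \exists\, c > 0,\ n_0 : f(n) \ge c\, g(n) \ \text{for all } n \ge n_0
$$

$$
f(n) = \theta(g(n)) \iff f(n) = O(g(n)) \ \text{and} \ f(n) = \Omega(g(n))
$$

For example, T(n) = 3n + 4 is θ(n): we have 3n ≤ 3n + 4 ≤ 7n for all n ≥ 1, so c = 3 works for the lower bound and c = 7 for the upper bound.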
As a first step, we must decide what the "input size" means. What is the "input" to these algorithms?

Each of the methods (algorithms) above operates on the _underlyingStorage and either o or index. o and index are always length 1: their size cannot grow. However, the number of data in _underlyingStorage (stored in _numElements) will grow as the user adds elements to the ArrayList.

Hence, we measure asymptotic time cost as a function of n, the number of elements stored (_numElements).
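To make the analysis concrete, here is a minimal sketch of such a class. Only the names _underlyingStorage and _numElements come from the lecture; everything else (the fixed initial capacity, the method bodies) is illustrative, with each method's cost noted as a function of n = _numElements:

```java
// Illustrative sketch: costs are stated as a function of n = _numElements.
class ArrayList<T> {
    private Object[] _underlyingStorage = new Object[16];
    private int _numElements = 0;

    // O(1): one array write and one increment, independent of n.
    // (Assumes _underlyingStorage has room, as the lecture does below.)
    void addToBack(T o) {
        _underlyingStorage[_numElements] = o;
        _numElements++;
    }

    // O(1): array indexing is constant-time.
    @SuppressWarnings("unchecked")
    T get(int index) {
        return (T) _underlyingStorage[index];
    }

    // O(n): in the worst case we scan all n stored elements.
    int find(T o) {
        for (int i = 0; i < _numElements; i++) {
            if (_underlyingStorage[i].equals(o)) {
                return i;
            }
        }
        return -1; // not found
    }

    int size() {
        return _numElements;
    }
}
```

Note that the costs differ per method even though all operate on the same array: get touches one slot no matter how large n is, while find may touch every slot.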

## Adding to back of list

```java
class ArrayList<T> {
    // ... fields and other methods omitted ...
    void addToBack(T o) {
        // Assume _underlyingStorage is big enough
        _underlyingStorage[_numElements] = o;
        _numElements++;
    }
}
```

What is the time complexity of this method?
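The method performs one array write and one increment regardless of n, so under the stated assumption it is O(1). The preview cuts off before the slides address what happens when _underlyingStorage fills up; one common approach (an illustrative sketch, not necessarily what the lecture does next) is to double the array when full, which keeps addToBack amortized O(1), since each element is copied only O(1) times on average over a long sequence of adds:

```java
// Illustrative sketch of growth-by-doubling; the class name and
// initial capacity are assumptions, not from the lecture.
class GrowableList<T> {
    private Object[] _underlyingStorage = new Object[4];
    private int _numElements = 0;

    void addToBack(T o) {
        if (_numElements == _underlyingStorage.length) {
            // Double the capacity. A single resize costs O(n), but it
            // happens rarely enough that the amortized cost per add is O(1).
            Object[] bigger = new Object[2 * _underlyingStorage.length];
            System.arraycopy(_underlyingStorage, 0, bigger, 0, _numElements);
            _underlyingStorage = bigger;
        }
        _underlyingStorage[_numElements] = o;
        _numElements++;
    }

    @SuppressWarnings("unchecked")
    T get(int index) {
        return (T) _underlyingStorage[index];
    }

    int size() {
        return _numElements;
    }
}
```

Growing by a constant amount instead (say, +4 slots each time) would make a sequence of n adds cost O(n^2) overall, which is why doubling is the standard choice.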