
# Lecture 14: Big-Oh


Doina Precup, with many thanks to Prakash Panagaden and Mathieu Blanchette. February 11, 2008.

So far we have talked about $O(\cdot)$ informally, as a way of capturing the worst-case computation time of an algorithm. We have seen that this notation takes into account only the "fastest-growing" component of the actual running time, and that constants do not matter. Today we give a formal definition of big-oh and some results that help us compute it. Finally, we will see that there is a whole "hierarchy" of complexity classes.

## 1 Intuition and definition

We want to use the $O(\cdot)$ notation to say whether a function $f$ grows slower than a function $g$. If so, we want to say that $f(n)$ is "of order" $g(n)$, denoted $O(g(n))$. So far we have used this mainly to talk about computation time, but we will also use it for memory usage in the future. In general, we can define this concept for arbitrary functions, without having a computational application in mind.

To get some intuition, consider the example in Figure 1. Function $f$ grows slower than $g$ (the top line), but this does not mean that $f$ is always smaller. However, after some point $n_0$, it really is always smaller. In general, we can allow $g$ to be multiplied by a positive constant $c$: this does not affect the order of growth. For example, if $f$ is a quadratic function and $g$ is a linear function, then intuitively, no matter how much we multiply $g$, $f$ will overtake it eventually. This motivates the following definition.

**Definition (Big-Oh):** Function $f(n)$ is $O(g(n))$ if and only if there exist a constant $c > 0$ and a constant natural number $n_0$ such that

$$\forall n \geq n_0, \quad f(n) \leq c\, g(n).$$
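The quadratic-versus-linear example can be checked numerically: no matter how large a constant $c$ we multiply $g(n) = n$ by, $f(n) = n^2$ eventually exceeds $c \cdot g(n)$. The sketch below is illustrative only; the function names `f`, `g`, and `overtake_point` are not from the lecture.

```python
def f(n):
    return n * n          # quadratic function f(n) = n^2

def g(n):
    return n              # linear function g(n) = n

def overtake_point(c):
    """Smallest n at which f(n) > c * g(n)."""
    n = 1
    while f(n) <= c * g(n):
        n += 1
    return n

for c in [1, 10, 1000]:
    n = overtake_point(c)
    print(f"c = {c}: f(n) > c * g(n) for all n >= {n}")
    # For these f and g, the overtake point is n = c + 1,
    # since n^2 > c * n exactly when n > c.
```

However large we choose $c$, the quadratic eventually wins, which is why multiplying $g$ by a constant cannot change the order of growth.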


*Figure 1: Function $f(n)$ grows slower than $g(n)$.*

Note that $c$ and $n_0$ must be constants (i.e., they do not depend on $n$). We can indeed think of $O(g(n))$ as the set of all functions $f(n)$ that are $O(g(n))$:

$$O(g(n)) = \{ f(n) \mid \exists c > 0, \, n_0 > 0 \text{ s.t. } \forall n \geq n_0, \, f(n) \leq c\, g(n) \}$$

Hence, we can use the notation $f(n) \in O(g(n))$ to denote the fact that $f(n)$ is $O(g(n))$. As we will see below, these sets are infinite.
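The definition suggests a simple (if limited) experiment: given candidate constants $c$ and $n_0$, we can test whether $f(n) \leq c \cdot g(n)$ holds over a finite range of $n$. A finite scan can never *prove* that $f \in O(g)$, but it can refute a bad witness pair. The helper name `check_witness` and the sample functions below are illustrative assumptions, not from the lecture.

```python
def check_witness(f, g, c, n0, limit=10_000):
    """Check f(n) <= c * g(n) for all n0 <= n <= limit.

    A True result is only evidence, not a proof, since the
    definition quantifies over all n >= n0; a False result
    definitively rules out this (c, n0) pair up to `limit`.
    """
    return all(f(n) <= c * g(n) for n in range(n0, limit + 1))

f = lambda n: 3 * n * n + 5 * n + 7      # f(n) = 3n^2 + 5n + 7
g = lambda n: n * n                      # g(n) = n^2

print(check_witness(f, g, c=4, n0=10))   # True: 4n^2 >= 3n^2 + 5n + 7 once n >= 7
print(check_witness(f, g, c=3, n0=10))   # False: 3n^2 < 3n^2 + 5n + 7 for every n
```

Here $f(n) = 3n^2 + 5n + 7$ is $O(n^2)$, witnessed for instance by $c = 4$ and $n_0 = 7$, while $c = 3$ can never work because the lower-order terms $5n + 7$ are always positive.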