CS 61B: Lecture 21
Friday, October 15, 2010
ASYMPTOTIC ANALYSIS (continued):
More Formalism
================================================

Omega(f(n)) is the set of all functions T(n) that satisfy:

    There exist positive constants d and N such that, for all n >= N,

        T(n) >= d f(n)
                ^^^^^^^^

Compare with the definition of Big-Oh:  T(n) <= c f(n).
                                                ^^^^^^^^
Omega is the reverse of Big-Oh.  If T(n) is in O(f(n)), then f(n) is in
Omega(T(n)).
    2n  is in Omega(n)                 BECAUSE  n is in O(2n).
    n^2 is in Omega(n)                 BECAUSE  n is in O(n^2).
    n^2 is in Omega(3 n^2 + n log n)   BECAUSE  3 n^2 + n log n is in O(n^2).
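The witnesses in the definition above can be checked numerically for small
cases. The sketch below is an illustrative sanity check (not a proof, since a
loop can only test finitely many n): for T(n) = 2n and f(n) = n, the
constants d = 1 and N = 1 are one valid choice of witnesses, because
T(n) >= 1 * f(n) for every n >= 1.

```java
// Sanity check (not a proof) that T(n) = 2n is in Omega(n).
// The witnesses d = 1, N = 1 are illustrative choices.
public class OmegaCheck {
    static long t(long n) { return 2 * n; }   // T(n) = 2n
    static long f(long n) { return n; }       // f(n) = n

    // Returns true if T(n) >= d f(n) holds for every n in [bigN, upTo].
    static boolean witnessHolds(long d, long bigN, long upTo) {
        for (long n = bigN; n <= upTo; n++) {
            if (t(n) < d * f(n)) return false;  // inequality violated
        }
        return true;
    }

    public static void main(String[] args) {
        System.out.println(witnessHolds(1, 1, 1_000_000));  // true
    }
}
```

Of course, a true proof must cover ALL n >= N; here the algebra (2n >= n for
n >= 1) is what actually establishes the bound.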
Big-Omega gives us a LOWER BOUND on a function, just as Big-Oh gives us an
UPPER BOUND.  Big-Oh says, "Your algorithm is at least this good."  Big-Omega
says, "Your algorithm is at least this bad."

Recall that Big-Oh notation can be misleading because, for instance, n is in
O(n^8).  If we know both a lower bound and an upper bound for a function, and
they're both the same bound asymptotically (i.e. they differ only by a
constant factor), we can use Big-Theta notation to precisely specify the
function's asymptotic behavior.

Theta(f(n)) is the set of all functions that are in both
O(f(n)) and Omega(f(n)).

But how can a function be sandwiched between f(n) and f(n)?  Easy:  we choose
different constants (c and d) for the upper bound and the lower bound.
For instance, here is a function T(n) in Theta(n):

    [ASCII plot: an oscillating curve T(n) that, for all sufficiently large
     n, stays below the upper line c f(n) = 10 n and above the lower line
     d f(n) = 2 n; the horizontal axis is n, with origin O.]
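The sandwich in the plot can be checked numerically as well. The oscillating
T(n) below is a made-up example (it alternates between 3n and 7n); the
constants c = 10 and d = 2 match the lines in the plot, and the check
confirms 2n <= T(n) <= 10n for every tested n >= 1, which is the Theta(n)
condition.

```java
// Sanity check (not a proof) that a made-up oscillating T(n) is in Theta(n):
// it stays between d f(n) = 2n and c f(n) = 10n for all n >= 1.
public class ThetaCheck {
    // Illustrative oscillating function: 3n for even n, 7n for odd n.
    static long t(long n) { return (n % 2 == 0) ? 3 * n : 7 * n; }

    // Returns true if d*n <= T(n) <= c*n holds for every n in [bigN, upTo].
    static boolean sandwiched(long d, long c, long bigN, long upTo) {
        for (long n = bigN; n <= upTo; n++) {
            if (t(n) < d * n || t(n) > c * n) return false;
        }
        return true;
    }

    public static void main(String[] args) {
        System.out.println(sandwiched(2, 10, 1, 1_000_000));  // true
    }
}
```

Note that no single constant works for both bounds here (T(n) dips to 3n and
rises to 7n), which is exactly why the definition allows c and d to differ.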