CS 61B: Lecture 20
Monday, March 9, 2009
Today’s reading:
ASYMPTOTIC ANALYSIS (bounds on running time or memory)
===================
Suppose an algorithm for processing a retail store’s inventory takes:
 10,000 milliseconds to read the initial inventory from disk, and then
 10 milliseconds to process each transaction (items acquired or sold).
Processing n transactions takes (10,000 + 10 n) ms.
Even though 10,000 >> 10,
we sense that the "10 n" term will be more important if the number of
transactions is very large.
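The cost model above can be tabulated directly (the 10,000 ms and 10 ms figures come from the example; the class and method names here are my own illustration):

```java
// Cost model from the inventory example: T(n) = 10,000 + 10 n milliseconds.
public class InventoryCost {
    static long costMs(long n) {
        return 10_000 + 10 * n;   // startup cost + per-transaction cost
    }

    public static void main(String[] args) {
        for (long n : new long[] {100, 10_000, 1_000_000}) {
            // As n grows, the "10 n" term dwarfs the 10,000 ms startup cost.
            System.out.println("n = " + n + ":  T(n) = " + costMs(n)
                + " ms, of which 10 n accounts for " + (10 * n) + " ms");
        }
    }
}
```

At n = 100 the startup cost dominates (10,000 of 11,000 ms), but at n = 1,000,000 it is a rounding error (10,000 of 10,010,000 ms).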
We also know that these coefficients will change if we buy a faster computer or
disk drive, or use a different language or compiler.
We want a way to express
the speed of an algorithm independently of a specific implementation on a
specific machine; specifically, we want to ignore constant factors (which get
smaller and smaller as technology improves).
Big-Oh Notation (upper bounds on a function's growth)
-----------------------------------------------------
We use Big-Oh notation to say how slowly code might run as its input grows.
Let n be the size of a program’s _input_ (in bits or data words or whatever).
Let T(n) be a function.
For now, T(n) is precisely equal to the algorithm’s
running time, given an input of size n (usually a complicated expression).
Let f(n) be another function; preferably a simple function like f(n) = n.
We say that T(n) is in O( f(n) )
IF AND ONLY IF
T(n) <= c f(n)
WHENEVER n IS BIG, FOR SOME LARGE CONSTANT c.
*  HOW BIG IS "BIG"?  Big enough to make T(n) fit under c f(n).
*  HOW LARGE IS c?    Large enough to make T(n) fit under c f(n).
EXAMPLE:  Inventory
-------------------
Let’s consider the function T(n) = 10,000 + 10 n, from our example above.
Let’s try out f(n) = n, because it’s simple.
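One workable choice of constants (my own numbers, not given in the excerpt) is c = 20, with "big" meaning n >= 1,000: then 10,000 + 10 n <= 10 n + 10 n = 20 n. A quick check of the inequality:

```java
// Check that T(n) = 10,000 + 10 n fits under c f(n) = 20 n once n >= 1,000.
// The choice c = 20 and threshold n >= 1,000 are one workable pair, not the
// only one.
public class BigOhCheck {
    static long T(long n)     { return 10_000 + 10 * n; }
    static long bound(long n) { return 20 * n; }         // c = 20, f(n) = n

    public static void main(String[] args) {
        for (long n = 1_000; n <= 1_000_000_000; n *= 10) {
            if (T(n) > bound(n)) {
                throw new AssertionError("bound fails at n = " + n);
            }
        }
        // At the threshold the two sides touch exactly: T(1000) = 20 * 1000.
        System.out.println("T(1000) = " + T(1_000)
            + ", 20 * 1000 = " + bound(1_000));
    }
}
```

Below the threshold the inequality fails (e.g. T(100) = 11,000 > 2,000), which is fine: the definition only requires it to hold whenever n is big.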