Recursive Algorithms II
Margaret M. Fleck
29 March 2010
This lecture wraps up our discussion of algorithm analysis (sections 4.4
and 7.1 of Rosen).
1 Announcements
The quiz should be graded soon. More information as soon as I have clear
information about how close to done it is.
Homework 8 is posted, due this Friday. Reminder that honors homework
2 is due next Monday (the 5th).
The second midterm will be a week from Wednesday (7 April), 7–9pm in
141 Wohlers, just like the first midterm. Study materials will be posted very
soon. If you need special arrangements (e.g. extra time), contact me and (if
necessary) book a room at DRES.
Our final exam will be 7–10pm on Friday May 7th. The conflict final will
be Monday the 10th, 1:30–4:30pm. As the end of term approaches, watch
for further instructions (e.g. because the main exam involves two rooms in
different buildings).
2 Recap
Last lecture, we started talking about mergesort, a sorting algorithm whose
big-O running time is O(n log n), i.e. better than the O(n^2) running time of
bubble sort and insertion sort. This is, in fact, the best big-O running time
you can have for a comparison-based sorting algorithm.
The idea behind mergesort is to divide the (big) input list into two half-size
lists. We keep dividing in half until we have a large collection of very
small lists. In our pseudocode these base-case lists will be length 1. Real
implementations often stop with lists of length 2 or even slightly longer,
because it is easy to write code to directly sort very short lists.
Once the tiny lists are all sorted (which is automatically true for lists of
length 1), then we start merging them back together. We first merge the
tiny lists in pairs to make lists twice as long. We keep merging pairs together
until we get one big (sorted) list.
[quick demo of the algorithm for merging two sorted lists]
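The merge step and the full algorithm built on it can be sketched in Python. This is my own sketch of the standard technique; the function names and the length-1 base case follow the description above, but the lecture's actual pseudocode may differ in detail.

```python
def merge(left, right):
    """Merge two already-sorted lists into one sorted list.

    Repeatedly take the smaller of the two front elements, so each
    input is read in storage order, front to back.
    """
    result = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    # One input is exhausted; the other's remaining elements are
    # already sorted and all larger, so append them as-is.
    result.extend(left[i:])
    result.extend(right[j:])
    return result


def mergesort(lst):
    """Sort lst by splitting in half, sorting each half recursively,
    and merging the two sorted halves."""
    if len(lst) <= 1:  # base case: lists of length 0 or 1 are sorted
        return lst
    mid = len(lst) // 2
    return merge(mergesort(lst[:mid]), mergesort(lst[mid:]))
```

Each level of the recursion does O(n) total merging work, and halving the list size gives O(log n) levels, which is where the O(n log n) bound comes from.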
For sorting small to medium lists of numbers, quicksort typically runs
fastest: although its worst-case running time is O(n^2), it has an average
running time of O(n log n) and very good constants. Mergesort's constants
are only so-so.
Mergesort is interesting because it has O(n log n) worst-case running time
and is easy to analyze. Also, because the key merge step accesses its three
lists in storage order, variations of this technique are used when sorting data
that is stored in places that are slow to access. In the Bad Old Days, "slow
storage" was usually magnetic tapes. These days, it's often places that are
somewhere else on the internet.
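As a small illustration of merging in storage order, Python's standard library exposes exactly this kind of streaming merge: `heapq.merge` consumes its inputs lazily, one element at a time, so the inputs could just as well be sorted runs on disk or on a remote machine. The in-memory lists below stand in for such slow-storage streams; that framing is my own, not the lecture's.

```python
import heapq

# Two sorted "runs" -- in practice these might be iterators reading
# sequentially from tape, disk, or the network; lists stand in here.
run_a = [1, 4, 7, 10]
run_b = [2, 3, 8, 9]

# heapq.merge reads each input strictly front to back and yields one
# sorted stream, holding only one pending element per input at a time.
merged = list(heapq.merge(run_a, run_b))
print(merged)  # [1, 2, 3, 4, 7, 8, 9, 10]
```

Because each input is read sequentially and the output is written sequentially, no random access to slow storage is ever needed, which is the property the paragraph above is pointing at.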