Algorithms (Computer Science Notes)
Introduction
• We will figure out only the big-O running time for each algorithm, i.e. ignoring multiplicative constants and behavior on small inputs. This will allow us to examine the overall design of the algorithms without excessive complexity.
Basic Data Structures
• In a linked list, each object points to the next object in the list. An algorithm has direct access only to the elements at the ends of the list. Objects in the middle of the list can only be accessed by walking element by element from one end, which can take O(n) time. However, the length of the list is flexible, and objects can be added to, or removed from, the ends of the list in constant time. Once you are at a position in the middle of a list, objects can be added or deleted at that position in constant time.
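The costs described above can be sketched with a minimal singly linked list. The `Node`/`LinkedList` names and methods here are hypothetical, not from the notes; the point is that work at the head needs no walking, while reaching a middle element does.

```python
# Minimal singly linked list sketch: O(1) at the head, O(n) to walk inward.

class Node:
    def __init__(self, value, next=None):
        self.value = value
        self.next = next

class LinkedList:
    def __init__(self):
        self.head = None

    def push_front(self, value):
        # O(1): just rewire the head pointer, no walking.
        self.head = Node(value, self.head)

    def get(self, i):
        # O(n): must walk i links from the head to reach position i.
        node = self.head
        for _ in range(i):
            node = node.next
        return node.value

lst = LinkedList()
for v in (19, 7, 4, 3, 7, 1):   # push in reverse so the list reads 1,7,3,4,7,19
    lst.push_front(v)
print(lst.get(0))  # 1  (the head, constant time)
print(lst.get(5))  # 19 (reached only after walking 5 links)
```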
• A linked list starts with its head and ends with its tail. For example, suppose our list is L = (1, 7, 3, 4, 7, 19). Then head(L) is 1 and tail(L) is 19. The function pop removes and returns the value at the head of a list, i.e. pop(L) will return 1 and leave the list L containing (7, 3, 4, 7, 19).
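As a quick sketch of these operations, the helpers below mirror the notes' head, tail, and pop, with a Python list standing in for the linked list (so pop here is not actually O(1), as the comment flags):

```python
# Hypothetical helpers mirroring the notes' head/tail/pop on L = (1, 7, 3, 4, 7, 19).

def head(L):
    return L[0]          # first element

def tail(L):
    return L[-1]         # last element

def pop(L):
    # Remove and return the head. On a Python list this costs O(n);
    # on a true linked list it would be O(1).
    return L.pop(0)

L = [1, 7, 3, 4, 7, 19]
print(head(L))  # 1
print(tail(L))  # 19
print(pop(L))   # 1
print(L)        # [7, 3, 4, 7, 19]
```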
• For some algorithms, the big-O performance does not depend on whether arrays or linked lists are used. This happens when the number of objects is fixed and the objects are accessed in sequential order. However, remember that a big-O analysis ignores multiplicative constants. All other things being equal, array-based implementations tend to have smaller constants and therefore run faster.
Nested Loops
• Algorithms based on nested loops are the easiest to analyze.
• To analyze this code in big-O terms, first notice that the startup code in lines 1–4 and the ending code in line 12 take the same amount of time regardless of the input size n. So we'll say that they take "constant time" or O(1) time. The block of code inside both loops (lines 7–11) takes constant time to execute once. So the big-O running time of this algorithm is entirely determined by how many times the loops run. The outer loop runs n times. The inner loop runs n times during each iteration of the outer loop. So the block of code inside both loops executes O(n^2) times. I've used the curly symbol ≺ because this ordering isn't the standard algebraic ≤; the ordering only holds once n is large enough.
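The numbered code the notes analyze isn't reproduced in this preview, so the sketch below is a representative pair of nested loops with the same shape, counting how many times the inner block executes:

```python
# Representative doubly nested loop: the O(1) block inside both loops
# runs n times per outer iteration, for n * n executions in total.

def count_inner_executions(n):
    count = 0
    for i in range(n):        # outer loop: n iterations
        for j in range(n):    # inner loop: n iterations each time
            count += 1        # stands in for the constant-time block
    return count

print(count_inner_executions(10))  # 100, i.e. 10^2
```

Since the startup and ending code contribute only O(1), the total running time is dominated by these n^2 executions, giving O(n^2).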
Merging Two Lists
• When code contains a while loop, rather than a for loop, it can be less obvious how many times the loop will run. For example, suppose we have two sorted lists, a1, . . . , ap and b1, . . . , bq. We can merge them very efficiently into a combined sorted list. To do this, we make a new third empty list to contain our merged output. Then we examine the first elements of the two input lists and move the smaller value onto our output list.
• We keep looking at the first elements of both lists until one list is empty. We then copy over the rest of the non-empty list.
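The procedure described above can be sketched as follows, with Python lists standing in for the linked lists:

```python
# Merge two sorted lists: repeatedly move the smaller front element to the
# output, then copy over whatever remains of the non-empty list.

def merge(a, b):
    out = []
    i = j = 0
    while i < len(a) and j < len(b):   # loop until one list is exhausted
        if a[i] <= b[j]:
            out.append(a[i])
            i += 1
        else:
            out.append(b[j])
            j += 1
    out.extend(a[i:])                  # at most one of these two
    out.extend(b[j:])                  # extends is non-empty
    return out

print(merge([1, 3, 7], [2, 4, 19]))  # [1, 2, 3, 4, 7, 19]
```

Each iteration of the while loop moves exactly one element to the output, so the loop runs at most p + q = n times, which is why the length of the output is a good measure of the input size for this algorithm.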
• For merge, a good measure of the size of the input is the length of the output list n.
Spring '08, Erickson