ECS 120 Lesson 22 – Time Complexity
Oliver Kreylos
Monday, May 21st, 2001
Until now, we investigated the computability of problems – the question
whether a certain problem can be solved by some algorithm. We defined a
problem to be decidable if and only if there exists a Turing Machine that
halts on every input word and that accepts exactly those words that are
instances of the problem. In doing so, we neglected the fact that, even if a
problem is decidable, it might take a Turing Machine a very long time to halt
on some input words. Some decidable problems require any Turing Machine
deciding them to take so many steps that solving them is impractical for all
but the simplest instances. From now on, we will focus our attention on
decidable problems and analyze the running time of Turing Machines deciding
them – that is, the number of computation steps a Turing Machine has to
perform to accept or reject an input word of some length. Our goal is to be
able to estimate how long a computation will take before actually performing
it. This means we need tools to estimate running time from the description
of an algorithm alone.
1 Running Time Analysis of Algorithms
As a running example, let us consider the language L = { 0^n 1^n | n ≥ 0 }.
We have seen that this language is context-free, so it must be decidable. A
Turing Machine deciding this language is given by the following algorithm:
Algorithm (deciding L = { 0^n 1^n | n ≥ 0 }):
On input w,
1. Mark the beginning of the tape.
2. Scan until the end of input and reject if any zero appears after any one,
or if any other characters but zeros and ones appear in the input word.
3. Return the tape head to the leftmost position.
4. Scan to the right until the first non-crossed-out character is encountered.
If it is a blank, accept. If it is a one, reject.
5. Cross out the current character (a zero).
6. Scan to the right until the first character that is neither crossed out
nor a zero is encountered. If it is a blank, reject.
7. Cross out the current character (a one).
8. Repeat from step 3.
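The marking procedure above can be sketched as a small Python simulation; this is a hypothetical, minimal rendering of the algorithm (names and the "x" cross-out marker are illustrative), not a literal Turing Machine:

```python
def decides_0n1n(w: str) -> bool:
    """Simulate the marking algorithm deciding L = { 0^n 1^n | n >= 0 }."""
    # Step 2: reject foreign characters, or any zero appearing after a one.
    if any(c not in "01" for c in w) or "10" in w:
        return False
    tape = list(w)  # "x" marks a crossed-out cell
    while True:
        # Steps 3-4: return left, scan to the first non-crossed-out cell.
        i = 0
        while i < len(tape) and tape[i] == "x":
            i += 1
        if i == len(tape):      # blank: every zero was matched with a one
            return True
        if tape[i] == "1":      # a one with no zero left to match
            return False
        tape[i] = "x"           # step 5: cross out the zero
        # Step 6: scan to the first cell neither crossed out nor a zero.
        j = i + 1
        while j < len(tape) and tape[j] in ("x", "0"):
            j += 1
        if j == len(tape):      # blank: a zero without a matching one
            return False
        tape[j] = "x"           # step 7: cross out the one
```

Each iteration of the outer loop corresponds to one pass of steps 3 through 8, matching one zero with one one.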
If given an algorithm like the one above, we are now interested in how long
a Turing Machine implementing this algorithm would run, or more precisely,
how many computation steps it would perform before halting. It is apparent
that the number of steps a Turing Machine performs depends on the specific
input word. Therefore, every deciding Turing Machine M defines a running
time function rt_M : Σ* → ℕ that returns rt_M(w), the number of steps M
performs on input word w.
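As a rough sketch of rt_M for the machine above, we can rerun the marking procedure and charge one step per tape-head movement. This is only an approximation of a real Turing Machine's step count (a literal machine would also account for state transitions), but the growth rate – roughly quadratic in the length of the input – is the same:

```python
def rt(w: str) -> int:
    """Approximate step count of the marking algorithm on input w,
    charging one step per simulated tape-head move."""
    steps = len(w) + 1                 # step 2: one full scan of the input
    if any(c not in "01" for c in w) or "10" in w:
        return steps                   # rejected during the first scan
    tape = list(w)
    while True:
        steps += len(tape) + 1         # step 3: return to the left end
        i = 0
        while i < len(tape) and tape[i] == "x":
            i += 1
        steps += i + 1                 # step 4: scan to first unmarked cell
        if i == len(tape) or tape[i] == "1":
            return steps               # accept on blank, reject on a one
        tape[i] = "x"                  # step 5: cross out the zero
        j = i + 1
        while j < len(tape) and tape[j] in ("x", "0"):
            j += 1
        steps += j - i                 # step 6: scan right to the next one
        if j == len(tape):
            return steps               # blank: unmatched zero, reject
        tape[j] = "x"                  # step 7: cross out the one
```

Each of the roughly n/2 passes over a word of length n costs on the order of n head moves, which is where the quadratic growth comes from.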
1.1 Input Size
The running time function rt_M as defined above is too difficult to compute
for practical purposes – in general, the only way to evaluate it for a word
w is to actually run machine M on word w. This defeats the purpose of running
time analysis: We want to be able to estimate how long a computation will
take before actually starting it. The first simplification is to not evaluate
the running time for a specific input word, but for a class of input words of
the same size. The exact definition of size depends on the particular
problem:
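This simplification can be sketched as follows; the helper `worst_case_steps` and the toy cost function are hypothetical, and for string inputs the natural size measure is simply the word's length:

```python
from itertools import product

def worst_case_steps(rt, n: int) -> int:
    """Maximum of rt(w) over the size class {0,1}^n of all words of length n."""
    return max(rt("".join(bits)) for bits in product("01", repeat=n))

# With a toy cost function rt(w) = len(w) + w.count("1"), the worst word
# of length 3 is "111", so the whole size class is charged 3 + 3 = 6 steps.
```

Taking the maximum over each size class anticipates the worst-case definition of running time: one number per input size, rather than one per input word.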