Homework 2 Solutions
Fundamental Algorithms, Spring 2008, Professor Yap
Due: Wed Feb 20, in class.
Homework with solutions, prepared by the instructor and T.A.s.
INSTRUCTIONS:
• Please read questions carefully. When in doubt, please ask. You may also post general questions to the homework discussion forum on the class website. Also, bring your questions to recitation on Monday.
• There are links from the homework page to the old homeworks from previous classes, including solutions. Feel free to study these.
1. (4 Points)
Exercise A.3 in Lecture 1. (De Morgan's Law applied to quantifiers)
Consider the following sentence:

    (∀x ∈ Z)(∃y ∈ R)(∃z ∈ R) [ (x > 0) ⇒ ((y < x < y⁻¹) ∧ (z < x < z²) ∧ (y < z)) ]    (1)

Note that the range of the variable x is Z, not R. This is called a universal sentence because the leading quantifier is the universal quantifier (∀). Similarly, we have existential sentences.
(i) Negate the sentence (1), and then apply De Morgan's law to rewrite the result as an existential sentence.
(ii) Give a counterexample to (1).
(iii) By changing the clause "(x > 0)", make the sentence true. Indicate why it would be true.
SOLUTION:
(i)

    (∃x ∈ Z)(∀y ∈ R)(∀z ∈ R) [ (x > 0) ∧ (¬(y < x < y⁻¹) ∨ ¬(z < x < z²) ∨ (z ≤ y)) ]
(ii) A counterexample is x = 1. Suppose y and z were to satisfy all three clauses. The clause y < x < y⁻¹ becomes y < 1 < y⁻¹, which forces 0 < y < 1. Then z must be positive, as otherwise it would not satisfy the condition y < z. From z < x = 1 we get 0 < z < 1, and so z² < z < 1. Thus (z < x < z²), i.e., z < 1 < z², cannot hold.
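This counterexample can be checked mechanically. The following brute-force grid search (my own illustration, not part of the original solution) finds no pair (y, z) satisfying all three clauses when x = 1:

```python
# Grid search for (y, z) with y < 1 < 1/y, z < 1 < z**2, and y < z.
# The first clause forces 0 < y < 1; the second forces z < -1;
# these contradict y < z, so the search comes up empty.
grid = [i / 100 for i in range(-300, 300)]
found = [(y, z) for y in grid for z in grid
         if y != 0 and y < 1 < 1 / y and z < 1 < z * z and y < z]
assert found == []
```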
(iii) We change (x > 0) to (x > 1). This removes the counterexample. Now we can always choose z so that (z < x < z²): since x ≥ 2 implies √x < x, any z with √x < z < x works. If we then choose a positive y sufficiently small, we also satisfy the remaining clauses (y < x < y⁻¹) and (y < z).
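To make the argument concrete, here is a small Python sketch (my own illustration; the particular choices of y and z are assumptions, not from the solution) that constructs an explicit witness (y, z) for every integer x ≥ 2:

```python
import math

def witness(x):
    """For an integer x >= 2, return (y, z) with
    y < x < 1/y, z < x < z**2, and y < z."""
    z = (math.sqrt(x) + x) / 2   # sqrt(x) < z < x, hence z < x < z**2
    y = 1 / (2 * x)              # small positive y: y < x < 1/y and y < z
    assert y < x < 1 / y and z < x < z * z and y < z
    return y, z

for x in range(2, 1000):
    witness(x)   # no assertion fires
```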
2. (10 Points)
Exercise 7.6, Lecture 1. (Redoing question from hw1, with new assumptions)
Provide either a counterexample when false or a proof when true. The base b of logarithms is arbitrary but fixed, and b > 1. Unlike in hw1, we now assume the functions f, g are unbounded and ≥ 0 eventually.
(a) f = O(g) implies g = O(f).
(b) max{f, g} = Θ(f + g).
(c) If g > 1 and f = O(g) then ln f = O(ln g). HINT: careful!
(d) f = O(g) implies f ∘ log = O(g ∘ log). Assume that g ∘ log and f ∘ log are complexity functions.
(e) f = O(g) implies 2^f = O(2^g).
(f) f = o(g) implies 2^f = O(2^g).
(g) f = O(f²).
(h) f(n) = Θ(f(n/2)).
SOLUTION:
Many false statements are now true:
(a) False, as before.
(b) True. max(f, g) ≤ f + g (ev.), since both functions are ≥ 0 (ev.). But f + g ≤ 2 max(f, g) (ev.).
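A numeric sanity check of the two inequalities, for one arbitrary pair of unbounded nonnegative functions (my example, chosen for illustration):

```python
f = lambda n: n * n      # sample unbounded, nonnegative functions
g = lambda n: 8 * n

for n in range(1, 10_000):
    m, s = max(f(n), g(n)), f(n) + g(n)
    assert m <= s <= 2 * m   # max(f,g) <= f+g <= 2*max(f,g)
```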
(c) True. Since f = O(g), we have f ≤ Cg (ev.) for some C > 1. So lg f ≤ lg C + lg g. As g grows unbounded, we have g ≥ C (ev.), and so lg f ≤ lg C + lg g ≤ 2 lg g (ev.).
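The inequality lg f ≤ 2 lg g (ev.) can be spot-checked numerically. The pair below (f = 100·g, an assumption chosen for illustration) satisfies it once g(n) = n reaches C = 100:

```python
import math

f = lambda n: 100 * n   # f <= C*g with C = 100
g = lambda n: n

# Once g(n) >= C, we have log f(n) = log C + log g(n) <= 2 log g(n).
for n in range(100, 10_000):
    assert math.log(f(n)) <= 2 * math.log(g(n))
```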
Alternative proof (based on limits, which I generally avoid; it needs the extra assumption that f and g are differentiable and increasing): eventually f > 1, so ln(f) > 0 and, moreover, it too is an increasing function of n. Applying L'Hôpital's rule to the limit of ln(f)/ln(g) as n goes to infinity gives lim (f′/f)/(g′/g) = lim (f′/g′)(g/f); when this limit is some finite constant C, we see that ln(f) = O(ln(g)).
(d) True, as before.
(e) False, as before.
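A standard counterexample for (e) (again my choice, since the hw1 solution is not reproduced here): f(n) = 2n and g(n) = n. Then f = O(g), but 2^f / 2^g = 2^n is unbounded, so 2^f ≠ O(2^g):

```python
f = lambda n: 2 * n
g = lambda n: n

# f = O(g) since f(n) <= 2*g(n) for all n ...
assert all(f(n) <= 2 * g(n) for n in range(100))
# ... but 2**f(n) / 2**g(n) = 2**n exceeds any fixed constant.
assert [2 ** f(n) // 2 ** g(n) for n in range(5)] == [1, 2, 4, 8, 16]
```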