COT 4210
Section C
Spring 2001
Theorem 2 established that NFAs are no more “powerful” than DFAs in terms of the family of
languages they can recognize.
However, NFAs can be far more compact and succinct than DFAs:
the subset construction may require a DFA with exponentially more states than the
NFA recognizing the same language.
We will return to this issue later, when we study the
algorithm for minimizing DFAs.
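The gap can be made concrete (a sketch of my own, not part of the notes): for the language of strings over {0,1} whose N-th symbol from the end is 1, an NFA needs only N+1 states, while the subset construction yields 2^N reachable DFA states. The code below checks this for N = 3.

```python
# Sketch (my own example, not from the notes): the language "N-th symbol
# from the end is '1'" over {0,1}. The NFA below has N+1 states, but the
# subset construction produces 2**N reachable DFA states.

N = 3

# NFA transition function: (state, symbol) -> set of successor states.
# State 0 loops, nondeterministically guessing when the distinguished
# '1' occurs; states 1..N count off the remaining symbols.
def nfa_delta(q, c):
    nxt = set()
    if q == 0:
        nxt.add(0)                  # keep waiting in the start state
        if c == '1':
            nxt.add(1)              # guess: this '1' is N-th from the end
    elif q < N:
        nxt.add(q + 1)              # count one more symbol toward the end
    return nxt                      # state N (accepting) has no successors

start = frozenset({0})

# Subset construction: breadth-first search over reachable state sets.
frontier, seen = [start], {start}
while frontier:
    S = frontier.pop()
    for c in '01':
        T = frozenset(s for q in S for s in nfa_delta(q, c))
        if T not in seen:
            seen.add(T)
            frontier.append(T)

print(N + 1, "NFA states ->", len(seen), "reachable DFA states")  # 4 -> 8
```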
Equivalence of NFAs to Rightlinear Grammars
To establish the first equivalence suggested by the Chomsky Hierarchy, we now show that the
Regular languages (defined by DFAs and NFAs) are exactly the family of languages that can
be defined by Type 3, or Rightlinear, grammars.
This is the subject of our next theorem.
Theorem 3.
Let M = (Q, Σ, δ, Q₀, A) be an arbitrary NFA. Then there is a Rightlinear grammar
G = (N, Σ, P, S) such that L(G) = L(M). Conversely, if G is an arbitrary RLG, then one can
construct an NFA, M, for which L(M) = L(G).
Proof.
Let M be given.
We construct G from M essentially by (a) introducing a nonterminal for
each state of M, and (b) introducing one production for each transition of M.
Formally, we define N = Q ∪ {S}, where S denotes the start symbol of G and is distinct from
all symbols denoting states of M.
P is then defined to be the union of three sets of rules denoted P[1], P[2],
and P[3]:
P[1] = { S → q | q ∈ Q₀ },
P[2] = { q → σq′ | q′ ∈ δM(q, σ) for some σ ∈ Σ ∪ {Λ} }, and
P[3] = { q → λ | q ∈ A }.
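The three rule sets translate directly into code. Below is a minimal sketch (function and representation choices are my own, not the notes’): each production is a pair of a left-hand-side symbol and a right-hand-side tuple, with the empty tuple standing for λ and the empty string standing for Λ.

```python
# Sketch of the construction above: one production per transition of M,
# plus start rules P[1] and termination rules P[3].
# LAMBDA stands for the empty string (Λ / λ).

LAMBDA = ''

def nfa_to_rlg(Q, Sigma, delta, Q0, A):
    """delta maps (state, symbol) -> set of states, where symbol ranges
    over Sigma plus LAMBDA. Returns productions as (lhs, rhs-tuple)
    pairs, with 'S' as the start symbol (assumed not to name a state)."""
    P1 = [('S', (q,)) for q in Q0]                        # S -> q, q in Q0
    P2 = [(q, (sigma, q2) if sigma != LAMBDA else (q2,))  # q -> sigma q'
          for (q, sigma), succ in delta.items()
          for q2 in succ]
    P3 = [(q, ()) for q in A]                             # q -> lambda, q in A
    return P1 + P2 + P3

# Example (my own): an NFA accepting strings over {a,b} that end in 'b'.
delta = {('p', 'a'): {'p'}, ('p', 'b'): {'p', 'q'}}
G = nfa_to_rlg({'p', 'q'}, {'a', 'b'}, delta, {'p'}, {'q'})
for lhs, rhs in G:
    print(lhs, '->', ' '.join(rhs) if rhs else 'λ')
```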
To show that L(G) = L(M), we have to show that x ∈ L(G) if and only if x ∈ L(M). Suppose
that x ∈ L(M). Then there is some accepting computation of x, that is, there is some sequence of
configurations:

(q₀, y₀) ⇒M (q₁, y₁) ⇒M … ⇒M (qₘ, yₘ) = (qₘ, λ),

where q₀ ∈ Q₀, y₀ = x, and qₘ ∈ A.
Then in G we have the following derivation:

S ⇒G q₀ ⇒G^π y₀ qₘ ⇒G y₀ = x,

where ⇒G^π denotes the sequence of derivation steps applying the rules of π in order.
The first
step of the derivation applies a rule in P[1] to rewrite S as the particular initial state (q₀) of M that
determines an accepting computation.
Each rule of π corresponds to a move of M in the
accepting computation – whatever input, σ, is consumed by M on any given move, σ will be
produced as output by the corresponding rule in P[2]; if σ = Λ, then no input is consumed by M
and nothing is written to the sentential form by G.
Thus, if M consumes y₀, G will generate y₀ by
mimicking the same transitions, but opposite in the IO sense.
Finally, the last rule of the
derivation is a rule in P[3] – these rules permit the derivation in G to terminate if and only if M is
in an accept state.
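As a concrete illustration (an example of my own, not one from the notes): suppose M has transitions q₀ on a to q₀ and q₀ on b to q₁, with Q₀ = {q₀} and A = {q₁}. On x = ab, the accepting computation and the derivation it induces in G line up move for move:

```latex
% Accepting computation of M on x = ab:
(q_0,\, ab) \Rightarrow_M (q_0,\, b) \Rightarrow_M (q_1,\, \lambda)
% Corresponding derivation in G, applying rules from
% P[1], P[2] (for the a-move), P[2] (for the b-move), and P[3] in turn:
S \Rightarrow_G q_0 \Rightarrow_G a\,q_0 \Rightarrow_G ab\,q_1 \Rightarrow_G ab = x
```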
Using a similar argument it is easily shown that any string x generated by G can also be accepted
by M.
Thus M and G are equivalent specifications for the same language.
The converse result is established in a similar fashion, but there are some details that are
different.
So, let us be given an arbitrary RLG, G = (N, Σ, P, S).
To construct M we will
introduce a state for each nonterminal, analogous to our previous construction of G from M.
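The notes break off at this point; purely as a sketch of where the construction is headed (my own code, not necessarily the notes’ continuation), the standard completion adds one fresh final state and reads transitions off the productions, assuming each production has the form A → σB, A → B, A → σ, or A → λ.

```python
# Sketch (my own) of the converse construction: one NFA state per
# nonterminal, plus a fresh accepting state 'F' (assumed not in N).
# Productions are (lhs, rhs-tuple) pairs; the empty string is Λ.

LAMBDA = ''

def rlg_to_nfa(N, Sigma, P, S):
    """Assumes each rhs is (), (B,), (sigma,), or (sigma, B) with B in N.
    Returns (Q, delta, Q0, Acc) with delta: (state, symbol) -> set."""
    F = 'F'                        # fresh final state
    Q = set(N) | {F}
    delta, Acc = {}, {F}
    for lhs, rhs in P:
        if rhs == ():                              # A -> lambda: A accepts
            Acc.add(lhs)
        elif rhs[-1] in N:                         # A -> sigma B or A -> B
            sigma, B = (rhs[0], rhs[1]) if len(rhs) == 2 else (LAMBDA, rhs[0])
            delta.setdefault((lhs, sigma), set()).add(B)
        else:                                      # A -> sigma: move to F
            delta.setdefault((lhs, rhs[0]), set()).add(F)
    return Q, delta, {S}, Acc

# Example (my own): the grammar S -> p, p -> a p | b p | b q, q -> lambda.
P = [('S', ('p',)), ('p', ('a', 'p')), ('p', ('b', 'p')),
     ('p', ('b', 'q')), ('q', ())]
Q, delta, Q0, Acc = rlg_to_nfa({'S', 'p', 'q'}, {'a', 'b'}, P, 'S')
```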