If L is regular, then there exists some regular grammar which generates it, and that grammar can always be given as either a left-regular grammar or a right-regular grammar. Let's assume it's a left-regular grammar G_l (the proof for a right-regular grammar is analogous).
This grammar has productions of two types. The terminating type:
A -> a, where A is a non-terminal and a is either a terminal or the empty string (\epsilon)
or the chaining type:
B -> Ca, where B, C are non-terminals and a is a terminal
Reversing the language corresponds to reversing the tails of the productions (the heads are single non-terminals, so there is nothing to reverse there); this claim is proved below. So we get a new grammar G_r, with productions:
A -> a, where A is non-terminal and a is either a terminal or empty string (epsilon)
B -> aC, where B, C are non-terminals and a is a terminal
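To make the transformation concrete, here is a small sketch in Python. The tuple-based grammar encoding is my own choice for this sketch, not anything standard: reversing the body of every production turns each chaining tail Ca into aC and leaves one-symbol terminating bodies untouched.

```python
# Toy encoding (an assumption of this sketch): a production is a
# (head, body) pair, where body is a tuple of symbols; non-terminals
# are uppercase strings and terminals are lowercase strings.

def reverse_grammar(productions):
    """Build the right-regular grammar for the reversed language by
    reversing every production tail. One-symbol bodies (A -> a) and
    empty bodies (A -> epsilon) are unaffected, while chaining bodies
    (B -> Ca) become (B -> aC)."""
    return [(head, tuple(reversed(body))) for head, body in productions]

# Example: G_l with S -> Sa | b generates b, ba, baa, ...
G_l = [("S", ("S", "a")), ("S", ("b",))]
G_r = reverse_grammar(G_l)
print(G_r)  # [('S', ('a', 'S')), ('S', ('b',))]
```

Note that reversing a one-symbol tuple is a no-op, so the same line handles both production types uniformly.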
But hey, it's a right-regular grammar! So the language it generates is also regular.
One thing remains: to show that reversing the tails actually produces the reversed language. We can prove it by a simple induction on the words of L:
If L contains \epsilon, then there is a production 'S -> \epsilon' in G_l. Since we don't touch productions like that, it's also present in G_r.
If L contains a, a word composed of a single terminal, then it's similar to the above: the terminating production 'S -> a' survives unchanged.
If L contains a word aZ, where a is a terminal and Z is the rest of the word, then L^r contains (aZ)^r = (Z^r)a, and that is exactly what the modified chaining productions produce. The set of all such remainders Z (words of L with their first terminal chopped off) is itself a regular language, since it can be generated by dropping the first "level" of left-productions from G_l, which still leaves a regular grammar; so the same argument applies to Z as well.
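The case analysis above can also be checked mechanically for small words. Below is a hedged brute-force sketch (the function names and grammar encoding are my own): enumerate every word up to a length bound generated by a regular grammar, and confirm that the tail-reversed grammar generates exactly the reversed words.

```python
def words_up_to(productions, start, max_len):
    """All words of length <= max_len generated by a regular grammar,
    found by breadth-first derivation. Assumes the grammar is linear:
    every sentential form holds at most one non-terminal (uppercase)."""
    words = set()
    frontier = [(start,)]
    while frontier:
        nxt = []
        for form in frontier:
            i = next((k for k, s in enumerate(form) if s.isupper()), None)
            if i is None:  # no non-terminal left: it's a word
                if len(form) <= max_len:
                    words.add("".join(form))
                continue
            if len(form) > max_len + 1:  # even A -> epsilon can't save it
                continue
            for head, body in productions:
                if head == form[i]:
                    nxt.append(form[:i] + body + form[i + 1:])
        frontier = nxt
    return words

# G_l: S -> Sa | b (left-regular); reversing tails gives G_r: S -> aS | b
G_l = [("S", ("S", "a")), ("S", ("b",))]
G_r = [(head, tuple(reversed(body))) for head, body in G_l]
assert {w[::-1] for w in words_up_to(G_l, "S", 6)} == words_up_to(G_r, "S", 6)
```

Every chaining step lengthens the sentential form by one symbol, so the length bound guarantees the search terminates.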
I hope it helped. There's also an arguably easier way of doing this: reverse the edges of the corresponding finite automaton, make the old accepting states initial, and make the old initial state accepting.
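For completeness, that automaton route can be sketched too (again a minimal encoding of my own, assuming an NFA with a single initial state): flip every edge and swap the roles of the initial and accepting states.

```python
def reverse_nfa(transitions, start, accepting):
    """transitions: a set of (state, symbol, state) triples.
    Returns the NFA for the reversed language: edges flipped, the old
    accepting states become initial, the old initial state becomes
    the sole accepting state."""
    flipped = {(q, a, p) for (p, a, q) in transitions}
    return flipped, set(accepting), {start}

def accepts(transitions, starts, accepting, word):
    """Standard NFA simulation over a set of current states."""
    states = set(starts)
    for a in word:
        states = {q for (p, s, q) in transitions if p in states and s == a}
    return bool(states & accepting)

# Example NFA for a*b: states 0 and 1, start 0, accepting {1}.
# Its reversal should accept exactly b a*.
nfa = {(0, "a", 0), (0, "b", 1)}
rev, rev_starts, rev_accepting = reverse_nfa(nfa, 0, {1})
print(accepts(rev, rev_starts, rev_accepting, "baa"))  # True
```

One wrinkle: the reversed automaton may need several initial states (one per old accepting state); if the definition at hand demands a single initial state, add a fresh state with \epsilon-moves to all of them.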