DFA minimization
In computer science, more specifically in the branch of automata theory, DFA minimization is the task of transforming a given deterministic finite automaton (DFA) into an equivalent DFA that has a minimum number of states. Here, two DFAs are called equivalent if they describe the same regular language. Several different algorithms accomplishing this task are known and are described in standard textbooks on automata theory.
Minimum DFA
For each regular language that can be accepted by a DFA, there exists a DFA with a minimum number of states (and thus a minimum programming effort to create and use), and this DFA is unique except that states can be given different names. There are three classes of states that can be removed or merged from the original DFA without affecting the language it accepts:
- Unreachable states are those states that are not reachable from the initial state of the DFA, for any input string.
- Dead states are those nonaccepting states whose transitions for every input character terminate on themselves. These are also called trap states because, once entered, there is no escape.
- Nondistinguishable states are those that cannot be distinguished from one another for any input string.
DFA minimization is usually done in three steps, corresponding to the removal or merger of the relevant states. Since the elimination of nondistinguishable states is computationally the most expensive, it is usually done as the last step.
Unreachable states
A state p of the DFA M = (Q, Σ, δ, q0, F) is unreachable if no string w in Σ* exists for which p = δ(q0, w). Reachable states can be obtained by a fixed-point search that starts from the initial state and repeatedly follows transitions until no new states are found. Unreachable states can then be removed from the DFA without affecting the language that it accepts.
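As a concrete sketch, the reachable states can be computed with a breadth-first search over the transition function. The DFA below, its state names, and the helper name are illustrative, not from the source:

```python
from collections import deque

def reachable_states(delta, q0):
    """Return the set of states reachable from q0, where delta maps
    (state, symbol) -> state, via breadth-first search."""
    reachable = {q0}
    frontier = deque([q0])
    while frontier:
        q = frontier.popleft()
        for (state, symbol), target in delta.items():
            if state == q and target not in reachable:
                reachable.add(target)
                frontier.append(target)
    return reachable

# Hypothetical 4-state DFA over {0, 1}; state "d" is unreachable from "a".
delta = {
    ("a", "0"): "b", ("a", "1"): "a",
    ("b", "0"): "b", ("b", "1"): "c",
    ("c", "0"): "a", ("c", "1"): "c",
    ("d", "0"): "a", ("d", "1"): "d",
}
print(sorted(reachable_states(delta, "a")))  # ['a', 'b', 'c']
```

Scanning every transition for each dequeued state is quadratic in the worst case; indexing transitions by source state would make the search linear in the number of transitions.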
Hopcroft's algorithm
One algorithm for merging the nondistinguishable states of a DFA, due to Hopcroft, is based on partition refinement, partitioning the DFA states into groups by their behavior. These groups represent equivalence classes of the Myhill–Nerode equivalence relation
, whereby every two states of the same partition are equivalent if they have the same behavior for all input sequences. That is, for every two states p and q that belong to the same equivalence class within the partition P, and for every input word w, following the transitions determined by w from p and from q must lead either to accepting states in both cases or to rejecting states in both cases; it must not be possible for w to take p to an accepting state and q to a rejecting state, or vice versa.
The algorithm starts with a partition that is too coarse: every pair of states that are equivalent according to the Myhill–Nerode relation belong to the same set in the partition, but pairs that are inequivalent might also belong to the same set. It gradually refines the partition into a larger number of smaller sets, at each step splitting sets of states into pairs of subsets that are necessarily inequivalent.
The initial partition is a separation of the states into two subsets that clearly do not have the same behavior as each other: the accepting states and the rejecting states. The algorithm then repeatedly chooses a set A from the current partition and an input symbol c, and splits each set of the partition into two (possibly empty) subsets: the subset of states whose transition on c leads into A, and the subset of states whose transition on c does not lead into A. Since A is already known to have different behavior than the other sets of the partition, the subsets that lead into A also have different behavior than the subsets that do not. When no more splits of this type can be found, the algorithm terminates.
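The refinement loop just described can be sketched in Python. This is a minimal illustration under our own conventions (transition dict, frozenset classes); it keeps the smaller half of each split on the worklist, as in Hopcroft's algorithm, but is not otherwise tuned for the stated time bound:

```python
def refine(states, alphabet, delta, accepting):
    """Partition the states of a DFA into Myhill-Nerode equivalence
    classes by repeated splitting. delta maps (state, symbol) -> state."""
    accepting = frozenset(accepting)
    rejecting = frozenset(states) - accepting
    # Initial partition: accepting vs. rejecting states.
    partition = {accepting, rejecting} - {frozenset()}
    worklist = set(partition)
    while worklist:
        a = worklist.pop()              # a distinguishing set A
        for c in alphabet:
            # X: states whose transition on c leads into A.
            x = {q for q in states if delta[(q, c)] in a}
            for y in list(partition):
                inter, diff = y & x, y - x
                if inter and diff:      # y is split by (A, c)
                    partition.remove(y)
                    partition |= {inter, diff}
                    if y in worklist:
                        worklist.remove(y)
                        worklist |= {inter, diff}
                    else:
                        # Processing the smaller half suffices.
                        worklist.add(min(inter, diff, key=len))
    return partition

# Hypothetical DFA for "even number of 1s" with two duplicate even states.
states = ["e1", "e2", "o"]
alphabet = ["0", "1"]
delta = {
    ("e1", "0"): "e2", ("e1", "1"): "o",
    ("e2", "0"): "e1", ("e2", "1"): "o",
    ("o", "0"): "o",   ("o", "1"): "e1",
}
print(refine(states, alphabet, delta, {"e1", "e2"}))
# The two accepting states collapse into one class: {e1, e2} and {o}.
```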
The worst-case running time of this algorithm is O(ns log n), where n is the number of states and s is the size of the alphabet. This bound follows from the fact that, for each of the ns transitions of the automaton, the sets drawn from the partition that contain the target state of the transition have sizes that decrease relative to each other by a factor of two or more, so each transition participates in O(log n) of the splitting steps in the algorithm. The partition refinement data structure allows each splitting step to be performed in time proportional to the number of transitions that participate in it. This remains the most efficient algorithm known for solving the problem, and for certain distributions of inputs its average-case complexity is even better, O(n log log n).
Once Hopcroft's algorithm has been used to group the states of the input DFA into equivalence classes, the minimum DFA can be constructed by forming one state for each equivalence class. If S is a set of states in the partition, q is a state in S, and c is an input character, then the transition in the minimum DFA from the state for S, on input c, goes to the set containing the state that the input automaton would go to from state q on input c. The initial state of the minimum DFA is the one containing the initial state of the input DFA, and the accepting states of the minimum DFA are the ones whose members are accepting states of the input DFA.
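The construction just described can be sketched as follows; the equivalence classes, the example DFA, and the function name are hypothetical, chosen only to illustrate the mapping from classes to transitions:

```python
def build_min_dfa(classes, delta, q0, accepting, alphabet):
    """Build the minimum DFA with one state per equivalence class.
    classes: iterable of frozensets partitioning the original states;
    delta maps (state, symbol) -> state."""
    cls = {q: c for c in classes for q in c}  # state -> its class
    min_delta = {}
    for c in classes:
        rep = next(iter(c))  # any representative; all members agree
        for a in alphabet:
            # The class's transition goes to the class of the
            # representative's target state.
            min_delta[(c, a)] = cls[delta[(rep, a)]]
    min_start = cls[q0]
    min_accepting = {c for c in classes if c & set(accepting)}
    return min_delta, min_start, min_accepting

# Classes computed for a hypothetical "even number of 1s" DFA in which
# the two accepting states e1, e2 are equivalent.
E, O = frozenset({"e1", "e2"}), frozenset({"o"})
delta = {
    ("e1", "0"): "e2", ("e1", "1"): "o",
    ("e2", "0"): "e1", ("e2", "1"): "o",
    ("o", "0"): "o",   ("o", "1"): "e1",
}
min_delta, min_start, min_accepting = build_min_dfa(
    [E, O], delta, "e1", {"e1", "e2"}, ["0", "1"])
# min_start is the class {e1, e2}; on input "1" it moves to the class {o}.
```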
Moore's algorithm
Moore's algorithm for DFA minimization is due to Edward F. Moore. Like Hopcroft's algorithm, it maintains a partition that starts off separating the accepting from the rejecting states, and repeatedly refines the partition until no more refinements can be made. At each step, it replaces the current partition with the coarsest common refinement of s + 1 partitions, one of which is the current one and the others of which are the preimages of the current partition under the transition functions for each of the input symbols. The algorithm terminates when this replacement does not change the current partition. Its worst-case time complexity is O(n²s): each step of the algorithm may be performed in time O(ns) using a variant of radix sort to reorder the states so that states in the same set of the new partition are consecutive in the ordering, and there are at most n steps since each one but the last increases the number of sets in the partition. The instances of the DFA minimization problem that cause the worst-case behavior are the same as for Hopcroft's algorithm. The number of steps that the algorithm performs can be much smaller than n, so on average (for constant s) its performance is O(n log n) or even O(n log log n), depending on the random distribution on automata chosen to model the algorithm's average-case behavior.
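Moore's refinement step can be sketched by repeatedly re-partitioning states according to their own class and the classes of their successors, which computes the coarsest common refinement described above. The DFA and names below are illustrative:

```python
def moore_partition(states, alphabet, delta, accepting):
    """Return a dict mapping each state to a class id; two states get
    the same id exactly when they are nondistinguishable."""
    # Initial partition: accepting vs. rejecting.
    block = {q: int(q in accepting) for q in states}
    while True:
        # Signature = own block plus the blocks reached on each symbol,
        # i.e. the coarsest common refinement of the s + 1 partitions.
        sig = {q: (block[q],) + tuple(block[delta[(q, a)]] for a in alphabet)
               for q in states}
        ids = {s: i for i, s in enumerate(sorted(set(sig.values())))}
        new_block = {q: ids[sig[q]] for q in states}
        if len(set(new_block.values())) == len(set(block.values())):
            return new_block  # no further refinement possible
        block = new_block

# Hypothetical DFA for "even number of 1s" with duplicate even states.
delta = {
    ("e1", "0"): "e2", ("e1", "1"): "o",
    ("e2", "0"): "e1", ("e2", "1"): "o",
    ("o", "0"): "o",   ("o", "1"): "e1",
}
blocks = moore_partition(["e1", "e2", "o"], ["0", "1"], delta, {"e1", "e2"})
# e1 and e2 end up in the same class; o is in a different class.
```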
Brzozowski's algorithm
As Brzozowski observed, reversing the edges of a DFA produces an NFA for the reversal of the original language, and converting this NFA to a DFA using the standard powerset construction (constructing only the reachable states of the converted DFA) leads to a minimal DFA for the same reversed language. Repeating this reversal operation a second time produces a minimal DFA for the original language. The worst-case complexity of Brzozowski's algorithm is exponential, as there are regular languages for which the minimal DFA of the reversal is exponentially larger than the minimal DFA of the language, but it frequently performs better than this worst case would suggest.
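Brzozowski's method can be sketched as two reverse-and-determinize passes. The helper names and the 3-state example are ours, and the sketch assumes every state of the input DFA is reachable:

```python
def reverse_determinize(states, alphabet, delta, start, accepting):
    """Reverse the DFA's edges (an NFA whose start states are the old
    accepting states), then determinize by the powerset construction,
    keeping only the reachable subset-states."""
    # rev[(q, a)] = set of states p with delta[(p, a)] == q
    rev = {(q, a): set() for q in states for a in alphabet}
    for (p, a), q in delta.items():
        rev[(q, a)].add(p)
    start_set = frozenset(accepting)
    d_states, d_delta = {start_set}, {}
    stack = [start_set]
    while stack:
        s = stack.pop()
        for a in alphabet:
            t = frozenset(q for p in s for q in rev[(p, a)])
            d_delta[(s, a)] = t
            if t not in d_states:
                d_states.add(t)
                stack.append(t)
    d_accepting = {s for s in d_states if start in s}
    return d_states, alphabet, d_delta, start_set, d_accepting

def brzozowski(states, alphabet, delta, start, accepting):
    # Two passes: minimal DFA for the reversal, then for the original.
    return reverse_determinize(*reverse_determinize(
        states, alphabet, delta, start, accepting))

# Hypothetical 3-state DFA for "even number of 1s" with duplicate even
# states; the minimal DFA for this language has 2 states.
delta = {
    ("e1", "0"): "e2", ("e1", "1"): "o",
    ("e2", "0"): "e1", ("e2", "1"): "o",
    ("o", "0"): "o",   ("o", "1"): "e1",
}
min_states, _, _, min_start, min_accepting = brzozowski(
    ["e1", "e2", "o"], ["0", "1"], delta, "e1", {"e1", "e2"})
print(len(min_states))  # 2
```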