Results

How the data was derived

The tables below are each derived from 100 runs of the Learner. The abbreviations used in the table headings are expanded as follows.

Expression

When learning Expression, the number of equivalence queries is reduced by 82.5% when counterexamples are reused, and by 87.5% when counterexamples are also saved. Comparing these reduction rates with those of the other test languages, we can establish that Expression benefits more from both extensions of the Learner than any other test language. An explanation of why this is the case is attempted in the presentation of the tree language Expression.
Typed and Untyped Formula

By comparing, in the results for (Typed) Formula and Random Monadic, the number of states and transitions found by reusing counterexamples with those found by saving them, respectively, we observe an interesting effect: to some extent, saving counterexamples diminishes the benefit of reusing them. This can be explained as follows. If a new state is discovered, the reusing learner is likely to find some of the corresponding transitions "by coincidence" when examining one of the next counterexamples (rather than the present one). In contrast, the saving learner may discover the same transitions before examining the next counterexamples, even though it will need them in any case. Interestingly, the number of equivalence queries made when learning Simple English in reuse mode is much smaller than the corresponding number in plain mode, whereas nothing at all is gained by saving counterexamples in this case. As expected, the number of equivalence queries made when learning Formula in save mode is smaller than the corresponding number in reuse mode. However, the reduction of approximately 5.4% is unexpectedly small. As the tables show, this situation changes if we look at Expression and Random Monadic. In these cases, both reusing and saving counterexamples lead to a considerable reduction in the number of equivalence queries. See also the presentations of the tree languages Untyped Formula and Formula.
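
To make the three modes concrete, the following Python sketch outlines how the outer loop of a MAT-style learner might handle counterexamples in plain, reuse, and save mode. All names here (Teacher, Table, extend_with, and so on) are hypothetical interfaces invented for illustration; this is a sketch of the control flow under those assumptions, not the actual implementation of the Learner.

    from typing import Optional, Protocol

    class Automaton(Protocol):
        def accepts(self, tree: object) -> bool: ...

    class Table(Protocol):                        # observation table
        def synthesise(self) -> "Automaton": ...
        def extend_with(self, tree: object) -> None: ...

    class Teacher(Protocol):
        def member(self, tree: object) -> bool: ...
        def equivalence_query(self, a: "Automaton") -> Optional[object]: ...

    def learn(teacher: Teacher, table: Table, mode: str = "plain") -> "Automaton":
        saved = []                                # processed counterexamples (save mode)
        while True:
            automaton = table.synthesise()
            if mode == "save":
                # Revisit stored counterexamples first: one of them may still
                # expose a missing state or transition, sparing a fresh query.
                stale = next((t for t in saved
                              if automaton.accepts(t) != teacher.member(t)), None)
                if stale is not None:
                    table.extend_with(stale)
                    continue
            counterexample = teacher.equivalence_query(automaton)
            if counterexample is None:            # hypothesis equals the target
                return automaton
            table.extend_with(counterexample)
            if mode in ("reuse", "save"):
                # Drain the counterexample: keep extending the table while the
                # re-synthesised hypothesis still misclassifies it.
                automaton = table.synthesise()
                while automaton.accepts(counterexample) != teacher.member(counterexample):
                    table.extend_with(counterexample)
                    automaton = table.synthesise()
            if mode == "save":
                saved.append(counterexample)

Under this reading, the effect observed above falls out naturally: in save mode, a transition may be found while revisiting a stored counterexample, before the next fresh counterexample would have revealed it "by coincidence".
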
Tall Monadic

The first automaton that the Learner synthesises recognises only the empty language. For this reason, the teacher can only respond by returning the single tree in Tall Monadic, but this tree contains (subtrees representing) every state and transition needed to recognise Tall Monadic. Hence, if the counterexample is reused, then only one more is necessary, namely null, which signals that the target language has been reached. Since the Learner terminates upon receiving this second counterexample, it never revisits a saved counterexample, and this explains why the columns SQ and ST are always empty or zero.
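
Why does a single counterexample suffice? A monadic tree of height n has exactly n + 1 subtrees, one per state. The following self-contained snippet illustrates this with a nested-tuple encoding of trees (the encoding and the symbol names f and a are chosen here purely for illustration):

    def monadic_tree(height: int):
        """Build the monadic tree f(f(...f(a)...)) of the given height."""
        tree = ("a",)
        for _ in range(height):
            tree = ("f", tree)
        return tree

    def subtrees(tree):
        """Yield every subtree; for a monadic tree these correspond
        one-to-one to the states needed to recognise it."""
        yield tree
        for child in tree[1:]:
            yield from subtrees(child)

    t = monadic_tree(3)                    # ('f', ('f', ('f', ('a',))))
    print(len(list(subtrees(t))))          # 4: one state per subtree
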
Random Monadic

Recall that to recognise Random Monadic, at most 40 states and 39 transitions are needed. Now, to infer Random Monadic, the Learner uses approximately 52 equivalence queries, while the Reuser manages with only 14 and the Saver with as few as 8. The reason for the large speed-up is that every counterexample contains a fair amount of information, so much can be gained by reusing and/or saving them. In fact, every counterexample of height h contains (trees representing) exactly min(h, h') states and min(h, h') - 1 transitions, where h' is the maximum height of the trees in Random Monadic.
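
Stated as code, the information content of a counterexample follows directly from that formula. The default value for h' below is an assumption made for the illustration, chosen to be consistent with the at most 40 states mentioned above:

    def information_in(h: int, h_max: int = 40):
        """States and transitions represented by a counterexample of
        height h; h_max stands for h', the maximum height of the trees
        in Random Monadic (assumed to be 40 for this illustration)."""
        states = min(h, h_max)
        return states, states - 1

    print(information_in(25))   # (25, 24): 25 states, 24 transitions
    print(information_in(60))   # (40, 39): capped by the target automaton
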
Simple English

Apart from Tall Monadic, Simple English is the only language for which saving counterexamples does not reduce the reported number of equivalence queries. One possible explanation is that trees t and t' that represent distinct constituents will seldom, if ever, have a shared context c such that both c[t] and c[t'] belong to Simple English. In other words, the Learner is unlikely to overlook a piece of information, so there is little to gain by returning to an already processed counterexample, as chances are that it has already been drained of all information.
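
Here c[t] denotes the context c with its unique hole replaced by the tree t. A minimal sketch of this substitution over the nested-tuple encoding used above; the hole marker and the example trees are invented for illustration:

    HOLE = "*"

    def substitute(context, tree):
        """Return c[t]: the context with its unique hole replaced by tree."""
        if context == HOLE:
            return tree
        return (context[0],) + tuple(substitute(child, tree)
                                     for child in context[1:])

    # Two trees t and t' share the context c if both c[t] and c[t'] are in
    # the language; for Simple English this reportedly almost never holds
    # for distinct constituents.
    c = ("S", ("NP", ("Mary",)), HOLE)
    t = ("VP", ("sleeps",))
    print(substitute(c, t))   # ('S', ('NP', ('Mary',)), ('VP', ('sleeps',)))
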
Contexts

Note that to infer the tree language Contexts, the Learner needs to synthesise just under five automata on average. The first automaton is created from the initial observation table, and this cannot be avoided: the Learner must guess the target language, although it knows nothing about it, in order to provoke a counterexample from the teacher. The remaining four automata correspond to the discovery of the four states needed to recognise Contexts.
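
In other words, the expected count decomposes as one initial guess plus one re-synthesis per discovered state. A trivial restatement in code, idealised in that it assumes each counterexample reveals exactly one new state:

    def syntheses_needed(target_states: int) -> int:
        """One automaton from the initial observation table, plus one
        re-synthesis per state discovered through a counterexample
        (an idealisation of the average behaviour reported above)."""
        return 1 + target_states

    print(syntheses_needed(4))   # Contexts needs 4 states -> about 5 automata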