
From: Douglas Gregor (gregod_at_[hidden])
Date: 2001-05-29 17:01:46

On Tuesday 29 May 2001 12:35 pm, you wrote:
> Douglas Gregor wrote:
> > I mentioned the Acceptor interface because most automata are presented as
> > acceptors: DFAs [..] accept or reject a given string
> But while this is theoretically interesting, it is not
> as useful in practice as a lexer. A maximal munch lexer is
> an easy modification to a DFA. It does two extra things:
> first, it has multiple Accept symbols (one for each Token kind),
> and second, it recognizes and returns a _prefix_ of a string,
> not a whole string

A lexer can be viewed as an acceptor that emits tokens along the way. A lexer
can be built as a DFA with some extra data at each state.
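As a sketch of that idea in modern C++ (which postdates this thread; all names here are illustrative, not a proposed interface): a maximal-munch lexer is a DFA whose accepting states each carry a token kind, and the scanner remembers the *last* accepting state it reached so it can return that prefix as the token.

```cpp
#include <cstddef>
#include <map>
#include <optional>
#include <string>
#include <utility>

// Token kinds double as the "multiple Accept symbols" of the quoted text.
enum class Tok { Ident, Number };

struct Dfa {
    std::map<std::pair<int, char>, int> trans;  // (state, input char) -> next state
    std::map<int, Tok> accept;                  // accepting states and their token kind
};

// Maximal munch: longest accepted prefix of s starting at pos, with its
// token kind; nullopt if no prefix is accepted from the start state 0.
std::optional<std::pair<Tok, std::string>>
lex_one(const Dfa& d, const std::string& s, std::size_t pos) {
    int state = 0;
    std::optional<std::pair<Tok, std::size_t>> last;  // last accept seen so far
    for (std::size_t i = pos; i < s.size(); ++i) {
        auto it = d.trans.find({state, s[i]});
        if (it == d.trans.end()) break;               // no transition: stop scanning
        state = it->second;
        auto a = d.accept.find(state);
        if (a != d.accept.end())
            last = std::pair<Tok, std::size_t>{a->second, i + 1};
    }
    if (!last) return std::nullopt;
    return std::pair<Tok, std::string>{last->first, s.substr(pos, last->second - pos)};
}

// Tiny demo machine: [a-z]+ -> Ident, [0-9]+ -> Number.
Dfa make_demo_dfa() {
    Dfa d;
    for (char c = 'a'; c <= 'z'; ++c) { d.trans[{0, c}] = 1; d.trans[{1, c}] = 1; }
    for (char c = '0'; c <= '9'; ++c) { d.trans[{0, c}] = 2; d.trans[{2, c}] = 2; }
    d.accept[1] = Tok::Ident;
    d.accept[2] = Tok::Number;
    return d;
}
```

Scanning `"abc123"` from position 0 yields the prefix `"abc"` as an Ident, and scanning again from position 3 yields `"123"` as a Number, which is exactly the "recognize a prefix, not the whole string" behavior described above.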

> > We probably don't want the user to have to subclass Terminal or
> > Nonterminal merely for classification purposes - it would get in the way
> > of allowing arbitrary types (i.e., char or std::string) as terminals. A
> > trait and/or policy class would likely suffice.
> What has subclassing got to do with it??

Absolutely nothing. Subclassing was used in the interface that was originally
posted as a means of classifying objects as terminals or nonterminals. It can
be a way to distinguish them, but it would be best to use a static mechanism
(traits and/or policies) to distinguish the two and thereby allow optimization
(without excluding the user's ability to classify terminals and nonterminals).
> There is No OO here. Terminal and Nonterminal are concrete
> template argument types.

The user can inject OO here.

> Actually, there is a problem, because C++ doesn't support
> discriminated unions: a grammar production is a list of Symbols,
> and a Symbol is either a Terminal or a NonTerminal. This is very
> hard to represent correctly in C++.

Expression templates can do this. Alternatively, we should be open to even more
dynamic grammars (i.e., created at run time) that cannot benefit from the
expression-template encoding.


Boost list run by bdawes at, gregod at, cpdaniel at, john at