From: joel de guzman (isis-tech_at_[hidden])
Date: 2001-06-08 21:41:40
----- Original Message -----
From: "David Abrahams" :
> ----- Original Message -----
> From: "joel de guzman" <isis-tech_at_[hidden]>
> > Some questions:
> > 1. Why are compilation times bounded by lexical analysis?
> Sheer number of tokens to be processed. That is why most compilers can't
> afford to use table-generated lexers and end up using hand-crafted code.
> Well, at least until template instantiation came along as a compilation
> cost, this was true ;-)
So lexers are basically of the form: t1 | t2 | ... | tn
in a loop, skipping whitespace? Indeed, that is slow;
a lexer cannot predict what will come next in the
input stream without knowledge of the grammar.
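For concreteness, here is a minimal hand-written sketch of that shape: try each token alternative in order inside a loop, skipping whitespace between attempts. The token set (identifier, number, '+') is purely illustrative and not taken from any real compiler:

```cpp
// Sketch of a "t1 | t2 | ... | tn in a loop" lexer. Hypothetical tokens.
#include <cassert>
#include <cctype>
#include <cstddef>
#include <string>
#include <vector>

enum class Tok { Ident, Number, Plus, End };

struct Token { Tok kind; std::string text; };

std::vector<Token> lex(const std::string& src) {
    std::vector<Token> out;
    std::size_t i = 0;
    while (i < src.size()) {
        unsigned char c = static_cast<unsigned char>(src[i]);
        if (std::isspace(c)) { ++i; continue; }        // skip whitespace
        // Try each alternative t1 | t2 | ... in order:
        if (std::isalpha(c)) {                         // t1: identifier
            std::size_t j = i;
            while (j < src.size() &&
                   std::isalnum(static_cast<unsigned char>(src[j]))) ++j;
            out.push_back({Tok::Ident, src.substr(i, j - i)});
            i = j;
        } else if (std::isdigit(c)) {                  // t2: number
            std::size_t j = i;
            while (j < src.size() &&
                   std::isdigit(static_cast<unsigned char>(src[j]))) ++j;
            out.push_back({Tok::Number, src.substr(i, j - i)});
            i = j;
        } else if (src[i] == '+') {                    // t3: operator
            out.push_back({Tok::Plus, "+"});
            ++i;
        } else {
            ++i;  // unrecognized character: a real lexer would report an error
        }
    }
    out.push_back({Tok::End, ""});
    return out;
}
```

Note that the loop has no grammatical context: at every position it must re-try the alternatives from scratch, which is exactly the per-token overhead being discussed.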
It is commonly assumed that lexers speed up parsing.
Shouldn't we do some rethinking now?
Boost list run by bdawes at acm.org, gregod at cs.rpi.edu, cpdaniel at pacbell.net, john at johnmaddock.co.uk