Date: 2001-10-20 05:43:34
> I have posted a test version of tokenizer to tokenizer_optimize.zip
> in the files section.
Why do you need this BogusIterator<>/is_valid()/set_invalid() machinery?
What's wrong with simply comparing cur() with get_base_end()?
Also, you should not assume that a default-constructed token is
invalid. Consider the following fragment from a CSV file:

    12,

With token_type == std::string this should produce two tokens:
std::string("12") followed by std::string("")
Boost list run by bdawes at acm.org, gregod at cs.rpi.edu, cpdaniel at pacbell.net, john at johnmaddock.co.uk