From: todor_at_[hidden]
Date: 2001-10-20 05:43:34
From: <jbandela_at_[hidden]>
> I have posted a test version of tokenizer to tokenizer_optimize.zip
> in the files section.
Why do you need this BogusIterator<>/is_valid()/set_invalid()
construct? What's wrong with simply comparing cur() with
get_base_end()?
Also, you should not assume that a default-constructed token
is invalid. Consider the following fragment from a CSV file:
12, ""
With token_type == std::string, this should produce two tokens:
std::string("12") followed by std::string("").
Todor
Boost list run by bdawes at acm.org, gregod at cs.rpi.edu, cpdaniel at pacbell.net, john at johnmaddock.co.uk