
From: todor_at_[hidden]
Date: 2001-10-20 05:43:34

From: <jbandela_at_[hidden]>

> I have posted a test version of tokenizer to
> in the files section.

Why do you need this BogusIterator<>/is_valid()/set_invalid() machinery?
What's wrong with simply comparing cur() with get_base_end()?

Also, you should not make the assumption that a default-constructed
token is invalid. Consider the following fragment from a CSV file:
  12, ""
With token_type == std::string this should produce two tokens:
  std::string("12") followed by std::string("")


Boost list run by bdawes at, gregod at, cpdaniel at, john at