From: Roland Richter (roland_at_[hidden])
Date: 2003-03-11 07:20:21
I need to split a string into tokens, and split the
resulting tokens again in a second pass. Currently,
I do this with boost::tokenizer initialized
with an escaped_list_separator.
The problem is that all the quote characters are
swallowed during the first pass, which makes
things rather ugly in the second pass. One
workaround I thought of, namely to use two different
(sets of) quote characters, is unacceptable in our
case. Are there any other workarounds?
If not, it would be rather nice to have a switch
to tell escaped_list_separator whether it should
drop quotes or keep quotes.
Boost list run by bdawes at acm.org, gregod at cs.rpi.edu, cpdaniel at pacbell.net, john at johnmaddock.co.uk