
Subject: Re: [boost] Boost.HTTPKit, a new library from the makers of Beast!
From: Vinícius dos Santos Oliveira (vini.ipsmaker_at_[hidden])
Date: 2018-01-01 17:49:43

2017-10-12 5:38 GMT-03:00 Seth via Boost <boost_at_[hidden]>:

> On 12-10-17 06:47, Vinícius dos Santos Oliveira via Boost wrote:
> > What do you think about the following links?
> What is the relevance of the links? They're extremely broad and general.
> If you are suggesting that the implementation of the parser interface
> should use parser combinators/generators, for sure. That is not
> necessarily an interface design concern.

I coded a new example for you. It took me some time because I wanted to
code other features. There was no serious test for the response parser
(aside from the test suite), so this was one of the things I've coded.

Anyway, in the tutorial I've mentioned the problem of composability: a
chain of transformations. This is an architecture that resembles GStreamer
filter elements. It also resembles iterators/ranges. And it's an
architecture which resembles the "base our macros on tokens rather
than AST nodes" decision from recent Rust developments.
It's _not_ a toy, it's a popular solution. I'm not _forcing_ this design
(you can use the parser while ignoring this possibility). I'm merely
pointing out that this design can be useful to the user, and one of the two
models rules out that option. Here you can see how you can wrap the parser
to force "atomic header fields" and simply ignore the case where these
tokens are delivered separately (you could do the same for any tokens):

A pretty powerful change. In the same example, you can still provide your
own wrappers and have yet another transformation happen behind the scenes.
I've coded an example:

I've listed a few applications in the top comment of the example. They are
not artificial possibilities. Some were based on user comments (e.g. "I want
to store only a specific set of headers", "I want to reject requests
sooner"...). The list could go on and on, so it's useless to try to guess
what the user will want. This concept is incredibly powerful: just by
allowing my higher layer to customize the parser, I've got plenty of
possibilities. The same would happen for _anyone_ using this parser.

And even if you completely ignore which higher-level interface the user is
interested in designing... the parser is easy to use. And the parser just
got easier with the changes I pushed a few moments ago (changes which were
predicted).

You asked me what the relevance of these links is. In this discussion I've
had to repeat information over and over using different strategies. At
least I assume you won't ask about the relevance of the other two links,
which are direct examples of how to design parsers.

From my point of view, it was tiresome that the relevance was missed to
begin with. Let's do this: I won't give hints about how the future should
look, so neophobia shouldn't strike anyone (though neophobia is a poor term,
given that I'm only talking about old concepts). I'll come back next month
with another nice thing.

> What _specific_ interface design concerns do you have in mind when
> linking these kind of general purpose libraries/approaches? Are you
> proposing a parser combinator library for Boost or the standard? (Would
> Spirit X3 fit the bill?).

For now? These:

Once we change course, I can proceed to discuss what the pull parser
should look like.

On 12-10-17 06:47, Vinícius dos Santos Oliveira via Boost wrote:
> > I took the liberty to convert the Tufão project that you've
> > mentioned to use the Boost.Beast parser:
> >
> 1b4aea377f993592bc2c0d77
> That is nice and tangible. Let's focus on concrete shortcomings,
> relevant to the library interface.

What is the whole tutorial I've written? It's full of “design implications”.


Vinícius dos Santos Oliveira

Boost list run by bdawes at, gregod at, cpdaniel at, john at