Subject: Re: [boost] [Beast] Questions Before Review
From: Vinnie Falco (vinnie.falco_at_[hidden])
Date: 2017-06-26 15:46:43


On Mon, Jun 26, 2017 at 7:47 AM, Niall Douglas via Boost
<boost_at_[hidden]> wrote:
> If you have a severe algorithmic flaw in your implementation, reviewers
> would be right to reject your library.

If you are stating that Beast has a "severe algorithmic flaw", then
please back up that claim with more than opinion. However, note the
following:

* At the time Boost.Http was reviewed, it used the NodeJS parser,
which operates in chunks [1]. No "severe algorithmic flaw" came up
then.

* PicoHTTPParser, which Beast's parser is based on, outperforms the
NodeJS parser by over 600% [2].

* For parsers operating on discontiguous buffers, structured elements
such as the request-target, field names, and field values must be
flattened (linearized) before they can be presented to the next
layer, which means temporary storage and buffer copying [3, 4].
Buffer copies therefore cannot be avoided. Beast chooses to perform
one large buffer copy up front instead of many small copies as
parsing proceeds, and the evidence shows this tradeoff is
advantageous (see the sketch after this list).
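
To make the flattening point concrete, here is a minimal sketch
(hypothetical types, not Beast's or NodeJS's actual code) of what a
parser operating on discontiguous buffers has to do when a single
field value straddles two or more receive buffers:

    #include <cstddef>
    #include <string>
    #include <vector>

    // One piece of an HTTP message as it arrived from the socket.
    struct buffer_piece
    {
        char const* data;
        std::size_t size;
    };

    // Flatten (linearize) the pieces that make up one field value.
    // The temporary std::string is exactly the extra storage and
    // copying that discontiguous input forces on the parser.
    std::string
    flatten(std::vector<buffer_piece> const& pieces)
    {
        std::string out;
        for(auto const& p : pieces)
            out.append(p.data, p.size);
        return out; // now contiguous, presentable to the next layer
    }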

But perhaps you are suggesting that functions like basic_fields::insert
should take `gsl::span<gsl::span<char>>` as their first parameter
instead of `string_view` [5]? That would be quite inconvenient for
callers, as sketched below.
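
To illustrate why, compare what callers would have to write under the
two signatures. These are made-up stand-in types for the sake of the
example, not Beast's actual declarations:

    #include <cstddef>
    #include <string_view>
    #include <gsl/span>

    struct fields_taking_string_view
    {
        void insert(std::string_view name, std::string_view value) {}
    };

    struct fields_taking_spans
    {
        void insert(gsl::span<gsl::span<char>> name,
                    std::string_view value) {}
    };

    void caller()
    {
        fields_taking_string_view f1;
        f1.insert("Content-Type", "text/plain"); // literals just work

        fields_taking_spans f2;
        char part1[] = "Content";
        char part2[] = "-Type";
        gsl::span<char> parts[] = {
            gsl::span<char>(part1, 7),  // the caller must assemble the
            gsl::span<char>(part2, 5)   // fragments on every insertion
        };
        f2.insert(parts, "text/plain");
    }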

[1] https://github.com/nodejs/http-parser

[2] https://github.com/fukamachi/fast-http/tree/6b9110347c7a3407310c08979aefd65078518478

[3] https://github.com/vinniefalco/Beast/blob/8982e14aa65b9922ac5a00e5a5196a08dfa8f29e/test/benchmarks/nodejs_parser.hpp#L664

[4] "In case you parse HTTP message in chunks...your data callbacks
may be called more than once"
https://github.com/nodejs/http-parser/blob/master/README.md#callbacks

[5] https://github.com/vinniefalco/Beast/blob/8982e14aa65b9922ac5a00e5a5196a08dfa8f29e/include/beast/http/fields.hpp#L365

