Subject: Re: [boost] [safe_numerics] Last three days
From: Andrzej Krzemienski (akrzemi1_at_[hidden])
Date: 2017-03-11 13:55:38


2017-03-10 19:56 GMT+01:00 Robert Ramey via Boost <boost_at_[hidden]>:

> On 3/10/17 9:22 AM, John Maddock via Boost wrote:
>
>>
>>> I'm very, very concerned that there are only a very few reviews
>>> (actually really just one !!!). In the past I've railed against the
>>> acceptance of libraries with only two reviews !!! I don't really know
>>> what else to say about this. I'll just punt to the review manager.
>>>
>>>
>> I think the problem is this: normally we review largely based on
>> interface and the design - get the design right and the internals
>> usually take care of themselves. However, in this case the design is
>> (hopefully) exceptionally uncontroversial - it looks like an int, smells
>> like an int, and behaves like an int. There really isn't much to get
>> your teeth into there. What really matters is that:
>>
>> * It's functionally correct.
>> * It truly is a drop-in replacement for type int, with no nasty surprises.
>> * Its performance compared to int isn't so dreadful that no one uses it.
>>
>> Unfortunately reviewing these points requires some exceptionally
>> detailed work: the internals of the library are sufficiently complex,
>> and use enough unfamiliar (to me at least) C++14 features, that this is
>> not an easy task.
>>
>
> This is a very believable explanation.
>
>> I confess at present to be deeply surprised at how
>> complex the internals are...
>>
>
> If it makes any difference - it started out a lot simpler. Then it
> became apparent that the issue of performance could not be ignored in the
> real world. This meant detecting and filtering out redundant checking, on
> which I was stuck - until C++14 generalized constexpr, which permitted an
> implementation of compile-time integer arithmetic (which actually
> could/should be a separate library). This in turn motivated making the
> checked integer routines constexpr. My interest in and concerns for
> embedded systems and compile-time usage of checked integer operations
> introduced a constexpr checked_result - basically a kind of monad one can
> use at compile time. All in all it's the composition of several libraries,
> each of which could/should be a separate Boost library: checked_result,
> checked integer arithmetic, and integer interval arithmetic. At the top is
> a layer which defines a safe_integer type and uses enable_if to overload
> all the binary and unary operations involving safe integers. The
> implementation of this last layer would be more concise using
> concepts-lite in C++20 or Paul Fultz's Tick library, but neither of those
> is in Boost or in the standard yet. All of the above is implemented via
> constexpr where possible.
>
> So as one can see it's actually pretty simple.
>
> It's only now that I've "finished" it that I see how simple it is.
>
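(As an aside, for readers who have not looked inside the library: the
checked_result "monad" mentioned above can be pictured with a minimal,
purely illustrative sketch like the one below. The names here are
placeholders of mine, not the library's actual identifiers; the point is
only that C++14 relaxed constexpr lets a value-or-error type be built and
inspected entirely at compile time.)

    #include <cstdint>
    #include <limits>

    // Illustrative only: a tiny constexpr "value or error" type in the
    // spirit of the checked_result described above.
    enum class safe_error { none, overflow };

    template <class T>
    struct checked {
        T value;
        safe_error error;
        constexpr bool valid() const { return error == safe_error::none; }
    };

    // Checked addition usable at compile time (C++14 relaxed constexpr).
    constexpr checked<std::intmax_t>
    checked_add(std::intmax_t a, std::intmax_t b) {
        const auto hi = std::numeric_limits<std::intmax_t>::max();
        const auto lo = std::numeric_limits<std::intmax_t>::min();
        if (b > 0 && a > hi - b) return {0, safe_error::overflow};
        if (b < 0 && a < lo - b) return {0, safe_error::overflow};
        return {a + b, safe_error::none};
    }

    static_assert(checked_add(1, 2).valid(), "in range");
    static_assert(!checked_add(std::numeric_limits<std::intmax_t>::max(),
                               1).valid(),
                  "overflow detected at compile time");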
> Steven's exhaustive line-by-line review of the code is going to be a very
> tough act to follow. It appeared on the second day of the review. Maybe
> that intimidated people - it would intimidate me. He's pointed out errors
> which I've agreed to fix, so repeating this process, though it wouldn't
> hurt, might be somewhat redundant.
>
> Perhaps you might take a different tack. I've spent a little time looking
> at the Boost.Multiprecision library with an eye to incorporating it into
> the safe numerics testing. In the course of doing this, a couple of
> interesting things occurred to me.
>
> a) would Boost.Multiprecision types inter-operate with safe numerics
> types? They should - but I haven't actually tested this. Whenever I fail
> to test something, there's almost always a bug in it.
>
> b) would safe<T> work if T is one of the types defined by
> Boost.Multiprecision? This is unclear to me. safe numerics presumes that
> the largest types available are std::uintmax_t and std::intmax_t. It's
> easy to imagine altering this presumption so that the largest types are
> user-defined ones like boost::uint512_t or... . I think this would work
> with minor changes - and such a combination would open up whole new
> territory.
>

FWIW, this is something different, but I tested safe<T> with
boost::rational, and at least a simple example works as expected:
https://github.com/robertramey/safe_numerics/issues/34
Maybe it is worth adding it to the tests, examples, and docs.
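
To make the kind of test being discussed concrete, here is a rough sketch.
It is only a sketch: the include path and the boost::numeric namespace are
my guesses at the layout of the library as submitted, and the
multiprecision line is exactly the untested open question from (b), so it
is left commented out.

    #include <boost/rational.hpp>
    #include <boost/multiprecision/cpp_int.hpp>
    #include "safe_integer.hpp"   // header path/namespace per the submitted
                                  // library; adjust if it differs

    int main() {
        using boost::numeric::safe;

        // boost::rational with checked integers as the underlying type --
        // roughly the sort of simple example referenced in issue #34
        // (the exact example there may differ).
        boost::rational<safe<int>> r(1, 3);
        r += boost::rational<safe<int>>(1, 6);   // expect 1/2

        // safe<T> over a multiprecision type -- the open question in (b);
        // it presumes safe<T> does not hard-code std::intmax_t limits.
        // safe<boost::multiprecision::uint512_t> big = 1;

        return 0;
    }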

> Maybe if you confined the scope of your review to issues such as the above
> you could save a lot of time and bring up issues that others are not in a
> position to raise.
>
> Finally, my biggest disappointment in all this has been my inability to
> get people to take this whole effort seriously. That is, to even admit
> that there is a problem. It's inexplicable and disheartening to me that
> one can write something like x + y and not be confident that the result
> actually equals x + y. Especially in light of the fact that we're using C++
> to make flying cars - not just websites. I feel that for the first time in
> 60 years we're in the position of making demonstrably correct software -
> and no one cares. It's very frustrating.
>
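(For anyone who has not been bitten by this in practice, the concern is as
mundane as the following, which compiles cleanly under typical settings:)

    #include <climits>

    int main() {
        int x = INT_MAX;
        int y = 1;
        int z = x + y;      // undefined behaviour: whatever z holds, it is
                            // not the mathematical value of x + y
        unsigned u = 0;
        unsigned v = u - 1; // well defined, but v == UINT_MAX, not -1
        (void)z;
        (void)v;
        return 0;
    }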

Maybe this is advertised incorrectly, or maybe there is confusion about
what to expect from this library. If one sees `safe<int>`, one might
think, "with this library, using ints will be safer". But what does this
mean?

   1. Can I use this library to *test* whether my operations overflow? --
   yes, it can do that.
   2. Can I use it like asserts, i.e. redefine its semantics in "Debug" and
   "Release" builds by swapping the policies? -- yes, it can do that; but
   does the documentation say anything about it? (A sketch of this usage
   follows after this list.)
   3. When I use this library, do I no longer have to care about overflow
   bugs, because the library will take care of them for me (like GC does
   for memory management)? -- no, but is the library clear about this?
   4. When I use this library, will I never have an overflow again? -- no,
   but will users understand that when they first get in touch with your
   library?
   5. When I use this library, do I no longer have to check whether I am
   dividing by zero? -- OK, the library will do the check, but that would
   actually reduce quality if you do not check for this yourself.
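
Regarding item 2, the sketch below shows the general "asserts-like"
pattern I mean: the response to a detected problem is a policy chosen at
compile time, so "Debug" and "Release" builds can behave differently,
exactly as with assert()/NDEBUG. The policy classes here are placeholders
of mine, not the library's actual identifiers; in the library itself the
policy would be a template parameter of the safe type, while here it is a
free function template just to keep the sketch self-contained.

    #include <limits>
    #include <stdexcept>

    struct throw_on_error {            // "Debug": report the problem loudly
        static void on_overflow() { throw std::overflow_error("overflow"); }
    };
    struct ignore_error {              // "Release": behave like plain int
        static void on_overflow() {}
    };

    #ifdef NDEBUG
    using checking_policy = ignore_error;
    #else
    using checking_policy = throw_on_error;
    #endif

    template <class Policy = checking_policy>
    int checked_add(int a, int b) {
        if ((b > 0 && a > std::numeric_limits<int>::max() - b) ||
            (b < 0 && a < std::numeric_limits<int>::min() - b))
            Policy::on_overflow();     // throws in Debug, does nothing in
                                       // Release (leaving native behaviour)
        return a + b;
    }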

There exists a trend where a piece of software is considered "safe" or
"correct" only because every function has an artificially widened
contract: it does an if-statement up front and throws an exception if its
(hidden) narrow contract is violated.
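
To make the distinction concrete (my example, not one from the library):

    #include <cassert>
    #include <stdexcept>

    // Narrow contract: the caller promises d != 0; the check is only a
    // debugging aid, and a violation is a bug in the caller.
    int divide_narrow(int n, int d) {
        assert(d != 0);
        return n / d;
    }

    // Artificially widened contract: d == 0 is accepted as a "valid" input
    // and answered with an exception, so every caller now has to be
    // prepared to handle it.
    int divide_wide(int n, int d) {
        if (d == 0) throw std::invalid_argument("division by zero");
        return n / d;
    }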

Does your library take or promote this approach or not? -- One cannot
immediately see the answer from the documentation.

Regards,
&rzej;

