
Subject: Re: [boost] [safe_numerics] One more review
From: Antony Polukhin (antoshkka_at_[hidden])
Date: 2017-03-11 19:02:49

2017-03-11 21:43 GMT+03:00 Robert Ramey via Boost <boost_at_[hidden]>:
> On 3/11/17 1:00 AM, Antony Polukhin via Boost wrote:
>> Nitpicks:
>> More generic streaming operators could be profitable (better to use
>> basic_ostream<Char, Trait> instead of ostream)
> I want to replace ostream with basic_stream<CharT, CharTraits> but I
> couldn't figure out the template definition syntax for using two separate
> sets of template parameters. Someone knows the answer off hand. Please save
> me some time and post here - after you're sure it compiles.

An example that may help you:
All you have to do is list the typenames from left to right in the
order they appear in the function parameters. Something like this:

template<typename Char, typename CharTraits, typename T>
std::basic_ostream<Char, CharTraits>&
    operator<<(std::basic_ostream<Char, CharTraits>& os,
               const boost::numeric::interval<T>& i)
{
    os << "[" << i.lower() << "," << i.upper() << "]";
    return os;
}

>> Many operations may be decorated with noexcept
> right - I'd be curious to know what happens if one uses noexcept and also
> uses -fno-exceptions in the command line of some compilers.

Works fine on all the compilers I know. To be safer, you could use
BOOST_NOEXCEPT (this will also solve issues with some perverse
compilers that have C++14 features but lack some of the C++11 features,
such as noexcept).

>> There's a bunch of implicit conversions in the library
>> (BTW, have not found a body of this one). Would it be better to make
>> them explicit?
> Ahhh - now a very sore point. Implicit conversions between numeric types are
> supported by the library - just as they are for builtin integer types. This
> is to support the "drop-in" replaceability goal. The difference is that
> these conversions are checked at compile time or execution time.
> tl;dr:
> This is actually a very subtle and difficult issue brought up by Vicente.
> I'm going to paraphrase: aren't we promoting bad practice by encouraging
> implicit conversions? Isn't the way to program correctness paved with
> strong types? I've been a big promoter of this latter view. But I confess that
> making the library has led me to conclude that the future of correct
> programming needs a more nuanced view. In order to promote correct code, we
> need to be able to express our ideas directly in code in a way that matches
> the way we use the ideas in real life. In real life we say x+y with no
> ambiguity. We can't express real numbers in computer hardware, so x+y
> doesn't really capture what we want to say. Requiring explicit conversions
> everywhere is going to make it harder to verify that our program matches
> our expectations. So (checked) implicit conversions are the correct
> approach. They maintain our ability to visually compare/verify that our code
> matches our intention - but they trap the times when our expectation can't
> be actually realized inside the computer hardware.
> I've become convinced that this is the way things have to be.

Two approaches could be mixed:
* for promotions and non-narrowing conversions, an implicit conversion
could be used;
* narrowing conversions could be implemented as explicit conversions
with a runtime check.

But that's just an idea that needs some investigation. However, that
approach is more future-proof: if you later decide to make all the
conversions implicit, this will not break users' code, whereas with the
current approach, making the conversions explicit will break users' code.

Best regards,
Antony Polukhin

Boost list run by bdawes at, gregod at, cpdaniel at, john at