Subject: Re: [boost] [safe_numerics] Formal review starts today
From: Robert Ramey (ramey_at_[hidden])
Date: 2017-03-09 18:31:59
On 3/9/17 7:23 AM, Paul A. Bristow via Boost wrote:
>> Here are some questions you might want to answer in your review:
>> - What is your evaluation of the design?
> Complicated to use (compared to int),
Hmmm - a major design goal is that it be a no-brainer to use. I'd like
to see you expand a little on this comment.
and very, very complicated to write.
LOL - it's worse than it looks. There are a lot of difficulties with
something like this. One is that it's hard to stick to the subject and
keep in mind what one's idea of the original purpose of the library is.
But this is common in almost all library efforts. A major difficulty in
an elaborate TMP project is that the compiler doesn't really support it
very well. Error messages are often unrelated to the mistake and very
unhelpful. There is no convenient way to display messages which
display type information at compile time. It would be useful to have a
BOOST_STATIC_WARNING that works. These problems can only really be
addressed by enhancements to the compiler itself. The fact that none of
these have been proposed suggests to me that TMP is a lot like teenage
sex - there's a lot more talk about it than is actually going on.
> But that is a consequence of the daft design of the C language.
I want to defend the C language design. My first exposure to C was
around 1980. I was introduced to Ratfor, a C-like language implemented
in Fortran, described in the Plauger book "Software Tools". After that
I lusted after a C language tool. The machine I had at my disposal was
a Data General Eclipse - a 16 bit architecture. I got hold of the
source code for a Z80 C compiler and managed to port it to the DG
machine. Then I managed to compile the standard source with my
compiler. I was hooked! The C language then was the world's first
"portable assembler". This was huge at the time. BTW - it can be (and
often is) used in this way even today.
The "problem" is that we evolved the original C language into something
entirely different - a system for describing and implementing higher
order ideas in an abstract and portable manner. It was done through
evolution. Dozens (hundreds) of attempts to build something like that
from scratch (Fortran, COBOL, Pascal, Modula, Ada) all failed for one
reason or another. I believe in evolution. I believe it is the source
of progress and success in all things - but it always leaves me
disappointed and wanting something more perfect - which I can never get.
> The language also doesn't allow use of hardware to handle overflow (and underflow)
When I started this project I was thinking that things would be better
if hardware handled these things. But now that I look at what I've
achieved, I think I was wrong. A key feature of this library is that it
allows one to select how to handle these and other similar situations.
Currently one can detect and handle these at runtime, detect potential
problems and fail at compile time, or promote types to larger types to
avoid the problems - thereby "fixing" the original "daft" design of the
C language (and likely creating your own). Perhaps other ideas are
possible. Some have been mentioned - "saturation", for example.
> and doesn't have arrays etc as first class citizens.
I don't agree with this - but arguing about 30 year old decisions is a
sign of old age.
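To make the policy ideas above concrete, here is a minimal sketch of the "detect and handle at runtime" option. This is not the library's actual API - just a hypothetical `checked_add` that refuses to produce an incorrect result:

```cpp
#include <limits>
#include <stdexcept>

// Hypothetical sketch of the "detect and handle at runtime" policy:
// check the operands before adding and throw instead of silently
// wrapping around.  This is NOT the library's API.
int checked_add(int a, int b)
{
    if (b > 0 && a > std::numeric_limits<int>::max() - b)
        throw std::overflow_error("a + b overflows int");
    if (b < 0 && a < std::numeric_limits<int>::min() - b)
        throw std::overflow_error("a + b underflows int");
    return a + b;  // now known to be representable
}
```

The library hides this kind of check behind operator overloads, so ordinary expressions like `x + y` get it automatically.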
> I don't believe that it is really a idiot-proof drop-in replacement for the built-in integral types,
Well, that is what I've intended and I think I've mostly achieved it.
> but it will be fine to be used for new projects, especially as it
> needed the fancy features of C++14.
One of the major use cases I see for this is finding errors in legacy
code. I included a small example which touches on this. I actually
have a larger example - the code is in there. But since it's a real
(and realistic) example - a stepping motor controller running on an 8
bit PIC microcontroller - it was a little more complex than normal and
I didn't have time to finish the narrative which goes with the example.
I hope to get back to it in the future.
I have done multiple projects with such tiny microcontrollers. One big
issue with these is the development environment, which doesn't lend
itself to unit testing. Mostly this is addressed just by skipping unit
testing - using the burn (the chip) and crash (the program - but these
days it might mean the car itself) approach. But I like to factor out
critical pieces into modules which can be compiled and unit tested, and
I can create test vectors which run all the test cases and end points.
When a unit test fails I can step through with the debugger. The cpp
policy implements safe types and checking with the correct sizes for
the microcontroller. I can compile and debug the tests in an
environment which closely tracks the target one.
By using the compile time trap as the exception policy I can also
detect potential failures and tweak my code so that they can never
happen. Then I can build and run on the target environment with much
more confidence that there are no bugs.
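The compile-time trap idea can be illustrated with a constexpr function - again a hypothetical sketch, not the library's actual policy classes. When the operands are compile-time constants, an overflow makes the expression no longer a constant expression, so the build fails instead of the program:

```cpp
#include <limits>
#include <stdexcept>

// Hypothetical sketch of the "trap at compile time" idea.  In a
// constant expression, reaching the throw is ill-formed, so an
// overflowing constant computation fails to compile.
constexpr int trapped_add(int a, int b)
{
    return (b > 0 && a > std::numeric_limits<int>::max() - b)
               ? throw std::overflow_error("overflow")
               : (b < 0 && a < std::numeric_limits<int>::min() - b)
                     ? throw std::overflow_error("underflow")
                     : a + b;
}

static_assert(trapped_add(1, 2) == 3, "fine");
// constexpr int bad = trapped_add(std::numeric_limits<int>::max(), 1);
//   // would fail to compile - the potential failure is caught in the build
```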
To me this is absolutely huge. But it's a very, very, very tough sell.
People who work in the microcontroller world see C as portable
assembler. People who work in the C++/Boost world see C++ as a way of
addressing general problems in a more abstract, mathematical way.
I'm hoping to bridge two worlds here. I'm not hopeful. I'm a sucker
for lost causes.
> I agree that explicit conversions are the Right Thing,
I used to believe that. Now my view is "not always"
> but they do carry a cost to the users -
> the need to understand the issues and take care with construction and
> assignment because this is a User Defined Type (UDT)
> and the special-case privileges of built-in do not apply (another daft
> Floating-point and fixed-point UDT have proved unintuitive to use
> because of explicit conversions;
> there are big elephant traps awaiting the unwary.
Right - I'm thinking that the library traps at compile time when one
does the following. I'll have to recheck this.
i = 1.0
i = static_cast<int>(1.0)
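One way a UDT can arrange that - sketched here with a hypothetical `safe_int`, not the library's actual class - is to delete the floating-point assignment overload, so the implicit conversion fails to compile while the explicit cast is accepted:

```cpp
// Hypothetical sketch: reject lossy floating-point assignment at
// compile time by deleting that overload.
struct safe_int
{
    int value = 0;
    safe_int& operator=(int v) { value = v; return *this; }
    safe_int& operator=(double) = delete;  // i = 1.0 -> compile error
};

// safe_int i;
// i = 1.0;                    // error: use of deleted function
// i = static_cast<int>(1.0);  // OK - the conversion is explicit
```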
> Costly at compile time because of the number of libraries included, but that's a cost worth paying.
And I don't think it's all that costly. I'm using a 3 year old mac mini
with Xcode/clang for development and all tests compile in a few seconds.
Not a big deal in my opinion.
> I like that the infrastructure might be extended to other than integral types.
>> - What is your evaluation of the implementation?
> Above my pay grade.
LOL - that makes two of us. That's why we have obi-wan-watanbi.
>> - What is your evaluation of the documentation?
> Good exposition of the problems and solutions.
> Good examples.
> Links to source of examples code would be *very* useful.
easy one - I'll do this. I'll also add the output from the examples
which is another idea that has been mentioned.
Starting with an example is the most common way of 'getting started'.
> Will using "" file specification
> #include "../include/safe_range.hpp"
> instead of <>
> #include <../include/safe_range.hpp>
> cause trouble for users trying to use/modify the examples in Boost or their own folder position?
The version being reviewed in the incubator is meant to be built and
tested outside the boost infrastructure. The boost testing
infrastructure supports the creation of the test matrix and centralized
testing. But in my opinion it is an impediment to one who just wants to
test a single library not already included in boost.
Boost libraries sort of "presume" the centralized setup promoted by b2.
I believe that this is also an obstacle to mixing/matching/testing
other libraries.
In any case, if this library is accepted, the include file structure
will have to evolve to the boost way of doing things. Not a big issue
either way.
> The common need for overflow (or underflow) to become 'saturation' == max_value (or min_value) is not an option (but then it is arithmetically 'wrong', I suppose ;-))
Right - this has come up. But I declined to address it for a few reasons:
a) Things are already really complex - I'm at my brain's limit.
b) I don't want to pollute the overriding idea:
"This library provides a method for the brain dead to guarantee that his
program will not produce incorrect arithmetic results."
Adding more "features" dilutes the conceptual power of the library and
makes it impossible to describe what it does in one sentence.
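For readers unfamiliar with the term, the quoted "saturation" suggestion means clamping to the representable range instead of overflowing. A minimal sketch - explicitly not a feature of this library:

```cpp
#include <limits>

// Hypothetical saturating add: widen to a larger type, then clamp to
// int's range.  Arithmetically "wrong", as the quoted comment notes,
// but sometimes what embedded code wants.
int saturating_add(int a, int b)
{
    long long r = static_cast<long long>(a) + b;  // cannot overflow
    if (r > std::numeric_limits<int>::max())
        return std::numeric_limits<int>::max();
    if (r < std::numeric_limits<int>::min())
        return std::numeric_limits<int>::min();
    return static_cast<int>(r);
}
```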
>> - What is your evaluation of the potential usefulness of the library?
> Very valuable to make programs that 'always work' instead of 'mostly work'.
Ahhh - this is the thing for me. Imagine how you're going to feel when
your microcontroller is included in a product which ships 10000 units
and a bug is discovered. How are you going to feel if that product is
an automobile accelerator or a Mars landing probe? How, after 50 years
of computer systems and computer language development, have we arrived
at this situation? I'm mystified and disheartened.
> Users might appreciate a list of compiler versions that really do work.
> Sadly 'Compliant C++14' is a bit fuzzy. (Should become clearer if accepted and test matrix visible).
LOL - I try to follow the standard - but it's not really possible to
even read it. I rely on testing and known limitations. But it's the
best we have. So far it's worked pretty well. I can build on Clang with
no warnings and on recent versions of GCC and produce the exact same
results. I would love to test on VC but I don't have such a machine.
I've tried to set up appveyor.yml to do this but have been unsuccessful.
I'm interested if someone wants to look into this.
> A good effort at working round fundamental C language defects.
"A good effort" - a ringing endorsement! It's OK - one of my squash
partners - a female - says I'm a "good sport". Doesn't bug me though.
>> - Did you try to use the library? With what compiler? Did you have any problems?
> Had a very brief try with VS 2015 with /std:c++latest added to command line (to try to ensure C++14 compliance) but am stuck with
I'd be curious to see the example you tried. Did you try to run the
test suite and/or examples?
But this reinforces the suggestion that I should put the concept
checking - such as it is - into the library code to help detect usage
misunderstandings. I'll look into this.
> 1>j:\cpp\safe_numeric\safe_numeric\include\interval.hpp(107): error C2737: 'boost::numeric::`anonymous-namespace'::failed_result': 'constexpr' object must be initialized
> 1>j:\cpp\safe_numeric\safe_numeric\include\safe_base_operations.hpp(131): error C2244: 'boost::numeric::safe_base<T,Min,Max,P,E>::operator =': unable to match function definition to an existing declaration
> But I am sure that this is entirely my fault, but I'm out of my time allotted to this review.
I don't think it's your fault. I think that's an indication that I've
fallen short of my goal that the library be idiot proof. I shouldn't
have to say this, but that does not mean that I think you're an idiot -
rather that if it's not simple for you, it's not going to be simple for
an idiot, which is my goal.
This reinforces the suggestion that I should put the concept checking -
such as it is - into the library code to help detect usage
misunderstandings. I'll look into this.
> Also - Is this warning(s) avoidable/relevant/quietable?
> j:\cpp\safe_numeric\safe_numeric\include\safe_base.hpp(233): warning C4814: 'boost::numeric::safe_base<T,Min,Max,P,E>::operator =': in C++14 'constexpr' will not imply 'const'; consider explicitly specifying 'const'
This looks very easy to fix - but my compiler doesn't emit these
warnings. Maybe it would with some switch. I'll look into it.
> I feel that all warnings should be avoided or suppressed using push'n'pop pragmas.
My goal is to closely stick to C++14. So I shouldn't have any warnings.
I believe this is possible with some minor code tweaking.
Thanks for your comments. They are very, very helpful.
> Paul A. Bristow
> Prizet Farmhouse
> Kendal UK LA8 8AB
> +44 (0) 1539 561830
> Unsubscribe & other changes: http://lists.boost.org/mailman/listinfo.cgi/boost
Boost list run by bdawes at acm.org, gregod at cs.rpi.edu, cpdaniel at pacbell.net, john at johnmaddock.co.uk