Subject: Re: [boost] [geometry] robustness approaches
From: Simonson, Lucanus J (lucanus.j.simonson_at_[hidden])
Date: 2009-03-13 18:22:48
Fernando Cacciola wrote:
> The library proposed by Lucanus, OTOH, at least acknowledges the
> problem appropriately and approaches a solution in the right direction
> (there are still some important design-influencing considerations, but
> they are not showstoppers)
Thank you, Fernando. I agree with you 100% that robustness needs to be designed in up front and not tacked onto an existing geometry code base as an afterthought. In particular, the robustness strategy needs to be comprehensive and not patchwork. I acknowledge that my library needs a lot of work in this area to make it robust for both integer and floating point, particularly since it was designed only with integer robustness in mind. I will have to redesign the low-level code quite significantly to get both integer and floating point fully robust while sharing the same algorithm. I don't yet have the knowledge I need to make my library floating-point robust; I need to read some of the papers you mention.
Making the coordinate data type a template parameter and "allowing" the user to specify an infinite-precision numerical data type if they want robustness is not a solution, because there are very simple and effective ways to make integer and floating-point operations robust. I don't know whether we need to design in customization points that expose this capability to the user, but it needs to be at least possible for the library developer to specify different strategies for different coordinate data types and different contexts.
I think Brandon's library is trying to handle both integer and floating point robustness, and I've noticed he is at least as interested in multi-precision numerical data types as I am, which is a good sign that he's doing the right things. I'm interested in learning more about his approach.
Performance is fine as a goal, but if the algorithms fail (crash, hang, drop polygons, or "gracefully" throw an exception) even a small percentage of the time due to robustness issues, then they are worthless, regardless of how fast they are. If we tried to use a non-robust library and it failed in a mission-critical situation, it would cost us over one million dollars a day to fix the problem. Failure is not an option. When I say 100% integer robust, I'm serious. It's all or nothing, and the stakes for me are high.
Boost list run by bdawes at acm.org, gregod at cs.rpi.edu, cpdaniel at pacbell.net, john at johnmaddock.co.uk