
Subject: Re: [boost] Formal Review: Boost.Polygon starts today August 24, 2009
From: Brandon Kohn (blkohn_at_[hidden])
Date: 2009-09-02 12:26:35


- What is your evaluation of the design?

The overall design (concept refinement) seems to consistently describe
the most useful specializations of polygons in a reasonable way.

Some of the elements, such as isotropy, seem to be more biased towards the
45/90 specializations of polygons and might be confusing. For example,
when I think of orientation_2d, my intuition is that the states refer
to the orientation of a point with respect to a line, plane, etc. (i.e.
left, right, collinear). After reading a bit more, I think that what I
call 'orientation' is what GTL calls 'direction'. I'm not quite sure
what the uses of the orientation_xd types/values are (at least not in
general). It's certainly sensible to quantify the topological
relationships between geometries, and I can appreciate that this is the
goal here. Perhaps some examples of common use cases for each would help?
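
For clarity, here is the kind of predicate I have in mind when I say
'orientation'. This is plain C++ for illustration only, not GTL code:

// Classify point c relative to the directed line through a and b
// (what I would call the 'orientation' of c): left, right, or collinear.
enum side { collinear = 0, left_side = 1, right_side = -1 };

side classify(long long ax, long long ay,
              long long bx, long long by,
              long long cx, long long cy)
{
    // sign of the cross product (b - a) x (c - a)
    long long cross = (bx - ax) * (cy - ay) - (by - ay) * (cx - ax);
    if (cross > 0) return left_side;
    if (cross < 0) return right_side;
    return collinear;
}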

Otherwise, the overall design of the library seems to be pretty good.

- What is your evaluation of the implementation?

I think that it would be good to have a ready-made general polygon with
floating point coordinates. Here my concerns are not so much about
precision as they are about overflow. In my work I routinely deal with
calculations that would tend to overflow in 32-bit integral math. I'm
not certain of the effects of upgrading them to use 64 bits (it's better,
of course, but still has less range than, say, a 64-bit float). I'm
sure there are plenty of good reasons to argue for making users convert
to integral types. However, that really ignores the 800 lb. gorilla
when you consider the amount of legacy geometry code we have using
floating point. If users have to convert all their FP polygons to
integral types and then back, the performance cost alone could be
prohibitive. When I started looking for a boolean op library
a few years ago, PolyBoolean was one I investigated, but given the
conversion costs to its integral-type grid, we dismissed it and settled
for the CSG operations bundled with OpenGL. My point here is that if
there is any way that floating point can be used with reasonably robust
results, then it should be wholly supported. Perhaps it already is, but
because I encountered a compile error when I changed:

typedef gtl::polygon_data<int> Polygon;
to:
typedef gtl::polygon_data<double> Polygon;

I can't be sure. If I had had more time to review I would have tried
with the custom option.
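
To make the overflow concern concrete, here is the sort of intermediate
computation I mean (plain C++, nothing GTL-specific); the inputs fit
comfortably in 32 bits, but the products do not:

#include <iostream>

int main()
{
    // Coordinates that are individually well inside 32-bit range...
    int x1 = 100000, y1 = 200000;
    int x2 = -150000, y2 = 175000;

    // ...but whose cross product overflows 32-bit arithmetic:
    int bad = x1 * y2 - y1 * x2;                               // overflows, result is garbage
    long long good = (long long)x1 * y2 - (long long)y1 * x2; // widen before multiplying

    std::cout << bad << " vs " << good << std::endl;           // 47,500,000,000 in the good case
    return 0;
}

Widening the intermediates to 64 bits handles this case, but with 64-bit
coordinates the same issue would reappear in the products, which is part
of my concern above.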

Another thing I noticed was the algorithm for building a connectivity
graph/polygon merge. This would be best done if it adhered to the
BGL concepts for the graph interface. If I'm going to build a graph, I
would certainly want to be able to do post-processing on it using the
BGL algorithms. I'm going to take a wild guess that the graphs from GTL
are used to represent a doubly connected edge list (or DCEL). For those
who don't know, this is like a planar map with bi-directional edges. I
noticed that BGL now has one of these (a planar map, though undirected) and
could probably do with a DCEL implementation. Having that coupled with
the solid boolean algorithms in GTL would be an incredibly useful
combination.
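
Supposing, for the sake of illustration, that the connectivity extraction
hands back adjacency as something like a std::vector<std::set<int> > (an
assumption about the output shape on my part, not GTL's actual interface),
getting it into BGL for post-processing is short work:

#include <iostream>
#include <set>
#include <vector>
#include <boost/graph/adjacency_list.hpp>
#include <boost/graph/connected_components.hpp>

int main()
{
    // Hypothetical connectivity result: polygon i touches the polygons in adj[i].
    std::vector<std::set<int> > adj(4);
    adj[0].insert(1); adj[1].insert(0);
    adj[2].insert(3); adj[3].insert(2);

    typedef boost::adjacency_list<boost::vecS, boost::vecS, boost::undirectedS> Graph;
    Graph g(adj.size());
    for (std::size_t i = 0; i < adj.size(); ++i)
        for (std::set<int>::const_iterator j = adj[i].begin(); j != adj[i].end(); ++j)
            if (i < static_cast<std::size_t>(*j))
                boost::add_edge(i, *j, g);

    // Any BGL algorithm now applies, e.g. grouping mutually touching polygons.
    std::vector<int> component(boost::num_vertices(g));
    int groups = boost::connected_components(g, &component[0]);
    std::cout << groups << " connected groups" << std::endl;
    return 0;
}

If the library exposed its graph through the BGL concepts directly, even
this small conversion step would be unnecessary.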

- What is your evaluation of the documentation?

The format is good. It generally contains the information I needed to
learn what things mean. I think a tutorial would go a very long way
toward gaining general acceptance.

- What is your evaluation of the potential usefulness of the library?

Immense (if the floating point version is easy to use and not too slow).
It still has a lot of value even for integral use. At the end of the
day, if you can deal with the conversion, getting robust output from a
map overlay is quite an achievement (IMO).

- Did you try to use the library? With what compiler? Did you have any
problems?

I toyed with a few examples. MSVC9. I only had problems when I tried to
use double as the coordinate type of polygon_data<T>.

- How much effort did you put into your evaluation? A glance? A quick
reading? In-depth study?

A quick reading best characterizes my review (due to time constraints...)

- Are you knowledgeable about the problem domain?

Yes.

My vote would be to accept the library into Boost with the following
provisions:

1) If possible, make stock general polygon types available for the most
common floating point types (ready-made typedefs, in the way std::string
is a ready-made std::basic_string<char>). My assumption here is that this
mode of work is viable using the lib as-is. If it turns out not to be
robust enough to recommend, at least those lessons will have been learned
and documented.

2) (Assuming 1) Investigate and document some of the most common
pitfalls when using floating point types and/or converting to/from
integral types (a sketch of the kind of round-trip conversion I have in
mind appears after this list).

3) Create a tutorial illustrating reasonable use-cases for the library
(I find these to be the most useful of all the docs in other libs... at
least until I already know the basics :D).
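
To illustrate the kind of pitfall I mean in provision 2, here is a round
trip through a hypothetical integer grid (plain C++; the resolution and
the snapping rule are my own assumptions, not anything prescribed by the
library):

#include <cmath>
#include <iostream>

int main()
{
    // Snap a double coordinate to an integer grid, then convert back.
    const double resolution = 0.001;                  // chosen grid spacing in model units
    double x = 1.23456789;

    int ix = (int)std::floor(x / resolution + 0.5);   // round to nearest grid point
    double back = ix * resolution;

    // back is roughly 1.235: the sub-resolution detail is lost for good, and
    // nearly-collinear edges can become exactly collinear (or cross) after snapping.
    std::cout << x << " -> " << ix << " -> " << back << std::endl;
    return 0;
}

Documenting the recommended resolution/scaling strategy alongside numbers
like these would save users a lot of trial and error.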

Good job on getting this together, Luke (and co.).

Best of luck,

Brandon

