Subject: Re: [boost] [geometry] area, length, centroid, behavior
From: Simonson, Lucanus J (lucanus.j.simonson_at_[hidden])
Date: 2009-03-09 13:34:51
Barend Gehrels wrote:
> Hi Luke,
>> I'm still confused. Why isn't scanline suitable for floating point?
>> Isn't generalize line intersection with floating point generally
>> implemented as a scanline patterned after Bentley-Ottmann?
> OK, you mean the sweep line. Yes, that is suitable for floating point,
> sure. So we agree.
Is scanline the wrong term? I've always thought it was interchangeable with sweep line, plane sweep, line scan and line sweep. If it is associated with some particular polygon fill implementation or some such, and sweep line is the general term, I'd be glad to know of it.
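Just so we're talking about the same thing, here is the shape of the sweep I have in mind, boiled down (my own sketch, not anybody's library code): sweep the endpoints left to right and only test segment pairs whose x-ranges overlap at the sweep position. A real Bentley-Ottmann additionally keeps the active set ordered by y at the sweep line and schedules intersection events; I've left that out to keep the sketch short.

// Stripped-down plane sweep over segment endpoints (illustration only).
#include <algorithm>
#include <iostream>
#include <vector>

struct Pt { double x, y; };
struct Seg { Pt a, b; };

// Proper-intersection test via orientation signs.
static bool intersects(const Seg& s, const Seg& t)
{
    auto cross = [](Pt o, Pt p, Pt q) {
        return (p.x - o.x) * (q.y - o.y) - (p.y - o.y) * (q.x - o.x);
    };
    double d1 = cross(t.a, t.b, s.a), d2 = cross(t.a, t.b, s.b);
    double d3 = cross(s.a, s.b, t.a), d4 = cross(s.a, s.b, t.b);
    return ((d1 > 0) != (d2 > 0)) && ((d3 > 0) != (d4 > 0));
}

// Events are segment starts (insert into the active set) and segment ends
// (erase), processed in x order.
static std::vector<std::pair<Seg, Seg>> sweep_intersections(std::vector<Seg> segs)
{
    for (Seg& s : segs)                       // normalize so a is the left endpoint
        if (s.b.x < s.a.x) std::swap(s.a, s.b);

    struct Event { double x; bool start; std::size_t id; };
    std::vector<Event> events;
    for (std::size_t i = 0; i < segs.size(); ++i) {
        events.push_back({segs[i].a.x, true,  i});
        events.push_back({segs[i].b.x, false, i});
    }
    std::sort(events.begin(), events.end(),
              [](const Event& l, const Event& r) {
                  return l.x < r.x || (l.x == r.x && l.start && !r.start);
              });

    std::vector<std::size_t> active;          // segments crossing the sweep line
    std::vector<std::pair<Seg, Seg>> found;
    for (const Event& e : events) {
        if (e.start) {
            for (std::size_t j : active)      // only pairs overlapping in x are tested
                if (intersects(segs[e.id], segs[j]))
                    found.push_back({segs[e.id], segs[j]});
            active.push_back(e.id);
        } else {
            active.erase(std::find(active.begin(), active.end(), e.id));
        }
    }
    return found;
}

int main()
{
    std::vector<Seg> segs = { {{0, 0}, {4, 4}}, {{0, 4}, {4, 0}}, {{5, 0}, {6, 1}} };
    std::cout << sweep_intersections(segs).size() << " intersection(s)\n";  // prints 1
}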
>
>> If your union algorithm doesn't work by scanline and doesn't work by
>> rule-based graph traversal how does it work? These are the only two
>> methods I've heard of.
> It uses Weiler Atherton (http://tinyurl.com/ajnqso), but right, that
> is
> a form of graph traversal (but they don't call it that way). So here
> also we agree.
>
Weiler-Atherton is an example of what I meant by rule-based graph traversal. You have a visitor who runs around a graph where polygon vertices are graph nodes and polygon edges are graph edges, and he tries to walk the outer boundaries of polygons. This algorithm is really a pain to implement because you have to anticipate all the unfortunate circumstances your visitor will find himself in and write rules to get him out of them correctly. Frequent mistakes include incorrectly joining polygons that touch at only one vertex, and of course simply getting the wrong answer.

Another drawback is that while the visitor can find the outer shells and the holes, he can't tell which holes belong to which shells, so an additional algorithm is needed for that. Finally, the algorithm is completely inflexible in the sense that to implement a new operation you need to write a complex set of rules, which is a non-trivial amount of work. This leads to lazy shortcuts like using a combination of AND and SUBTRACT (AND NOT) to implement XOR, which could be done as a single operation if it were easier to write the rules. The algorithm is difficult to validate because you can never convince yourself that it has seen all the cases you didn't think about when writing the rules.

Aside from being difficult to implement and validate, the algorithm is very expensive because the graph data structure is heavy in memory and slow to traverse: the pointer chasing uses the cache poorly, since neighboring nodes are not adjacent in memory. Unless you need the graph data structure for some other reason (such as meshing) you are probably happier to forgo using it at all.
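To make the flavor of that concrete, here is a hand-built toy (my own sketch, not your code and nothing from Boost) of the kind of linked node structure such a traversal walks: two overlapping squares whose intersection points have already been spliced into both rings. For this one easy configuration, simply switching rings at every intersection happens to produce the intersection polygon; the hard part, and the part I'm complaining about, is all the entry/exit bookkeeping and special-case rules a real implementation needs.

// Toy Weiler-Atherton-style walk over hand-wired nodes (illustration only).
#include <iostream>
#include <memory>
#include <vector>

struct Node {
    double x, y;
    Node* next = nullptr;       // next vertex on this ring
    Node* neighbor = nullptr;   // same intersection point on the other ring
    bool is_intersection = false;
};

int main()
{
    // Subject ring A: (0,0)(4,0)(4,4)(0,4); clip ring B: (2,2)(6,2)(6,6)(2,6);
    // intersection points I1=(4,2) and I2=(2,4) are assumed already computed
    // and spliced into both rings by an earlier intersection phase.
    std::vector<std::unique_ptr<Node>> pool;
    auto mk = [&](double x, double y, bool inter = false) {
        pool.push_back(std::make_unique<Node>());
        pool.back()->x = x; pool.back()->y = y; pool.back()->is_intersection = inter;
        return pool.back().get();
    };
    Node* a0 = mk(0, 0); Node* a1 = mk(4, 0); Node* i1a = mk(4, 2, true);
    Node* a2 = mk(4, 4); Node* i2a = mk(2, 4, true); Node* a3 = mk(0, 4);
    Node* b0 = mk(2, 2); Node* i1b = mk(4, 2, true); Node* b1 = mk(6, 2);
    Node* b2 = mk(6, 6); Node* b3 = mk(2, 6); Node* i2b = mk(2, 4, true);

    // Ring links (counterclockwise) and cross links between the two rings.
    Node* ringA[] = {a0, a1, i1a, a2, i2a, a3};
    Node* ringB[] = {b0, i1b, b1, b2, b3, i2b};
    for (int i = 0; i < 6; ++i) { ringA[i]->next = ringA[(i + 1) % 6];
                                  ringB[i]->next = ringB[(i + 1) % 6]; }
    i1a->neighbor = i1b; i1b->neighbor = i1a;
    i2a->neighbor = i2b; i2b->neighbor = i2a;

    // Walk: start at an entry intersection (I1), follow the current ring and
    // jump to the other ring at every intersection node.  This is exactly the
    // scattered pointer chasing I mean.
    Node* start = i1a;
    Node* cur = start;
    do {
        std::cout << "(" << cur->x << "," << cur->y << ") ";
        cur = cur->next;
        if (cur->is_intersection && cur != start && cur->neighbor != start)
            cur = cur->neighbor;          // switch rings at the intersection
    } while (cur != start && cur != start->neighbor);
    std::cout << "\n";                    // prints (4,2) (4,4) (2,4) (2,2)
}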
Given a black-box booleans implementation I can usually tell whether it is graph-traversal based because it a) is slow, b) uses too much memory and c) has O(n^2) runtime associating n holes with their outer polygons, since a shortcut is usually taken when handling that detail. Compared to an alternative that doesn't suffer these limitations, it is at a competitive disadvantage.
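The shortcut I mean usually amounts to something like the following (again just an illustration of mine): take one vertex of each hole and run a point-in-polygon test against every candidate shell, which is quadratic when the numbers of holes and shells both grow with n.

// Quadratic hole-to-shell association shortcut (illustration only).
#include <cstddef>
#include <iostream>
#include <vector>

struct Point { double x, y; };
using Ring = std::vector<Point>;

// Standard even-odd ray-crossing test.
static bool point_in_ring(const Point& p, const Ring& ring)
{
    bool inside = false;
    for (std::size_t i = 0, j = ring.size() - 1; i < ring.size(); j = i++) {
        const Point& a = ring[i];
        const Point& b = ring[j];
        if ((a.y > p.y) != (b.y > p.y) &&
            p.x < (b.x - a.x) * (p.y - a.y) / (b.y - a.y) + a.x)
            inside = !inside;
    }
    return inside;
}

// Returns, for each hole, the index of the shell it was assigned to (or -1).
// Every hole is tested against every shell, hence the quadratic cost.
static std::vector<int> assign_holes(const std::vector<Ring>& shells,
                                     const std::vector<Ring>& holes)
{
    std::vector<int> owner(holes.size(), -1);
    for (std::size_t h = 0; h < holes.size(); ++h)
        for (std::size_t s = 0; s < shells.size(); ++s)
            if (point_in_ring(holes[h].front(), shells[s])) {
                owner[h] = static_cast<int>(s);
                break;
            }
    return owner;
}

int main()
{
    std::vector<Ring> shells = { {{0,0},{10,0},{10,10},{0,10}} };
    std::vector<Ring> holes  = { {{2,2},{4,2},{4,4},{2,4}} };
    std::cout << "hole 0 -> shell " << assign_holes(shells, holes)[0] << "\n"; // 0
}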
Please note, Barend, I'm not criticizing you; I'm criticizing an algorithm invented in the '70s. It is an ongoing frustration of mine that the software tools we have to use were implemented with this algorithm. I'd really like to see a superior open-source alternative plugged into those tools to improve them. That's sort of my goal, actually.
Regards,
Luke