Subject: Re: [Boost-users] [Polygon] Test among 3 libraries
From: Simonson, Lucanus J (lucanus.j.simonson_at_[hidden])
Date: 2010-09-10 01:32:08
Angus Johnson wrote:
>> Just a C++ style pointer: we have what we call a resource acquisition
>> is initialization (RAII) policy, where we initialize all variables at
>> the time they are declared rather than deferring that to later. The
>> rationale for this is that it avoids bugs introduced later when a
>> variable is used between the point where it is declared and the point
>> where it is initialized, resulting in undefined behavior and different
>> behavior on different platforms/compiler optimization flags.
>
> Thanks. I'll try and remember that. Delphi enforces local variable
> declarations before any logical operations can be performed (IIRC to
> allow a one pass compile), so I'm going to have to break a long
> standing habit.
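For what it's worth, the difference is just this (a made-up snippet, not from your code; first_index is an invented function):

int count;               // uninitialized: reading count before the assignment below is undefined behavior
// ... other code can creep in here later and use count by mistake ...
count = first_index();   // initialization deferred

int count2 = first_index();   // initialized at the point of declaration instead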
>
>> TPolygon result; //it would be better if the TPolygon constructor
>> accepted the size
>
> As you can see I'm still coming to grips with the fundamentals of
> C++. I forgot that the std::vector constructor accepts a size
> parameter.
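For instance, assuming TPolygon is a typedef for a std::vector of points (vertex_count is just a placeholder here):

TPolygon result(vertex_count);   // constructs vertex_count default-initialized points up front
// or, to allocate capacity without constructing elements:
TPolygon result2;
result2.reserve(vertex_count);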
>
>> y = y + (j % 2)*2 -1; //squiggles right twice as fast as it
>> squiggles up
>
> That was so I could see it was really doing what I expected, otherwise
> it just looked like a fat blurry line :).
>
>> This will be the worst-case input, with expected runtime complexity
>> of O(n^2) using a linked list as the sweepline data structure, while
>> an optimal Vatti using a tree would have O(n log n) complexity and an
>> empirical scaling factor of about O(n^1.05).
>
> I've given my data structure a little more thought and believe it
> isn't the simple linked-list structure I stated earlier. The main
> linked list, the active edge list (AEL), contains a list of active
> edges which also link to their adjacent edges in an edge bound (LML).
> I imagine that this is the tree structure you've been mentioning.
That doesn't sound like a tree to me, and I don't quite grasp what you mean by adjacent edges in an edge bound. The empirical scaling factor of an optimal sweepline that uses a tree should be about n^1.05 for all test cases; that's what we see with CGAL, for example.

I use std::map (internally a red-black tree) as the sweepline data structure to maintain a sorted sequence of the line segments that intersect the sweepline. I can look up, insert and remove line segments from the map in O(log n) (amortized) runtime. Each line segment end point scanned during the progress of the sweepline gives rise to a constant number of lookups, insertions and removals from the sweepline. There are no more than O(n) elements in the map at any given time, and there are O(n) line segment end points to process. Intersection events add to the number of events to be processed but not to the maximum number of elements in the map. Therefore the runtime complexity is O((n+s) log n) for n line segments and s intersection points (Vatti's reported runtime complexity). I usually simplify this to O(n log n) with respect to line segments plus intersections, because the number of intersections is at worst n^2 and log(n^2) = 2 log(n), which differs only by a constant factor.

If the sweepline data structure requires linear lookup then you have O((n + s)n) runtime complexity. If the number of unique x values is smaller than O(n), you could get a significant optimization with the list data structure by keeping your place in the sweepline data structure from the previous y value, for a runtime complexity of O((n + s)x).

There is another, more general, data-dependent optimization where, instead of individual line segments, x-monotone chains of line segments are processed by the sweepline, giving O(n + (x+s) log x) runtime complexity where x is the number of x-monotone chains. In the worst case x is n, but if your data contains nice smooth arcs formed of large numbers of small line segments the speedup is dramatic. Implementing the x-monotone chain sweepline is similar in principle to implementing a sweepline over curves.
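To make the idea concrete, here is a rough sketch of a map-based sweepline status; the struct, names and comparator below are invented for illustration and are not the Boost.Polygon code:

#include <map>

struct Segment { double x1, y1, x2, y2; };   // a non-vertical segment, for illustration only

// y coordinate at which the segment crosses a vertical sweepline at position x
inline double y_at(const Segment& s, double x)
{
    double t = (x - s.x1) / (s.x2 - s.x1);
    return s.y1 + t * (s.y2 - s.y1);
}

// Orders segments by where they cross the current sweepline position.
struct SweepCompare
{
    const double* sweep_x;   // shared current sweepline position
    bool operator()(const Segment& a, const Segment& b) const
    { return y_at(a, *sweep_x) < y_at(b, *sweep_x); }
};

// The sweepline status: a sorted sequence of the segments currently cut by the
// sweepline, mapped to whatever per-edge data the clipper needs. lower_bound,
// insert and erase are all O(log n) because the map is a red-black tree.
typedef std::map<Segment, int, SweepCompare> SweepStatus;

// usage sketch:
//   double sweep_x = 0.0;
//   SweepCompare cmp = { &sweep_x };
//   SweepStatus status(cmp);
// When the sweepline passes an intersection event the relative order of the two
// segments involved flips, so they must be erased before the event and
// reinserted after it to keep the map's ordering invariant intact.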
There is a pretty significant difference between making the sub-optimal sweepline algorithm numerically robust and making the optimal tree-based sweepline algorithm numerically robust, so you might want to step back and decide whether you are looking to improve the robustness of your current algorithm or to improve the algorithm first and come back to robustness afterwards. I have been mentoring a successful Google Summer of Code project this year that implemented an optimal and robust sweepline algorithm for the Voronoi diagram of points, and soon of line segments. A robust and optimal sweepline for line segment intersection and polygon clipping is also achievable, but very tricky. Your work shows that you know how to test your code well, which counts for a lot. I'd like to encourage you to persevere in pursuing robustness as well as algorithmic optimality.
Regards,
Luke