A few years back I wrote a basic Bézier curve library with boolean operations in C/C++. Bézier curves are fantastic when you get started: they're easy to deal with and the math is well documented online.
However, the edge cases are a nightmare. Intersection (and similar operations) typically works by approximating via subdivision, so you never get an exact answer, which matters when you're building boolean operations on top, and you wind up piling on hacks. One edge case I eventually traced to a floating point arithmetic error, which was fun because my computer science degree finally came in handy.
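To make the "approximate by subdivision" point concrete, here is a minimal sketch (my own illustration, not code from the library discussed) of cubic-cubic intersection by recursive de Casteljau subdivision with bounding-box pruning. Notice that it can only ever report clusters of approximate hits, never an exact intersection point:

```python
# Curve-curve intersection by recursive subdivision. Illustrative only:
# real implementations need tolerance tuning, duplicate merging, and
# special cases (tangential contact, overlapping curves).

def split(c, t=0.5):
    """de Casteljau split of a cubic Bezier given as 4 (x, y) points."""
    lerp = lambda a, b: ((1-t)*a[0] + t*b[0], (1-t)*a[1] + t*b[1])
    ab = lerp(c[0], c[1]); bc = lerp(c[1], c[2]); cd = lerp(c[2], c[3])
    abc = lerp(ab, bc); bcd = lerp(bc, cd); m = lerp(abc, bcd)
    return [c[0], ab, abc, m], [m, bcd, cd, c[3]]

def bbox(c):
    xs = [p[0] for p in c]; ys = [p[1] for p in c]
    return min(xs), min(ys), max(xs), max(ys)

def overlaps(b1, b2):
    return (b1[0] <= b2[2] and b2[0] <= b1[2] and
            b1[1] <= b2[3] and b2[1] <= b1[3])

def intersections(c1, c2, eps=1e-4, out=None):
    """Collect approximate intersection points of two cubic Beziers."""
    if out is None:
        out = []
    b1, b2 = bbox(c1), bbox(c2)
    if not overlaps(b1, b2):
        return out  # prune: convex hulls of control points are disjoint
    if max(b1[2]-b1[0], b1[3]-b1[1], b2[2]-b2[0], b2[3]-b2[1]) < eps:
        out.append(((b1[0]+b1[2])/2, (b1[1]+b1[3])/2))  # approximate hit
        return out
    l1, r1 = split(c1)
    l2, r2 = split(c2)
    for a in (l1, r1):
        for b in (l2, r2):
            intersections(a, b, eps, out)
    return out

# Two line-like cubics crossing at (0.5, 0.5): the result is a cluster
# of nearby points, not one exact answer.
c1 = [(0, 0), (1/3, 1/3), (2/3, 2/3), (1, 1)]
c2 = [(0, 1), (1/3, 2/3), (2/3, 1/3), (1, 0)]
print(intersections(c1, c2)[:3])
```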
It makes me appreciate the work of the people who made paper.js. They've done some outstanding work, and it's by far the best open source vector graphics framework out there.
> One edge case I eventually traced to a floating point arithmetic error, which was fun because my computer science degree finally came in handy.
These are everywhere whenever you do anything with vector graphics in the real world. Lots of interesting computational geometry papers end up useless in reality because floating point error kills you if you try to implement the algorithms. It's pretty sad…
The other issue that comes up again and again is self-intersecting paths. Most published algorithms you'll find don't bother to support them, which also makes them not very useful in reality because self-intersecting paths are everywhere.
I'm very skeptical of clever vector path algorithms these days. The amount of work needed to get them to work in the real world is rarely worth it. Better to spend the time reformulating the problem in such a way as to avoid needing them.
> From the beginning of the implementation of Triangle, and well into the development of Pyramid, floating-point roundoff problems plagued me. Each program would sometimes crash, sometimes find itself stuck in an endless loop, and sometimes produce garbled output. At first I believed that I would be able to fix the problems by understanding how the algorithms went wrong when roundoff error produced incorrect answers, and writing special-case code to handle each potential problem. Some of the robustness problems yielded to this approach, but others did not. Fortunately, Steven Fortune of AT&T Bell Laboratories convinced me, in a few brief but well-worded email messages (and in several longer and equally well-worded technical papers), to choose the alternative path to robustness, which led to the research described in this chapter. For reasons that will become apparent, exact arithmetic is the better approach to solving many, if not all, of the robustness worries associated with triangulation.
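The kind of failure Shewchuk describes starts with the orientation predicate, the workhorse of these algorithms: the sign of a cross product tells you which side of a line a point lies on. A small illustration (with Python's `fractions.Fraction` standing in for a real exact-arithmetic package) of how exact arithmetic settles questions that floats cannot:

```python
# orient2d(a, b, c): positive if c is left of line ab, negative if
# right, zero if collinear. In doubles, the sign near zero is rounding
# noise; over the rationals it is exact.
from fractions import Fraction

def orient2d_float(a, b, c):
    return (b[0]-a[0])*(c[1]-a[1]) - (b[1]-a[1])*(c[0]-a[0])

def orient2d_exact(a, b, c):
    a, b, c = [tuple(map(Fraction, p)) for p in (a, b, c)]
    return (b[0]-a[0])*(c[1]-a[1]) - (b[1]-a[1])*(c[0]-a[0])

a, b = (0.1, 0.1), (0.2, 0.2)        # both exactly on the line y = x
c = (0.3, 0.1 + 0.2)                 # 0.1 + 0.2 lands one ulp above 0.3

# The exact predicate proves c is strictly left of the line, because
# the stored double for 0.1 + 0.2 is not the stored double for 0.3.
print(orient2d_exact(a, b, c) > 0)   # True, and provably so
print(orient2d_float(a, b, c))       # a tiny value; its sign is not
                                     # something you should build on
```

The point is not that floats always get this wrong; it's that a value this close to zero gives you no guarantee, while the rational computation does.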
Of course, what can be done with polygons is often not feasible for more general kinds of shapes like parametric cubics.
* * *
What is a “clever vector path algorithm”, and how do you avoid them?
Algorithms that gave me floating-point related problems:
- Fast polygon triangulation/partitioning. There's a reason people use ear clipping in practice despite its bad asymptotic time complexity. Even ear clipping isn't very useful for robust path rendering, because it doesn't handle self-intersection. Tiling and stenciling to compute winding numbers is a much more robust approach than tessellation.
- Generation of bounding geometry for analytic antialiasing of polygon edges, to work around the lack of conservative rasterization on GPUs. Skia has impressive code that handles this, but it is incredibly painful and I decided it wasn't worth it.
- Determining polygon convexity. This even led to a security bug in Skia [1].
- Clipping of Bézier paths, especially in homogeneous coordinates. Yes, this even includes clipping to axis-aligned rects. It's too hard. Convert to triangles or quads and clip those.
- Any sort of Boolean operation on Bézier paths, if you want reliable results. Leave those to applications like Illustrator, where artists can notice if the algorithm goes wrong and fix it up manually when it does.
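As a taste of why the convexity bullet above is treacherous, here is the textbook cross-product convexity test. It looks trivial, but collinear or nearly collinear edges produce cross products so close to zero that their sign is pure rounding noise, which is exactly the class of real-world bug being referenced:

```python
# Naive convexity test: a polygon is convex iff the cross products of
# all consecutive edge pairs share one sign. Exactly-zero crosses
# (collinear vertices) are ignored here; *nearly*-zero crosses are the
# trap, since their sign can flip with rounding.

def is_convex(pts):
    pos = neg = False
    n = len(pts)
    for i in range(n):
        ax, ay = pts[i]
        bx, by = pts[(i + 1) % n]
        cx, cy = pts[(i + 2) % n]
        cross = (bx - ax) * (cy - ay) - (by - ay) * (cx - ax)
        if cross > 0:
            pos = True
        elif cross < 0:
            neg = True
        if pos and neg:
            return False  # turns in both directions: reflex vertex
    return True

print(is_convex([(0, 0), (1, 0), (1, 1), (0, 1)]))           # True
print(is_convex([(0, 0), (2, 0), (1, 0.5), (2, 2), (0, 2)])) # False
```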
> The other issue that comes up again and again is self-intersecting paths. Most published algorithms you'll find don't bother to support them, which also makes them not very useful in reality because self-intersecting paths are everywhere.
Exactly this. It's the biggest challenge when doing useful work with curves in graphic design and 3D.
Curve offsetting is a common tool for producing paths (2D), extrusions and bevels (3D), and so on. Whenever the offset distance exceeds the local radius of curvature, self-intersections occur.
When you have some clever algorithm to deal with those, e.g. clipping them out, you're then faced with offset curves that have a different parameterization than the original. Now try creating a patch mesh from those.
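The radius-of-curvature condition is easy to check numerically. A sketch (sampling-based, so it can miss narrow trouble intervals) that flags where offsetting a cubic Bézier by distance `d` will cusp:

```python
# Flag parameter values where an offset at distance d misbehaves, by
# sampling curvature kappa along a cubic Bezier: a cusp appears in the
# offset wherever |d| exceeds the local radius of curvature 1/kappa.

def cubic_derivs(p, t):
    """First and second derivatives of a cubic Bezier p = [p0..p3]."""
    (x0, y0), (x1, y1), (x2, y2), (x3, y3) = p
    mt = 1 - t
    d1 = (3*(mt*mt*(x1-x0) + 2*mt*t*(x2-x1) + t*t*(x3-x2)),
          3*(mt*mt*(y1-y0) + 2*mt*t*(y2-y1) + t*t*(y3-y2)))
    d2 = (6*(mt*(x2-2*x1+x0) + t*(x3-2*x2+x1)),
          6*(mt*(y2-2*y1+y0) + t*(y3-2*y2+y1)))
    return d1, d2

def offset_trouble_spots(p, d, samples=200):
    """Sampled t values where |d| * kappa(t) > 1, i.e. the offset cusps."""
    bad = []
    for i in range(samples + 1):
        t = i / samples
        (dx, dy), (ddx, ddy) = cubic_derivs(p, t)
        speed2 = dx*dx + dy*dy
        if speed2 == 0:
            continue  # base curve itself has a cusp here
        kappa = abs(dx*ddy - dy*ddx) / speed2**1.5
        if abs(d) * kappa > 1:
            bad.append(t)
    return bad

# A tight hairpin: at the apex (t = 0.5) the radius of curvature is 0.5,
# so offsetting by 1.0 is guaranteed to self-intersect there.
hairpin = [(0, 0), (2, 3), (-2, 3), (0, 0)]
print(offset_trouble_spots(hairpin, 1.0)[:3])
```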
This is a huge brain fuck, and every time you think you've got it worked out, another edge case you didn't think of creeps up out of nowhere.
It does look amazing. I tried using it for a drawing app, though, and found it too slow. Maybe I was using it wrongly, but long strokes, or large numbers of them, caused issues, as it seemed to redraw the whole image more often than necessary. I had better luck with plain canvas, but I'd like to investigate CanvasKit (Skia in WASM) as well, since their drawing demo seems very performant.
For a technical analysis of Hobby splines I recommend Section 4.7 of HN user raphlinus's PhD thesis [0], referenced just yesterday in the submission on Google Font Analytics [1].
Ah, thanks, I didn't see that mentioned in the article at all. It definitely seems like the most important property a curve someone will be drawing by hand should have.
> Because they are perfectly compatible with industry-standard Béziers, there is very little disadvantage to be had.
Anyone know what "compatible" means as used here?
As an aside, I personally prefer NURBS; they feel much more "pure", if that makes any sense. Plus they can produce real circles, which is a pretty nice feature for a curve system.
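On the circles point: a polynomial Bézier can only approximate a circular arc, while a rational quadratic (the NURBS building block) lies on it exactly. A quick check, using the standard quarter-circle construction with middle weight √2/2:

```python
# Rational quadratic Bezier: each basis term is weighted, and the sum is
# divided by the total weight. With endpoints (1,0) and (0,1), middle
# control point (1,1), and middle weight cos(45 deg) = sqrt(2)/2, every
# point of the curve lies exactly (up to rounding) on the unit circle.
import math

def rational_quad(p0, p1, p2, w, t):
    b0, b1, b2 = (1 - t)**2, 2*t*(1 - t), t**2
    denom = b0 + w*b1 + b2
    x = (b0*p0[0] + w*b1*p1[0] + b2*p2[0]) / denom
    y = (b0*p0[1] + w*b1*p1[1] + b2*p2[1]) / denom
    return x, y

w = math.sqrt(2) / 2
for i in range(11):
    x, y = rational_quad((1, 0), (1, 1), (0, 1), w, i / 10)
    assert abs(x*x + y*y - 1) < 1e-12  # on the unit circle at every t
print("quarter circle, exact up to rounding")
```

No polynomial Bézier of any degree can make that assertion hold exactly; the best cubic approximation of a quarter circle deviates by roughly 2×10⁻⁴ of the radius.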
I interpreted it to mean that they are in fact a subset of Bézier curves. A computed smooth spline (can't remember what they're called) is still stored using exactly the same information as any other Bézier curve; the extra rules about its construction just make it smoother than an arbitrary Bézier.
That's right: Hobby splines are a strict subset of the polynomial cubic Béziers.
Polynomial cubic Béziers are defined by 4 control points. In 2D that's 4 × 2 = 8 degrees of freedom. In the terminology of HN user raphlinus's PhD thesis, we fix the 2 endpoints, which removes 4 degrees of freedom, leaving what's referred to as a 4-parameter family.
In addition to the 2 endpoints, Hobby splines are defined by the start and end tangent angles Θ0 and Θ1, plus a tension parameter τ.
If we fix the 2 endpoints and the tension parameter τ, and allow Θ0 and Θ1 to vary, then we have a 2-parameter subspace of the 4-parameter space of polynomial cubic Béziers for 2 fixed endpoints.
If we allow the tension parameter τ to vary we have a 3-parameter subspace.
So we have lost 1 degree of freedom, but in terms of aesthetics we haven't lost much, because the majority of the splines we've thrown away are likely uglier than the ones we've retained.
One TL;DR from HN user raphlinus's thesis: Euler spiral splines are about the best you can do for a 2-parameter family that covers all reasonable possibilities for Θ0 and Θ1 with good aesthetics.
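To make the parameter count concrete, here is a toy construction (emphatically not Hobby's actual velocity formula; his functions are far more refined) showing how two tangent angles plus a tension pick out one cubic Bézier from the 4-parameter family with fixed endpoints:

```python
# Toy stand-in for the (theta0, theta1, tau) parameterization: put each
# inner control point at distance |p1 - p0| / (3 * tau) along the
# endpoint tangent direction. Higher tension tau pulls the control
# points toward the endpoints, tightening the curve. The output is a
# perfectly ordinary cubic Bezier, which is the "subset" point above.
import math

def spline_segment(p0, p1, theta0, theta1, tau=1.0):
    d = math.hypot(p1[0] - p0[0], p1[1] - p0[1]) / (3 * tau)
    c1 = (p0[0] + d*math.cos(theta0), p0[1] + d*math.sin(theta0))
    c2 = (p1[0] - d*math.cos(theta1), p1[1] - d*math.sin(theta1))
    return [p0, c1, c2, p1]

# Leave at +45 degrees, arrive at -45 degrees: a symmetric arch.
print(spline_segment((0, 0), (4, 0), math.pi/4, -math.pi/4))
```

Fix the endpoints and τ and you have a 2-parameter family in (Θ0, Θ1); let τ vary and it's 3-parameter, exactly as described above.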
> Our curves consist of one quadratic curve, with control points [...] per interpolated point p_i
So what happens is that each segment between two knots (in the paper) is made of 2 pieces of parametric quadratic polynomials (i.e. parabolas).
Each knot is at the vertex of the same parabola to each side, which can be broken at the vertex into two parts, each of which belongs to one of the adjacent segments.
The solver in the paper makes sure that at the joint between two parabolas, the absolute magnitude of curvature matches.
This means there aren’t really any smooth inflection points and you always get a discrete jump in curvature at an inflection. This looks bad, so in practice Adobe implemented something different in their software.
So the name biquadratic is analogous to biarc, although the usage here differs slightly from a biarc's because it's the same parabola on each side of the vertex.
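The vertex-splitting idea is easy to make concrete: the vertex of the parabola traced by a quadratic Bézier is where the speed |Q′(t)| is minimal, and a de Casteljau split there produces the two half-parabola pieces described above.

```python
# Find the parabola vertex of a quadratic Bezier [p0, p1, p2] and split
# there. Q'(t) = 2*(d0 + t*(d1 - d0)) with d0 = p1 - p0, d1 = p2 - p1;
# |Q'| is minimal at t = -(d0 . e) / |e|^2 where e = d1 - d0.

def parabola_vertex_t(p0, p1, p2):
    d0 = (p1[0] - p0[0], p1[1] - p0[1])
    e = (p2[0] - 2*p1[0] + p0[0], p2[1] - 2*p1[1] + p0[1])
    denom = e[0]*e[0] + e[1]*e[1]
    return -(d0[0]*e[0] + d0[1]*e[1]) / denom  # assumes a true parabola (denom != 0)

def split_quad(p0, p1, p2, t):
    lerp = lambda a, b: (a[0] + t*(b[0] - a[0]), a[1] + t*(b[1] - a[1]))
    a, b = lerp(p0, p1), lerp(p1, p2)
    m = lerp(a, b)
    return [p0, a, m], [m, b, p2]

# Symmetric example: the vertex sits at t = 0.5, apex (0.5, 0.5).
t = parabola_vertex_t((0, 0), (0.5, 1), (1, 0))
left, right = split_quad((0, 0), (0.5, 1), (1, 0), t)
print(t, left[2])
```

In the paper's construction each knot plays the role of `left[2]`/`right[0]` here: the shared vertex point joining the two half-parabolas.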
The choice of the name biquadratic is possibly unfortunate, because it is already taken by an existing class of algebraic space curves [0]:
>Biquadratics are algebraic curves of degree four that are the intersection between two quadrics that do not share a common line (in which case the curve would be of degree three).
then
>See also Cartesian ovals, Booth curves (including the lemniscate of Bernoulli), the Külp quartic, the Dürer shell curve, the bicorn, that are planar projections of biquadratics.
which was the source of my confusion as I had assumed that raphlinus's reference was to planar projections of these classical quartic curves.
Since there are some papers posted in the replies here: do you have any recommendations for journals that people working on 3D graphics and CAD/CAM software should read to stay ahead of the curve?
When I was fighting some splines at work I found this website very good, and very interactive, in explaining how Bézier curves work under the hood. A great resource!
I've always found Bézier curves strange; they seem awful as a way to approximate curves. When you "draw" with a Bézier curve in Illustrator (or Inkscape), the lines tend to fly all over the place: they can be an OK approximation, but they're often very unstable. The problem, I think, is that any higher-order polynomial (or equivalent) is going to have this instability property.