Hacker News
Converting stroked primitives to filled primitives (2020) (w3.impa.br)
64 points by vg_head on May 18, 2021 | 7 comments



I read this paper pretty recently, and I was interested in how they handled dashing, since arc-length parameterization is one of the harder problems with Bezier curves (useful for making even steps along the length of the curve, rather than in the parametric domain). I implemented their suggestion of Juettler's "vegetarian" reparameterization for arc length [1] but was pretty disappointed in the results. The arc-length steps were visibly uneven for quads of any "reasonable" size (i.e. not tiny). That led me to some follow-up work [2], which offers a method for a piecewise reparameterization; much, much better results even with just 4 or 5 pieces. Figured I'd pass along the experience for any future interested readers.

[1] Juettler et al. https://www.sciencedirect.com/science/article/abs/pii/S01678...

[2] Costantini et al. http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.90....


Do you have any insight into whether these reparameterizations are better (faster, more accurate) than just applying a standard solver (I'm now a huge fan of the ITP method) over numerical integration to compute arc length (Legendre-Gauss quadrature is pretty magical)? I'm particularly interested in how they deal with cusps, which are always the most difficult cases with these numerical approaches.
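
For context, the baseline I have in mind looks roughly like this (my own Python sketch, not anything from the paper; plain bisection stands in for ITP, and all the names are made up):

    import math

    # 5-point Gauss-Legendre nodes and weights on [-1, 1].
    GL_NODES = [0.0, -0.5384693101056831, 0.5384693101056831,
                -0.9061798459386640, 0.9061798459386640]
    GL_WEIGHTS = [0.5688888888888889, 0.4786286704993665, 0.4786286704993665,
                  0.2369268850561891, 0.2369268850561891]

    def cubic_speed(p0, p1, p2, p3, t):
        """|B'(t)| for a cubic Bezier with control points p0..p3 (2-tuples)."""
        mt = 1.0 - t
        dx = 3*mt*mt*(p1[0]-p0[0]) + 6*mt*t*(p2[0]-p1[0]) + 3*t*t*(p3[0]-p2[0])
        dy = 3*mt*mt*(p1[1]-p0[1]) + 6*mt*t*(p2[1]-p1[1]) + 3*t*t*(p3[1]-p2[1])
        return math.hypot(dx, dy)

    def arc_length(p0, p1, p2, p3, t0=0.0, t1=1.0):
        """Arc length over [t0, t1] via Gauss-Legendre quadrature of the speed."""
        c, h = 0.5 * (t0 + t1), 0.5 * (t1 - t0)
        return h * sum(w * cubic_speed(p0, p1, p2, p3, c + h * x)
                       for x, w in zip(GL_NODES, GL_WEIGHTS))

    def t_at_length(p0, p1, p2, p3, s, tol=1e-7):
        """Solve arc_length(0, t) == s by bisection (ITP would converge faster)."""
        lo, hi = 0.0, 1.0
        while hi - lo > tol:
            mid = 0.5 * (lo + hi)
            if arc_length(p0, p1, p2, p3, 0.0, mid) < s:
                lo = mid
            else:
                hi = mid
        return 0.5 * (lo + hi)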


Well, the reparameterization idea is quite convenient if the operations you want to perform are more naturally expressed in the new parameter domain. Path dashing is a good example: in the arc-length domain, you simply take fixed steps in u (say 0.05), perform some fairly simple arithmetic to convert that u to regular old t, and then evaluate your curve at t. The new domain gives you a new space to live in, which for some things is pretty nice.
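
As a toy illustration of that loop (my own sketch; u_to_t is whatever mapping your reparameterization of choice gives you):

    def dash_points(curve_eval, u_to_t, step=0.05):
        """Sample a curve at (roughly) even arc-length intervals.

        curve_eval(t) evaluates the Bezier at parameter t; u_to_t(u) maps the
        normalized arc-length parameter u in [0, 1] back to the parametric domain.
        """
        n = round(1.0 / step)
        return [curve_eval(u_to_t(i / n)) for i in range(n + 1)]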

For accuracy/precision, my [non-peer-reviewed] sense is that numerical methods will probably always get you more accuracy/precision. For the case I was experimenting with (path dashing, and some other related things like placement of text glyphs along the path), extreme precision didn't seem necessary.

The method I ended up with (Costantini's piecewise-linear solution with K=5 pieces) meant that for each Bezier in my path, I performed some preprocessing to determine the reparameterization coefficients. With that preprocessing done, I now had an O(K) evaluation algorithm to convert the arc-length parameter u (in [0,1]) to t.
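
Not the actual construction from [2], but a crude piecewise-linear stand-in that shows the shape of the preprocessing and the O(K) lookup (arc_length_fn is any arc-length routine, e.g. the quadrature sketch above with the control points bound in):

    def build_pieces(arc_length_fn, k=5):
        """Precompute normalized arc length at k+1 evenly spaced t knots.

        Returns a list of (u_i, t_i) pairs with u_0 = 0 and u_k = 1;
        arc_length_fn(t0, t1) returns the arc length over [t0, t1].
        """
        total = arc_length_fn(0.0, 1.0)
        return [(arc_length_fn(0.0, i / k) / total, i / k) for i in range(k + 1)]

    def u_to_t(pieces, u):
        """O(K) lookup: linearly interpolate t for an arc-length fraction u.

        Assumes no zero-length piece, so the u knots are strictly increasing.
        """
        for (u0, t0), (u1, t1) in zip(pieces, pieces[1:]):
            if u <= u1:
                return t0 + (t1 - t0) * (u - u0) / (u1 - u0)
        return 1.0

    # Example wiring with the earlier quadrature sketch:
    # pieces = build_pieces(lambda a, b: arc_length(p0, p1, p2, p3, a, b))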

Another surprisingly big bonus: there exists an equally simple O(K) inverse, i.e. given a t, return u. This allows you to easily compute the distance you've traveled along the arc, if you know where you are in the original parametric domain.
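
The inverse falls straight out of the same table (again my sketch, not the coefficients from [2]):

    def t_to_u(pieces, t):
        """O(K) inverse: given parametric t, return the arc-length fraction u."""
        for (u0, t0), (u1, t1) in zip(pieces, pieces[1:]):
            if t <= t1:
                return u0 + (u1 - u0) * (t - t0) / (t1 - t0)
        return 1.0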

To your other question, unfortunately I don't know how these techniques tend to handle cusps. I didn't get that far in my experimentation.


Also see Polar Stroking [1], which covers similar ground.

[1]: https://arxiv.org/abs/2007.00308


After a brief scan through the paper, I was surprised that Polar Stroking wasn’t referenced. Do you see a reason why? Both the cusp handling and the survey of other path renderers seem like relevant prior work.


It could be as simple as timing: the polar stroking paper probably hadn't come out at the time this paper was written.


Indeed, I think both were presented at SIGGRAPH 2020.



