Hacker News

It's funny, though, now that Retina is a thing, how much superior OS X's font rendering algorithm is. This was not very clear when fuzzy low-res displays were the norm. Back then, I vastly preferred Microsoft's ClearType approach. Now, it's not even a contest -- Windows does not represent fonts truthfully.



It's pretty clear that OS X has existed for what, 15 years? Most of that time was without Retina screens. It didn't retroactively become better because screen resolutions improved to make what was presumably an inferior algorithm in the prior context work better in the modern context.


But it did. OS X (and Mac OS before it) was optimized for fidelity with higher-resolution laser prints. This necessarily caused more anti-aliasing artifacts than with pixel-snapped text. It was actually less of a problem with CRT monitors, where individual pixels blended better with each other.

This is not a shallow concern: pixel snapping can cause major issues, like line breaks differing between print and display.
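The mechanism behind that mismatch is easy to sketch. Pixel snapping rounds each glyph's fractional advance width to whole pixels, and the rounding error accumulates across a line; measured with faithful (fractional) metrics, the same text can then wrap at a different word. A minimal sketch, using made-up glyph metrics (the 10.4 px advance and 102 px column width are illustrative assumptions, not real font data):

```python
def line_length(advances, snap):
    """Total width of a run of glyph advances, optionally pixel-snapped.

    With snap=True each advance is rounded to whole pixels, as a
    snapping rasterizer would do; with snap=False the fractional
    widths are summed, as a print-fidelity layout engine would.
    """
    return sum(round(a) if snap else a for a in advances)

# Hypothetical metrics: ten glyphs, each 10.4 px wide.
advances = [10.4] * 10

fractional = line_length(advances, snap=False)  # ~104 px
snapped = line_length(advances, snap=True)      # 100 px

# In a 102 px column, the snapped text still fits on one line,
# but the faithfully measured text must wrap -- so screen and
# print break the line in different places.
column = 102
print(fractional > column, snapped > column)
```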


Interesting, I would never have considered print publishing. That said:

Was this configurable and what percentage of users were concerned with producing print copy? In what eras?

I would bet that these individuals were always a minority, and the arrangement appears to have been inferior for the majority of users for the majority of the time it was a factor.


This was a Steve Jobs obsession dating from his calligraphy class at Reed College. Proper proportional fonts were built into the original Macintosh -- this wasn't something he was going to let a market analysis decide for him.

However, historically this was less problematic because screens were fuzzier anyway. It was only with LCDs that the anti-aliasing became obvious.


The Macintosh may have had a very small market in the '90s, but one market it did dominate was print.



