
> Font rendering on Linux is as advanced and capable as any other OS including in the areas of kerning and hinting. It should just work without any user intervention and look great.

This isn't entirely true. AIUI, Apple enables LCD filtering and subpixel rendering by default because it knows you're using an LCD and what its subpixel order is. However, on Linux these are usually toggleable via the GUI, and even without them text usually still looks fine.
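To make this concrete, here's roughly what those two knobs look like at the FreeType level, which is what Linux font rendering bottoms out in. Just a sketch, not a recipe: the font path is a placeholder, and FT_Library_SetLcdFilter only succeeds if your FreeType build has subpixel rendering compiled in.

    /* Sketch: render one glyph with subpixel (LCD) antialiasing.
       Build: cc demo.c $(pkg-config --cflags --libs freetype2) */
    #include <ft2build.h>
    #include FT_FREETYPE_H
    #include FT_LCD_FILTER_H
    #include <stdio.h>

    int main(void) {
        FT_Library lib;
        FT_Face face;

        if (FT_Init_FreeType(&lib))
            return 1;

        /* The "LCD filter" toggle: softens the color fringes that
           raw subpixel rendering produces. */
        FT_Library_SetLcdFilter(lib, FT_LCD_FILTER_DEFAULT);

        /* Placeholder path; point it at any TTF you have. */
        if (FT_New_Face(lib, "/usr/share/fonts/DejaVuSans.ttf", 0, &face))
            return 1;
        FT_Set_Pixel_Sizes(face, 0, 16);

        /* FT_LOAD_TARGET_LCD hints for horizontal RGB subpixels; the
           rendered bitmap comes out 3x wider, one byte per subpixel. */
        FT_Load_Char(face, 'g', FT_LOAD_TARGET_LCD);
        FT_Render_Glyph(face->glyph, FT_RENDER_MODE_LCD);

        printf("bitmap: %ux%u (width counts subpixels)\n",
               face->glyph->bitmap.width, face->glyph->bitmap.rows);

        FT_Done_Face(face);
        FT_Done_FreeType(lib);
        return 0;
    }

The GUI toggles in desktops like GNOME and KDE ultimately flip these same switches, via fontconfig properties like rgba and lcdfilter.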


Since Mojave, macOS no longer does subpixel rendering.

https://mjtsai.com/blog/2018/07/13/macos-10-14-mojave-remove...


On non-Retina displays, which... is nowhere near the normal use case for Macs at this point.


On all displays. It matters less on Retina displays.

External displays are nowhere near the normal use case?


No, my point is that this change only affects non-Retina displays. In my experience (worldwide, many companies, etc.), anyone who uses a Mac with an external monitor generally doesn't settle for some POS. It's a high-end screen that matches the MacBook, which is why it's not that big of a deal.


The change affects all displays.

The highest supported resolution on a MacBook is still scaled down. Only a couple of expensive LG displays match the actual density as far as I know.

The MacBook Air only got a Retina display a few months ago. The low-end iMac is still 1080p.


People forget that subpixel rendering was a thing on CRTs before LCDs were around. Subpixel order is pretty universally standardized. Is there an EDID data element for subpixel order?
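For what it's worth, on X11 you can at least ask the server what it thinks the subpixel order is via the RENDER extension (where the driver gets that from is another question). A minimal sketch, assuming libX11 and libXrender are installed:

    /* Build: cc sp.c -lX11 -lXrender */
    #include <X11/Xlib.h>
    #include <X11/extensions/Xrender.h>
    #include <stdio.h>

    int main(void) {
        Display *dpy = XOpenDisplay(NULL);
        if (!dpy)
            return 1;

        /* The server's idea of the default screen's subpixel layout. */
        switch (XRenderQuerySubpixelOrder(dpy, DefaultScreen(dpy))) {
        case SubPixelHorizontalRGB: puts("horizontal RGB"); break;
        case SubPixelHorizontalBGR: puts("horizontal BGR"); break;
        case SubPixelVerticalRGB:   puts("vertical RGB");   break;
        case SubPixelVerticalBGR:   puts("vertical BGR");   break;
        case SubPixelNone:          puts("none");           break;
        default:                    puts("unknown");        break;
        }

        XCloseDisplay(dpy);
        return 0;
    }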


I didn't know that! How does subpixel rendering work on CRTs, which (to my understanding) don't have a fixed matrix of pixels and subpixels?


It worked like shit, blurring perfectly fine text. At least it did the last time I had a CRT, back in the very early '00s.
