
macOS renders at the next highest integer scale factor and then downscales to fit your monitor's native resolution, instead of just rendering at the fractional scale in the first place.
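For concreteness, a minimal sketch of the arithmetic that comment describes (the function and tuple names below are illustrative, not macOS API): the UI is drawn into a backing store at the next integer scale factor, and that image is then resampled down to the panel's native resolution.

    // Sketch only: macOS renders the chosen "looks like" size into a 2x backing
    // store, then scales that image down to the physical panel.
    func scaledMode(looksLike: (w: Int, h: Int), panel: (w: Int, h: Int))
        -> (backing: (w: Int, h: Int), downscale: Double) {
        // Next integer scale factor (2x for any fractional mode between 1x and 2x).
        let backing = (w: looksLike.w * 2, h: looksLike.h * 2)
        // Factor by which the 2x image is resampled to fit the panel.
        let downscale = Double(panel.w) / Double(backing.w)
        return (backing: backing, downscale: downscale)
    }

    // "Looks like 2560x1440" on a 27" UHD (3840x2160) panel:
    let mode = scaledMode(looksLike: (w: 2560, h: 1440), panel: (w: 3840, h: 2160))
    // mode.backing == (5120, 2880), mode.downscale == 0.75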



It’s effectively supersampling. The resulting image looks excellent.


There are several scenarios where it clearly doesn't look that good, and where Windows objectively does a much better job.

Most people (and companies) aren't willing to spend $1600 on Apple's 5K monitor, so they get a good 27" UHD monitor instead, and they soon realize that macOS gives you either pixel-perfect rendering at a 2x scale factor, which corresponds to a ridiculously small 1920x1080 viewport, or a blurry 2560x1440 equivalent.
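To make the two options concrete: on a 3840x2160 panel, only "looks like 1920x1080" maps each logical point to an exact 2x2 block of physical pixels; "looks like 2560x1440" forces a 5120x2880 backing store that is resampled by 0.75. A rough sketch of that distinction (the helper name is made up for illustration, not a real API):

    // Hypothetical check: a "looks like" size is pixel-perfect only when the
    // panel is an exact integer multiple of it; anything else is rendered at
    // 2x and then resampled down to the panel.
    func isPixelPerfect(looksLikeWidth: Int, panelWidth: Int) -> Bool {
        panelWidth % looksLikeWidth == 0
    }

    print(isPixelPerfect(looksLikeWidth: 1920, panelWidth: 3840)) // true  -> exact 2x mapping
    print(isPixelPerfect(looksLikeWidth: 2560, panelWidth: 3840)) // false -> 5120x2880 backing, scaled by 0.75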


The 2560x1440 equivalent looks tack sharp on macOS. It renders at 5120x2880 and scales it down to native; as I said, it’s effectively supersampling. I used this for years and never experienced a blurry image. I now run a 5K2K monitor, also at a fractional scale, and again it looks excellent.


It very obviously is blurry, though. There's a reason so many people notice this issue; you're not going to be able to explain it away.


I have eyes, and a recently updated prescription, and I can see what it looks like: it’s tack sharp.

Are you sure you aren’t confusing Windows’s terrible font rendering with sharpness?


Does macOS support any scaling factors above 2x?





