
Really? I find that it works significantly better than the crazy Apple approach.

Not a single problem with it.




Crazy? What is Apple's approach?

Windows' approach is to just upscale each window, which makes things blurry.

Linux's approach is to let you set the DPI in dozens of config files (e.g. one for GTK+, one for Qt, ...), but at least the end result looks "reasonably" like what you want.
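
To make the "dozens of config files" point concrete, these are the kinds of knobs typically involved (a hedged illustration; the values are examples, not recommendations):

    # ~/.Xresources -- X/Xft applications
    Xft.dpi: 192

    # environment -- GTK+ 3 applications
    GDK_SCALE=2
    GDK_DPI_SCALE=0.5

    # environment -- Qt 5 applications
    QT_SCALE_FACTOR=2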


Windows automatically upscales applications that do not support scaling, which gives it perfect backwards compatibility. Windows applications that do support arbitrary scaling simply see the world as it is, and they also work correctly because they were designed to do so.
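
To make "do support arbitrary scaling" concrete, here is a minimal sketch of the opt-in on the Windows side, assuming the Win32 per-monitor DPI APIs from Windows 10 (the 100px button is just an illustrative value):

    #define _WIN32_WINNT 0x0A00   // needed for the DPI-awareness declarations
    #include <windows.h>
    #include <cstdio>

    int main() {
        // Opt in to per-monitor DPI awareness: Windows then stops
        // bitmap-stretching this process's windows and reports real pixels.
        SetProcessDpiAwarenessContext(DPI_AWARENESS_CONTEXT_PER_MONITOR_AWARE_V2);

        // A DPI-aware app scales its own layout from the reported DPI.
        UINT dpi = GetDpiForSystem();      // 96 = 100%, 144 = 150%, 192 = 200%
        double scale = dpi / 96.0;
        int buttonWidth = (int)(100 * scale);
        printf("DPI %u -> %.0f%% scale -> a 100px button becomes %dpx\n",
               dpi, scale * 100.0, buttonWidth);
        return 0;
    }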

The Apple approach is some kind of misguided middle ground. Applications are fooled into thinking that they are rendering on a "normal"-sized screen, but they are actually rendering 2x2 blocks for every single pixel. Then the whole screen is resized to whatever resolution the panel actually has.

Let's say you pick the "Looks like 1920 by 1200" option (why not just give me a percentage scale?). The app thinks it's rendering at 1920 x 1200, but it's actually rendering at 3840 x 2400, and then the result gets scaled down to 2880 x 1800.
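
For anyone who wants to check the arithmetic, here is a small sketch of that example (the 2x backing factor and the resolutions are the ones from the paragraph above):

    #include <cstdio>

    int main() {
        // "Looks like" (logical) resolution chosen in the display settings
        const int logicalW = 1920, logicalH = 1200;

        // The app actually draws at 2x into a backing store...
        const int backingW = logicalW * 2, backingH = logicalH * 2;  // 3840 x 2400

        // ...which is then resampled down to the panel's native resolution.
        const int panelW = 2880, panelH = 1800;
        const double factor = (double)panelW / backingW;             // 0.75

        printf("app thinks %dx%d, renders %dx%d, panel shows %dx%d (factor %.2f)\n",
               logicalW, logicalH, backingW, backingH, panelW, panelH, factor);
        return 0;
    }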

Can't you just let the damn app render at the actual resolution of the screen and give it a percentage scale?


Have you seen the results? It's crazy, but it works, and it has the added advantage that you don't have to wait until "just next year", when all existing applications will finally be ported to our New Hot Vector-Based display tech.


Sounds like waiting for apps to support Retina displays :)




