Font height differences between Windows and Mac (2019) (williamrchase.com)
56 points by mountainplus 8 months ago | 44 comments



CSS now has the ability to override some font metrics, so you shouldn't need to edit the font files directly.

And these overrides can be applied to local fonts as well (generally used to ensure the metrics of the local fallback font match the yet-to-be-downloaded web font, preventing a layout shift when the web font is swapped in).

ascent-override - https://developer.mozilla.org/en-US/docs/Web/CSS/@font-face/...

descent-override - https://developer.mozilla.org/en-US/docs/Web/CSS/@font-face/...

line-gap-override - https://developer.mozilla.org/en-US/docs/Web/CSS/@font-face/...

size-adjust - https://developer.mozilla.org/en-US/docs/Web/CSS/@font-face/...
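
For anyone who hasn't used them yet, here's a rough sketch of how they fit together for the fallback case (the font names and percentages below are invented; you'd derive the real values from the web font you actually ship):

    /* Hypothetical fallback face whose metrics approximate the web font */
    @font-face {
      font-family: "fallback-for-MyWebFont";  /* made-up name */
      src: local("Arial");
      ascent-override: 90%;     /* ascent as a % of the used font size */
      descent-override: 22%;
      line-gap-override: 0%;
      size-adjust: 105%;        /* scale glyphs so widths roughly line up */
    }
    body {
      font-family: "MyWebFont", "fallback-for-MyWebFont", sans-serif;
    }

Because the fallback face now advertises roughly the same vertical metrics as the web font, swapping the real font in later shouldn't reflow the page.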


Re the stuff about @-moz-document at the end: does anyone remember exploiting IE bugs in parsing CSS to present different rules for IE vs. Firefox, especially to work around IE's broken box model? I remember actually using this[0] hack somewhere long ago:

    div.content {
      width: 400px;            /* IE 5.x (broken box model) uses this width */
      voice-family: "\"}\"";   /* IE 5.x mis-parses this and stops reading the rule */
      voice-family: inherit;   /* reset for compliant browsers */
      width: 300px;            /* compliant browsers get the intended width */
    }
I can't believe the stuff we used to go through.

[0] http://tantek.com/CSS/Examples/boxmodelhack.html


Anyone remember the Meyer CSS reset?

https://meyerweb.com/eric/tools/css/reset/

Version 1 was published in 2008:

    /* http://meyerweb.com/eric/tools/css/reset/ */
    /* v1.0 | 20080212 */
    
    html, body, div, span, applet, object, iframe,
    h1, h2, h3, h4, h5, h6, p, blockquote, pre,
    a, abbr, acronym, address, big, cite, code,
    del, dfn, em, font, img, ins, kbd, q, s, samp,
    small, strike, strong, sub, sup, tt, var,
    b, u, i, center,
    dl, dt, dd, ol, ul, li,
    fieldset, form, label, legend,
    table, caption, tbody, tfoot, thead, tr, th, td {
      margin: 0;
      padding: 0;
      border: 0;
      outline: 0;
      font-size: 100%;
      vertical-align: baseline;
      background: transparent;
    }
    body {
      line-height: 1;
    }
    ol, ul {
      list-style: none;
    }
    blockquote, q {
      quotes: none;
    }
    blockquote:before, blockquote:after,
    q:before, q:after {
      content: '';
      content: none;
    }
    
    /* remember to define focus styles! */
    :focus {
      outline: 0;
    }
    
    /* remember to highlight inserts somehow! */
    ins {
      text-decoration: none;
    }
    del {
      text-decoration: line-through;
    }
    
    /* tables still need 'cellspacing="0"' in the markup */
    table {
      border-collapse: collapse;
      border-spacing: 0;
    }


Yes, but then the <!--[if IE]> conditional comments and friends made those hacks pretty rare.


Speaking of font differences, the default Windows emoji font is so bad I don't understand why Google doesn't just ship their Android emoji font with Chrome and default to that.


"Funny" thing about the Windows emoji font, is that flags are not included. I guess Microsoft is worried it can hurt their business (e.g. Israeli flag, Palestinian flag, Taiwanese flag, etc...).

So, if you need flag icons for any reason (e.g. phone number country code input), you'd have to use a different emoji font.


The Windows 10 ones are atrocious, but I quite like the Windows 11 ones.

https://techcommunity.microsoft.com/t5/microsoft-teams-publi...


The comments on that article are hilarious. Never in my life would I have expected someone to complain about a lack of professionalism in an emoji set. Since when are emoji professional?


> Since when are emoji professional?

If you're asking whether we use emoji in professional settings to convey important information: yes, we do, in spades.

Imagine you see an alert in Slack about a critical system. You can type a full sentence to explain that you've seen it and will investigate what's happening, or you can stick an :eyes: reaction on it and actually focus on investigating.

Same if you want someone to wait a bit because you're thinking etc.

You can always type full sentences to convey the same information, but in a professional setting conciseness and efficiency are also valuable, right?


Agreed. It's not just about smiley faces. I use all the little caution signs, clocks, hourglasses, etc. in UI design all the time. It's the easiest way to get icons into places where they're a pain to put.


pic says thousand words

emoji says about ten

still an improvement


Problem is they don't say the same thing to everybody.

Does the eyes emoji mean you're looking at it, that you want me to look at it, or that you've already seen it? On its own it doesn't really say anything.


There's a blurriness, but it's still pretty workable. You won't ask someone to look at a problem by sticking eyes on it; that needs a lot more communication.

But you can look at an issue, see that three of your coworkers have stuck an eyes emoji on it, and mention them to ask what they think of it, getting clarification on whether they just saw it and didn't care, are still digging deeper, etc.

In a way, the imprecision is what gives the versatility; in other cases, many emoji acquire more limited, standardized meanings over time. Like a "done" sticker that's enough to give a status on a request in a channel, a thumbs up on a proposal, or a green checkmark on things that explicitly needed to be checked.


I once heard an expert on conflict resolution say:

People get pretty good at reading body language. The problem is you never know what conversation the person is having with themselves.


It means "you won't get any other reaction from me about it than an awkward look" ;)


At least for the last couple of years. In fact, actual court cases have been fought over their meaning; those cases included depictions intended to point out that some were meant to be whimsical and some were meant to be serious.

https://news.bloomberglaw.com/bloomberg-law-analysis/analysi...



These look terrible to me. It’s like everything was drawn by a child and basic emotions are dialed up way past 11.


They look creepy to me, quite tacky too.


I liked the Windows 10 emojis with the thick border...


I quite liked them as well. Made them very readable even in small sizes.

But then again, I also loved Android's old blob emojis[0]. Some specific ones were weird, but mostly I really liked the amount of personality and movement they were able to express. But as far as I've been able to tell, most people seemed to hate them, for whatever reason.

[0]: https://emojipedia.org/google/android-6.0.1


Most people I know really enjoyed the blobs too. I never bought Google's justification for the redesign, and I don't think they ever published hard data about it.


Using FontForge to regenerate an export is kinda drastic. What you want is ttx from the fontTools font manipulation toolset. ttx can generate an editable XML file of the font's tables that is easy to make changes in. hhea is obvious to find; for "win", you probably want to look inside OS/2. If you can afford it, the go-to application for this sort of post-production alteration is DTL OTMaster.


Macs are king when it comes to font rendering. It was prioritized in the original Macintosh and Apple still has the most solid font engine to this day.


macOS is literally the only major desktop OS where sub-pixel rendering was removed on purpose, despite the negative impact on everyone using <200 dpi monitors, i.e. most office workers at companies that won't spend $1500 on 5K monitors.


Probably 90-95% of Macs sold come with a monitor, and they're all high-DPI now. Apple doesn't really care about optimizing for some giant 1080p monitor. An iMac is pretty cheap.


In my experience, MacBooks are pretty popular in the corporate world. Sure, my MBP 16" has a high-DPI built-in display, but I'm never going to get the budget for an Apple Studio Display. Offices are equipped with Dell/Lenovo/HP USB-C monitors that are between 100 and 150 dpi. I'm not talking about 20-year-old pixel density here, but modern ultra-wide or UHD monitors.

There's a difference between not caring about low DPI anymore, and crippling font rendering on purpose.


> I'm not talking about 20 year old pixel density here, but modern ultra-wide or UHD monitors

Because the monitor industry largely got to 4K and said "eh, that'll do", a lot of those "ultrawide" and UHD displays with large sizes literally do have twenty-year-old DPI.

Increasing physical panel size (usually) used to translate to higher resolution, but at some point the majority of manufacturers stopped doing this, so you get the same 4K resolution at ever larger physical sizes, and ever decreasing DPI.


Be serious, twenty years ago 90 dpi was pretty common.

Apple sold their 27" 109 dpi monitor until 2016 and killed subpixel rendering in 2018.


This is a tangential rant about HiDPI.

macOS doesn't even render to its own high pixel density displays correctly, owing to the (in my opinion) very naïve algorithm used. If you select any resolution that's not a perfect factor of the display being rendered to, then there is blurriness[1]. macOS renders to a viewport that is 2× the resolution of the 'looks like' setting, and then scales it down to the actual monitor resolution. Clearly, at any non-integer multiple resolution, there is blurring.

This is problematic enough that it defeats Apple's 'good font rendering'. I see shimmering and ringing artifacts around regions of high contrast (i.e. essentially all text) with such a non-native setup. I am forced to use the integer factor resolution, which makes things much too big. Of course, I can scale my browser and VS Code, but besides that the rest of the OS is comically large. Needless to say this also comes with the large performance impact of always rendering to a viewport four times the resolution of a given display. It is also non-intuitive to program against, especially using APIs like GLUT, SDL, etc.

Windows is the only OS that actually does high pixel density rendering correctly for programs that support it[2]. Windows works with the given monitor resolution, and scales UI elements according to the percentage value set (100% is 96 DPI). This is a lot more involved to program for, but when done right, it works exceptionally well. Everything that's not a raster image is always pixel-perfect. If it's not (and people have complained about this[3]), then there's a system setting/registry patch to make it so[4].

Windows also handles moving program windows between displays set to different DPIs quite seamlessly. The only issue I see is when a new display with a different scaling setting is set as the primary (and only) display, and then Windows Explorer scales things weirdly—which is fixed by restarting Explorer.

On Linux... Forget it. On Xorg there are a million environment and per-app-specific configurations to set (just see how long the HiDPI article[5] in the Arch Linux wiki is). On Wayland, things are better, but not yet for me, since I use an NVIDIA graphics card, KDE Plasma, and Chrome, which is the worst possible combination for Wayland. It's not mature enough for this setup: the Windows-esque rendering (they call it 'fractional scaling') was merged only a little more than a year ago[6], and Plasma 5, my DE of choice, still doesn't quite use it yet.

[1]:

[2]: https://building.enlyze.com/posts/writing-win32-apps-like-it...

[3]: https://news.ycombinator.com/item?id=38444967

[4]: https://serverfault.com/questions/570785/how-can-i-make-micr...

[5]: https://wiki.archlinux.org/title/HiDPI

[6]: https://gitlab.freedesktop.org/wayland/wayland-protocols/-/m...


> the Windows-esque rendering (they call it 'fractional scaling') was only merged in slightly more than a year ago[6]

wp-fractional-scale-v1 is not necessary to implement fractional scaling; it's there to make it easier and to solve some edge cases. It was inspired by already existing fractional scale implementations.


Except when you use a “non retina” display.


Sounds plausible but do you have any evidence?



Fun fact about the first Windows 95 Arabic fonts: Microsoft decided to go cheap and not pay Boutros Fonts, a London-based type foundry that designs Arabic fonts, and instead bought a cheaper, derivative 'pirated' copy of that same font for $5k. A lengthy legal battle ensued, with Microsoft ultimately winning it through the might of its money and legal team.


I heard a similar story between Arial and Helvetica.


I struggled with this recently, upgrading qtwebkit from Qt5 to Qt6. Qt5 used platform-specific height values, while Qt6 now uses win height values for consistent rendering across all platforms. It's better in concept; the only issue is that the really big cross-platform browser _doesn't_, so it turns into "why doesn't this font rendering look like Chrome on Mac/Linux?"


It’s not better “in concept,” it’s worse. Not only is it a non-native API, it’s abandoning one aspect of even attempting to mimic being native in favor of “branding.”

If someone insists that an app on the Mac should look the same as on Windows they’re an idiot. If they say an app on the Mac should look the same as a web page on Windows or Linux they’re a malicious idiot.


How do you solve the problem?


Use native frameworks on each platform for which you’re building an application, with a cross-platform non-UI core.


> The ascent is the distance from the baseline to the top of the tallest glyph, so typically 1em. The descent is the distance from the baseline to the lowest point in any glyph. The descent can be different because on web fonts, glyphs like g or p can have tails that extend below the baseline.

Huh? Different from what? It's described as exactly the same thing as the ascent, but down. And why does the author specify web fonts? This sounds like it applies to fonts in general.

Edit: I think the author is just trying to say the descent varies across fonts, even measured in em. I think "web fonts" and "glyphs can have tails" are just red herrings.
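
For what it's worth, the overrides mentioned in the top comment let you sidestep the whole hhea-vs-win question by pinning the ascent and descent yourself, as a percentage of the em. A rough sketch (the font name, file, and numbers are invented; you'd compute the real percentages from your font's unitsPerEm and whichever table's values you want everywhere):

    /* Hypothetical: for a font with unitsPerEm = 2048, hhea ascent = 1900,
       and hhea descent = 500, the em-relative overrides would be roughly: */
    @font-face {
      font-family: "MyWebFont";                     /* made-up name */
      src: url("mywebfont.woff2") format("woff2");  /* made-up file */
      ascent-override: 92.8%;     /* 1900 / 2048 */
      descent-override: 24.4%;    /* 500 / 2048 */
      line-gap-override: 0%;
    }

With those set, browsers on Windows and Mac should compute the same line box from the same numbers, regardless of which metrics table they would otherwise prefer.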


The blog text intersects with the mountains at the bottom, literally unreadable

https://imgur.com/02xrZ2q.png


Literally readable.

    fodder for a horror movie if I've ever heard it.


> What's that? You say you want even more infuriating font stories? Well don't you worry, I'll be back soon with another diatribe about font thickness and antialiasing on the web on Mac vs. Windows.

Did he ever write about this? I can't see anything about it in his list of articles.




