Well, it doesn't rely on using one giant buffer spanning multiple monitors, so it supports HiDPI scaling. Have you ever tried HiDPI on Xorg? It's a mess, even on Ubuntu, where they tried their best to clean it up. The Xorg apps running through the compatibility layer still suffer from it, and probably will until they are updated for Wayland.

Side note: Google tried to use Xorg for ChromeOS and, when it didn't work, ended up writing their own UI system, for HiDPI scaling among other things.

Some relevant links for those curious about other people running into this:

https://www.foell.org/justin/simple-hidpi-monitor-scaling-wi...

Gnome team:

https://wiki.gnome.org/Initiatives/Wayland/XWayland

Setting DPI in .Xresources has worked fine for me since 2014, except for Firefox and Chrome, which took a year or so to adapt.
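For anyone wanting to replicate this, the recipe is just two commands; the value 192 here is an assumption (2x the 96-DPI baseline), adjust for your panel:

  echo 'Xft.dpi: 192' >> ~/.Xresources   # 192 = 2x the 96-DPI baseline
  xrdb -merge ~/.Xresources              # apply without restarting X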


https://donhopkins.medium.com/the-x-windows-disaster-128d398...

>My super 3D graphics, then, runs only on /dev/crt1, and X windows runs only on /dev/crt0. Of course, this means I cannot move my mouse over to the 3d graphics display, but as the HP technical support person said “Why would you ever need to point to something that you’ve drawn in 3D?”

>Of course, HP claims X has a mode which allows you to run X in the overlay planes and “see through” to the graphics planes underneath. But of course, after 3 months of calls to HP technical support, we agreed that that doesn’t actually work with my particular hardware configuration. You see, I have the top-of-the-line Turbo SRX model (not one, but two on a single workstation!), and they’ve only tested it on the simpler, less advanced configurations. When you’ve got a hip, forward-thinking software innovator like Hewlett-Packard, they think running X windows release 2 is pretty advanced.


Try to set two different DPIs in that.


You certainly can, but the problem is that there is no agreed standard among the toolkits for doing it. E.g. Qt has its own way.


Why does Qt need to be smarter? Just asking.


Qt runs on platforms where there is literally only a kernel plus the Qt app, rendering through EGL for instance. So it has to have its own way.


I don't have a second monitor right now to double-check, but I have this in a script for our office monitors, to get a different DPI for one display (named eDP-1):

  xrandr --dpi 100/eDP-1


That’s for a single monitor. Now try two monitors with different DPIs.
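The closest X11 gets there is the well-known --scale hack, which is exactly the blurry downscaling discussed elsewhere in this thread (output names below are examples):

  # render everything at 2x, which is right for the HiDPI panel...
  export GDK_SCALE=2
  # ...then downscale the low-DPI monitor to compensate
  xrandr --output eDP-1 --auto \
         --output HDMI-1 --auto --scale 2x2 --right-of eDP-1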


The final "compositing" step in Wayland requires Wayland to use a "giant" buffer as well. Also, my GPU has 16 GB of GDDR; how can any framebuffer be "giant" in that context?
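Back-of-the-envelope, assuming two 4K outputs at 4 bytes per pixel:

  # two 4K framebuffers at 32bpp, in MiB
  echo "$(( 2 * 3840 * 2160 * 4 / 1024 / 1024 )) MiB"   # prints "63 MiB"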

HiDPI scaling is something toolkits have to manage (on both Wayland and X11). On X11, all the necessary pitch information to do so is available via the XRandR extension.
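That pitch information is right there in the query output (the sizes shown below are example values):

  # each connected output reports its pixel and physical size,
  # which is all a toolkit needs to compute per-monitor DPI
  xrandr --query | grep ' connected'
  # e.g.: eDP-1 connected 3840x2160+0+0 ... 344mm x 194mm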


You really don't want to use XRandR information to do scaling; it's not accurate. A better way would be to set a property on the window and have the compositor read that. AFAIK this is the solution the KWin developers were working on.


What's hidpi? I mean, pragmatically. I can guess that that acronym means "high dots per inch", but I don't follow.

One of my desktops uses 2x 4k monitors, is that hidpi? X works fine... X also worked fine on an older setup with 4 monitors.

Also, I don't know Wayland internals, but I'm gonna assert without proof that any low level graphics interface is gonna involve a framebuffer at some point...


Basically:

You have a Microsoft Surface Book, Chromebook Pixel, or MacBook with a Retina display, let's say at 300 DPI, and you try to plug it into a monitor that is 70 DPI. The buffer cannot be adjusted to handle both monitors properly, because one is 300 DPI and one is 70 DPI; xorg is internally designed around the assumption that every monitor has the same DPI. There are tricks and hacks (I've seen the Canonical team pull this off) that can "trick" xorg into displaying properly, with some minor visual artifacting. The last time I tested this was with Ubuntu a few years ago, back when Wayland support was "experimental". I had this exact sort of setup, and Wayland was able to pull it off 100% better in every way, because it is not designed around this limitation.


> The buffer will not be possible to adjust to properly handle both monitors because 1 is 300 DPI and 1 is 70 DPI.

The word "properly" here is quite subjective, though. I have a similar setup, combining monitors with different pixel sizes, and it works properly with Xorg: when I move a window of size WxH pixels from one monitor to another, it remains a window of WxH pixels. A thin line of one-pixel width remains so. A checkerboard pattern that tickles the monitor refresh rate remains so. This sounds like perfectly appropriate behavior, and anything else would be ridiculously inappropriate. I accept that this usage of "appropriateness" is subjective to a subset of people that includes me, but it may not be universal.

Whatever, this is such a thin selling point for Wayland: hey, you lose copy-paste, screenshots, xdotool, and most of the apps you used before. But hey! You can combine monitors of different pixel sizes as if they had the same pixel size! See, the thing transparently re-scales your windows using cheap-ass bilinear interpolation so that they come out the exact same size in millimeters! Fancy, isn't it?

No. It's creepy.


> hey, you lose copy-paste, screenshots, xdotool and most of the apps you used before.

Wayland has had working screenshots, screen recording, clipboard functionality for text and arbitrary mimetypes, etc. for years on wlroots, GNOME, and KWin. It also has ydotool, an xdotool alternative. For pure keyboard automation it also sports wtype.
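On wlroots compositors, for example, the usual commands look like this (grim/slurp, wl-clipboard, and wtype are separate packages):

  grim -g "$(slurp)" shot.png   # screenshot an interactively selected region
  echo hello | wl-copy          # put text on the clipboard
  wl-paste                      # read it back
  wtype 'typed on Wayland'      # synthesize keyboard input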

Which apps don't support Wayland? The only ones on my machine that need XWayland are some video games and FLTK apps like Dillo (I have to test with xwininfo, since it's normally impossible for me to notice whether a program is using XWayland). All GTK/Qt apps work OOTB on Wayland as first-class citizens, especially since those toolkits nowadays receive more Wayland testing than X11, given that barely any current distros still ship X in their default installations.
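The test, for reference: xwininfo can only pick X11/XWayland windows, so getting geometry back means the app is going through XWayland, while native Wayland surfaces can't be selected at all.

  # run, then click the window in question
  xwininfo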

Which Wayland compositor/version and GUI toolkit/library gave you blurry bilinear filters?


> barely any current distros still ship X in their default installations.

Bold statement. Citation needed? Debian does, and that's hardly a small percentage of market share. I guess if you include Android as a Linux distribution then you might be correct.


Debian 10, Debian 11, OpenSUSE, Fedora, RHEL 8, Ubuntu, and others ship GNOME on Wayland by default right now. It receives better support than the X version.

KDE upstream is also Wayland by default which is reflected in the KDE version of OpenSUSE, the Fedora KDE spin, and Fedora Kinoite.


No, Android's doing its own thing, not X11 or Wayland.


Ah, I think I get it. You want to use a "point" abstraction, not a "pixel" abstraction, right?

In your example, I imagine everything would work just fine, but images will be physically bigger on the 70 DPI monitor because, well, the pixels are bigger. I'm guessing that Wayland adds a layer of indirection, resampling the framebuffer based on the DPI to preserve the same point size across displays with different pixel sizes.

I subscribe to the "end user is always right" philosophy, so if you want that functionality, you should be able to have it. But I wonder how popular that desire is. It's far from universal.

Personally, I find resampled images to be so distracting and distasteful that it makes a computer hard to use for more than a few minutes. I hate fuzzy text, fuzzy windows, and fuzzy pixels. Any time I've been forced to use a device at something other than its native resolution, I've gotten preoccupied with "how can i fix this ugly crap" until I get native resolution working, and I'm sure I'm not alone.

It's cool that Wayland does that for you, because you want it, but consider that to others, it might be a bug, and not a feature. :)


You are incorrect in your belief that a modern Linux graphical stack centered on Wayland ever resamples or interpolates a virtual or actual framebuffer except for compatibility with apps that have not been adapted for Wayland. (Those apps use something called XWayland, which is blurry if the user has configured a non-standard size for things.)

99.9% of fonts these days, most icons provided by the OS, and most other graphical elements (e.g., rounded corners) are specified as mathematical descriptions of curves that can be rendered at the display's native resolution regardless of what size the user chooses them to be. (I believe the 0.1% of fonts that are not mathematical descriptions of curves are called bitmap fonts.)

Note that I am not talking about multiple displays, but rather a single display where the user wants things to be bigger than they are by default.

I know this because I have used all 3 major operating systems recently (macOS, Windows 10 and Linux/GNOME) on plain old 96-DPI monitors, such that if there were any interpolation or scaling algorithm, I would be able to see the effects.

On a monitor 1680 pixels wide, GNOME gives me a choice of the scaling factors 100%, 125%, 150%, 175% and 200%. On a monitor 1920 pixels wide, Windows 10 gives me a choice of the scaling factors 100%, 125%, 150% and 175%. Some apps are blurry as hell, but on both Windows and GNOME I was able to live my life using only the apps that don't have the blurriness problem. Then again, I am not forced to use old Windows apps provided or specified by my employer, and I don't need to share my screen (which I am told does not work yet on a pure-Wayland setup like mine).
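(For anyone trying to reproduce the GNOME side of this: those fractional factors typically only appear after enabling mutter's experimental feature; whether this step is still required depends on the version.)

  gsettings set org.gnome.mutter experimental-features "['scale-monitor-framebuffer']"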

On Windows 10, I used Google Chrome, some apps such as Settings and Timer provided as part of the OS, and VS Code. On Linux I used (and continue to use) Google Chrome, modern GNOME apps and Emacs. On Linux, I need to open Chrome with specific flags to get it to be non-blurry while the visual elements are a non-standard size, and I need an Emacs built from a git branch called feature/pgtk. Both apps have some bugs when used this way, but the bugs are definitely live-with-able, and I expect them to be fixed eventually.
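(The "specific flags" are presumably Chrome's Ozone/Wayland ones; that's my assumption of what was meant, for the Chrome versions of that era:)

  # run Chrome as a native Wayland client instead of via XWayland
  google-chrome --enable-features=UseOzonePlatform --ozone-platform=wayland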


> You are incorrect in your belief that a modern Linux graphical stack centered on Wayland ever resamples or interpolates a virtual or actual framebuffer except for compatibility with apps that have not been adapted for Wayland.

This can be true only if you use a really narrow definition of "apps adapted for Wayland". For example, how could a viewer of PNG files ever be adapted to Wayland? A PNG file is just an array of pixels (typically without a meaningful DPI). How are you going to display it without interpolating and resampling?


Yeah. I suppose this functionality isn't everyone's cup of tea. The cool thing about Wayland, though, is that it somehow makes the windows still look crisp and good while keeping them the same size. With the Canonical hack to Xorg, they always looked blurry and gross. The GNOME wiki had this to say on it:

"On a Wayland display server, each connected monitor will have a scale that depends on the DPI. A scale affects the scale clients draw surface contents when they are visible on said monitor."

So it seems like it actually modifies the drawing at the application level rather than rescaling it like an image.


> So it seems like it actually modifies the drawing at the application level rather than rescaling it like an image.

X also supports exactly this. Qt uses this information; GTK doesn't, for whatever reason, forcing the workarounds.

Most of the so-called "X limitations" are actually just toolkit bugs, if you get into the weeds.
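For the record, Qt's use of this on X11 is steered by environment variables (the output names below are examples):

  # let Qt compute per-screen scale factors itself (Qt 5.6+)
  export QT_AUTO_SCREEN_SCALE_FACTOR=1
  # or pin explicit factors per screen
  export QT_SCREEN_SCALE_FACTORS="eDP-1=2;HDMI-1=1"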


GTK doesn't support it because nobody implemented it. IIRC the Qt support is also pretty buggy and doesn't work on some compositors.


Then what is your complaint? No one is stopping you from using X for as long as you like. Meanwhile the majority of users that want sane, modern handling of HiDPI can move on.


> Then what is your complaint?

What complaint? Why do you think I'm complaining? I wanted to know what's so fundamentally wrong with X to warrant a massive developer effort to replace it.

Based on the comments here, I have my answer: Nothing's wrong with X. Also, something lighter weight is desired for embedded platforms.

Some fraction of the population, like you, wants their display system to render in terms of points, not pixels. To me, and many others, this would be hell, but hey, different strokes for different folks: good software does what the user wants, not what the developer wants.

> ...majority of users that want sane...

You troll. :P I think the majority of users want sane display systems that don't resample and change resolutions away from the native resolution of the hardware. To you, the majority of users want fuzzy text and ugly UIs. We're both wrong to assert this as some obvious fact, because neither of us speaks for any user but ourselves!

If I have any complaint (which, again, I'm not sure I do!), it's that I don't appreciate my reliable computing setup breaking just because some young developer decided the programs I use are "too old".


If you buy a high-pixel-density display (e.g. a 15-inch laptop with a 4K screen), then you need to be able to map 2 real pixels to 1 logical pixel, which is what Apple did with Retina displays.
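Worked numbers, with an assumed 15.6-inch 16:9 panel: 3840 horizontal pixels over roughly 13.6 inches of width is about 282 DPI, so a 2x mapping yields a sane 1920x1080 logical desktop:

  awk 'BEGIN { w = 15.6 * 16 / sqrt(16^2 + 9^2);  # panel width in inches (~13.6)
               printf "%.0f DPI\n", 3840 / w }'   # prints "282 DPI"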


And this already works perfectly fine with X11. You only encounter issues when you add an additional display with a different DPI. But dual-monitor setups are the exception, and mixed-DPI setups are even less common.


Mixed-DPI setups are very common. A Retina Mac getting docked to external monitors of a different DPI is tremendously common all over the place. Ironically, on the Windows side I see tons of non-HiDPI laptops that end up docked to higher-DPI external monitors, but the other direction isn't exactly rare either. Even if you don't use those monitors simultaneously, this is still mixed DPI for all the apps that are currently running.

This does not work smoothly at all in X11. It is a mixed bag on Windows, for reasons that are well documented. The compromise chosen by Apple works reasonably well.

This is not about my personal preferences, contrary to what the GP asserts; it's based on observing hundreds of deployments in industry and academia, which both Apple and Microsoft are very much aware of: they don't try to support these features for no reason at all. I personally don't love the scaling done by macOS, and in an ideal world apps would just magically scale geometry and "do the right thing", but that is fantasy land, and the best FLOSS Unix has to offer with X11 does not meet a lot of users' needs.

As for nothing being wrong with X? Lol, this is a display system that still does not have a sane, unified way to prevent screen tearing, despite 30 years of shitty attempts.


Not really; you will probably have to deal with this any time you connect your laptop to a projector.


That’s why, e.g., Macs default to integer-only scaling. But using them at 1x will make everything unusably small.


Basically it is the same for Windows 10. It insists on using small fonts on my laptop, so I have to scale it. In the golden age, you could tell X the size of your screen.
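(That knob still exists, for what it's worth, as the DisplaySize entry in xorg.conf's Monitor section, or at runtime via xrandr; the millimetre values below are examples:)

  # report the physical screen size to X so it derives the matching DPI
  xrandr --fbmm 344x194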



