
What "Wayland people"? You probably mean the former Xorg developers who shifted to full-time Wayland development long ago. Xorg is on life support, it has been getting bug fixes and nothing else for the past several years.

From reading the sibling comment, if BSD guys want to keep using Xorg, they'll probably have to maintain it themselves.




> Xorg is on life support, it has been getting bug fixes and nothing else for the past several years.

Imagine two cars: X11 is the old one. It doesn't quite start right, the windows are chipped, the paint is peeling off, and no one really wants to invest money into maintaining it. It defaults to brakes and a steering wheel from 1980, but it has seen continuous upgrades over time and you can generally swap in a steering wheel from 2010 with minor problems.

Now imagine wayland, a brand-new tesla. It doesn't have brakes or a steering wheel, because history has shown that these concepts evolve and, if anything, a third-party provider should create them. Who cares that it took ten years between the car being released as ready for use and the first compatible steering wheel implementation? Who cares that getting it to run on half of the roads (NVIDIA) is still not a solved problem because they stripped out any abstraction?

> From reading the sibling comment, if BSD guys want to keep using Xorg, they'll probably have to maintain it themselves.

As opposed to wayland, which pushed 90% of features onto the KDE/GNOME/etc. guys (to be reimplemented in dozens of incompatible APIs). Of course, the people who wrote wayland also wrote X11, so removing themselves from the equation might have been the nicest thing they ever did, given their own opinion of their past work on X11.


>Who cares that getting it to run on half of the roads (NVIDIA) is still not a solved problem because they stripped out any abstraction.

a) nvidia's refusal to implement GBM in their driver was their own choice. The abstraction was never removed; GBM is the abstraction over all drivers.

b) nvidia already relented and implemented GBM in their driver.

The latter doesn't necessarily mean nvidia is a good choice of GPU even now, because it requires a proprietary driver, so compositor / Mesa / kernel devs cannot debug the full stack when anything goes wrong. So having your problems ignored is something you'll have to get used to if you choose to use hardware that requires proprietary drivers, regardless of whether you use it with X or wayland.

>As opposed to wayland which pushed 90% of features on the KDE/GNOME/etc. guys (to be reimplemented in dozens of incompatible APIs).

wlroots exists to solve that problem. Whether an individual compositor decides to use it or not is up to the compositor.

At least in KDE's case, wlroots did not exist at the time they added Wayland support so of course it's understandable that they don't use it. There's a fork of kwin that uses wlroots ( https://gitlab.com/kwinft/kwinft ) but I believe it's just an experimental one-person effort rather than anything that kwin devs are working on as a replacement.


Afaik the Steam Deck will use KWinFT, so if Valve is willing to bet on it, I don't think it's such a fringe project.


josefx: Since your reply was flagged, I'll reply here.

>Still not going to amputate my leg over a stubbed toe even if RMS considers the toe cancer.

I didn't say you should. I worded what I wrote specifically to indicate that I'm not passing any judgement on whether you made the right choice or the wrong choice.

There are many people who bought nvidia GPUs because they worked fine, and were rightfully worried that they'd stop working fine if their DE of choice decided to switch to wayland or became abandoned. I empathize with their situation completely.

All I'm saying is that you made the choice to buy hardware that requires a proprietary driver, and so you have to live with the consequences of that choice. This is not something unique to this situation involving nvidia GPUs. Only you have the right to decide whether it was a good choice or a bad one.


[flagged]


I really don't see why a bunch of unpaid volunteers should bother to support the only player in town that refuses to play nicely and tries to strong-arm everybody else to use its technically inferior solution.

Here's a nice write-up. I can imagine how nice it is to spend all your waking time trying to improve the Linux graphics stack and then listen to all the bullshit that we see in this discussion.

https://drewdevault.com/2021/02/02/Anti-Wayland-horseshit.ht...

That said, from what I heard, before nvidia backed down, GNOME and KDE developers started adding support for it in their Wayland compositors.


> I can imagine how nice it is to spend all your waking time trying to improve the Linux graphics stack and then listen to all the bullshit that we see in this discussion.

Is this really surprising? End users want their stuff to keep working with minimal changes. End users don't know and/or don't care that X is hard to maintain. They've seen wayland coming, and they're scared that some of the stuff they currently use will break. How exactly do you expect people to react to:

> Maybe Wayland doesn’t work for your precious use-case. More likely, it does work, and you swallowed some propaganda based on an assumption which might have been correct 7 years ago. Regardless, I simply don’t give a shit about you anymore.


The effort to improve is welcome and appreciated.

Casting multi-use graphical computing aside is unwelcome.

Most of the toxicity centers on HOW that discussion has played out.

To be fair, many users coming up on single user graphical computing have no idea what the problem is, do not have use cases and we all know the rest.

The users who do understand all that are pissed. They are being told none of it really matters or is necessary, and so on...

Of course all that is definitely not appreciated however spiffy watching videos about it all may be.

And, those users totally get the need to see the embedded use case improvements get done, and still are being asked to just forget multi user graphical computing was ever a thing because reasons...

So yeah, here we are.

People will care about Wayland exactly as much as the Wayland team cares about them.

Multi user graphical computing aware users are not cared about at all despite their repeated explanations on how they depend on that capability. Depend, as in, not having it becomes very expensive for them. Expensive enough to wash away all the value they are being told matters more to everyone else.

Non multi user graphical computing aware people basically just want it settled so they can have few overall worries. They feel some care aimed their way.

The embedded people are super happy and are cared about a lot.

All adds up to a very toxic state of affairs.

Factoring it all down:

Did we absolutely have to trade multi user graphical computing away?

Should the answer be no, this whole ugly mess will go away.

Continuing with a yes means a very painful and drawn out mess for years to come.

And it will be that way because there are plenty of people who really do get a lot of value out of multi user graphical computing.

Who knew?

I bet the former X devs did. And they just do not care.

Why exactly should they get any back in return?


Xorg barely works on nvidia (source: me). Vsync requires a compositor to work, and even picom needs a very specific combination of flags to get vsync to work. Chromium cannot GPU-accelerate video playback. On rolling release distros, most major kernel updates leave your system broken, because nvidia's out-of-tree driver can't build against new kernel versions. And if nvidia stops supporting your old GPU, like they just did with Kepler, you're screwed.
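
For illustration, one commonly-tried combination is the following (just a sketch; the exact flags that actually work vary by driver version and setup):

```shell
# Use picom's GLX backend and enable its vsync option, a frequently
# suggested starting point for tear-free output on the proprietary
# nvidia driver:
picom --backend glx --vsync
```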

The problem with nvidia on linux isn't wayland, it's nvidia.


I have used strictly Nvidia gpus, on debian Linux, for 20 years, using the binary drivers, and I have none of the issues you describe. It just works, very well, stable and fast. Xorg is an amazing piece of software and combined with openbox it does exactly what I need.


> Xorg is an amazing piece of software

Based on what? Because I'd rather believe its maintainers.


Maybe you didn't read my comment? Based on having used it for over 20 years.


That says something about its stability. It says nothing about its fundamentally bad abstraction of modern graphics stacks, which can't be fixed.


Users don't care about the underlying abstraction. The points raised before at least covered features that users cared about, even if they were presented without context. Vsync worked fine on any NVIDIA desktop system I used, especially without a compositor.


Users do care when the bad abstractions cause bugs that linger and don't get fixed, of which there are quite a lot...


Xorg works just fine with Debian and Nvidia (source: me). Some longish (5-10) years ago it got so stable that I managed to stop thinking about it, because it Just Works. Even across upgrades. Your problem lies not with Linux, Xorg, or Nvidia.


https://donhopkins.medium.com/the-x-windows-disaster-128d398...

>If the designers of X-Windows built cars, there would be no fewer than five steering wheels hidden about the cockpit, none of which followed the same principles — but you’d be able to shift gears with your car stereo. Useful feature, that. - Marcus J. Ranum, Digital Equipment Corporation


> Who cares that getting it to run on half of the roads (NVIDIA) is still not a solved problem because they stripped out any abstraction

By "they", do you mean the linux kernel devs? Because it has absolutely nothing to do with wayland. Nvidia cards' proprietary drivers work with X because you are using a part-binary blob for X. Also, nvidia finally realized that they should goddamn support linux, so what all of this resulted/will result in is better integration for people with nvidia cards.


FYI there are crowdfunding efforts to keep Xorg maintained https://news.ycombinator.com/item?id=29034479 . The recipient is the current X server maintainer so it's likely this is the best way to help keeping Xorg maintained.


Cool. I will push the money their way. I believe in multi-user graphical computing, and I'd hate to see it go.


It hasn't gone, it has just moved to other places, i.e. the web browser. In my opinion, X is a failed experiment, we know now that there are better ways to do things.


The web browser is not multi-user graphical computing.

And that is not a negative about web browsers or all the things we're doing with them. Just to be clear.


I would be interested to know why you think that, is this website not a multi-user graphical thing running on a computer?

Edit: Also I'm pretty confused as to why anyone refers to X as "multi-user", are you talking about multi-pointer X? That doesn't really have anything to do with X in particular and is also possible in Wayland.


Well, on an X window display, I could have:

Fonts from one machine. Window management from another. An application running on yet another, with its program data sourced from yet another machine, accessing data on still another machine. All of which is displayed on another machine that also supplies user input.

Or, I could do something crazy like put a window on your display, with appropriate permission of course, and you could interact with it.

A big wall type display could take windows from a number of users.

The promise back then, and something I used a lot and many still do use today, is being able to run something and display it somewhere else. Say my cellphone is on my desk. I could ask it to do something for me, and the window into that activity appears on the display like any other window does.

Another case might be several users running on one machine each with their own displays and inputs.

Here is a real world case:

High end CAD software, managed data, many users.

With X, one can set up a big application server, and where the application and its data actually live is not visible to users at all.

Users run the program via X, running X servers on anything they want. PC, Mac, Linux, whatever.

The only way to interact with the managed data is through the application.

One copy of the application, one data repository, many users.

With X doing that kind of thing is easy, and it works whether one user runs the app on their local machine attached to a shared data repository, or many users run on a remote machine perhaps that machine itself also holding the data repository.
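
To make the "run here, display there" part concrete, a minimal sketch with standard tools (the hostnames and the client program name are placeholders):

```shell
# Run a client on a remote host and display it on the local X server
# via SSH X11 forwarding ("app-server" and "cad-app" are placeholders):
ssh -X app-server cad-app

# Or point any X client directly at a remote display (that display
# must accept the connection, e.g. via xauth/xhost):
DISPLAY=workstation:0 xterm
```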


All of that is extremely possible with a web app though, and most of it is probably done even easier with a web app. In fact that is the usual way to build a web app, make a server that does the heavy lifting and then make a GUI that runs in the client which can then be accessed by multiple users. You can easily access them from a smartphone too.

The only exception is this:

"Another case might be several users running on one machine each with their own displays and inputs."

This would be multi-seat which doesn't really have anything to do with the display server. It is implemented in udev and logind, which spawns additional X or Wayland servers for each additional "seat".
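
For example, seats are inspected through logind's tooling rather than through X or Wayland (a sketch; assumes a systemd-logind system):

```shell
# List the configured seats, then show which devices are
# assigned to the default seat:
loginctl list-seats
loginctl seat-status seat0
```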


Seems you highlighted the differences nicely enough.


Is that sarcasm? A key difference I would say is that the web is actually better because you can run javascript or WASM code in the browser, in X that would be the equivalent of the "server"...


Not at all.


> Xorg is on life support, it has been getting bug fixes and nothing else for the past several years

I don't think life support is appropriate here. It still works well.


While it's supposed to be "maintenance only", it recently got a very important addition (though by no means a definitive solution) that helps with dreaded multiscreen setups: AsyncFlipSecondaries.
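
If I understand it correctly, it's exposed as a modesetting driver option in xorg.conf (a sketch, assuming xserver 21.1+ with the modesetting driver; the identifier is a placeholder):

```
Section "Device"
    Identifier "Card0"
    Driver     "modesetting"
    # Let secondary outputs flip asynchronously so a slow secondary
    # monitor no longer throttles the primary one:
    Option     "AsyncFlipSecondaries" "true"
EndSection
```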



