> When it works, it's smooth as butter and works great

Come on, Wayland was released 12 years ago! And it still has all kinds of problems that keep it from qualifying even as beta software. I'm highly skeptical that Wayland will ever be as useful as X11 is right now.




I take it you don't remember X11 when it was 12 years old. Wayland looks quite good in comparison.


X was 1984. X11 was 1987. Twelve years later in 1999, it worked just fine.


To give you an idea of 1999: I was running Quake 3 on Linux with official Nvidia Riva TNT support under X11.

X11 also ran fine in 1995. Granted, I think fvwm2 was probably state of the art for window managers at the time. From '95 to almost '97 or '98 I stuck pretty much to the 80x25 or 80x40 text console. Not because X11 didn't work, but more because xterm really sucked and the only apps you would use in X11 anyway were Netscape, Gimp or xv (image viewer). Gtk+ also did not exist, so everything was ugly Motif or Tcl/Tk.

Oh, and... there was always talk of replacing X11. Even in the '90s. People have been talking about replacing X11 nearly as long as X11 has been around, sadly.


In 1995 the state-of-the-art WM was probably GWM, which was fully programmable in a Lisp dialect. You could even draw your window decorations in Lisp using various primitives. It was never widely used though.


Even before that. I remember installing X11 on PCs in 1995. And it worked much earlier than that on UNIX workstations.


Sorry, I meant X. Wrote X11 out of habit. But X11 is a version of X, so I still think the relevant date is 1984, and in 1996 I was still writing modelines in my X conf file... and at least with Wayland you don't have to worry about a misconfiguration frying your monitor.
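
For anyone who never had the pleasure: a hand-written mode in XF86Config looked roughly like this (the numbers are the standard VESA 1024x768@60 timing, shown purely as an illustration; the monitor identifier is made up):

    Section "Monitor"
        Identifier  "Generic Monitor"
        HorizSync   31.5-57.0
        VertRefresh 50-90
        # pixel clock in MHz, then horizontal and vertical timings
        Modeline "1024x768" 65.0  1024 1048 1184 1344  768 771 777 806 -hsync -vsync
    EndSection

Get the sync ranges wrong on a fixed-frequency monitor and the card would happily try to drive it out of spec, which is where the "frying" stories came from.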


Wayland (and X11, and any other modern display system) will absolutely fry non-multisync monitors. Those were common in 1996.

Hardware has just improved enough that we don't need to care.

This isn't wayland vs X11.

X11 is terrible for other reasons, but let's not give wayland credit for hardware improvement.


As another comment notes, this is more due to EDID and multisync monitors: hardware and protocol improvements in consumer electronics.

X was actually okay in 1996 on the proprietary Unixes with their proprietary graphics cards and proprietary monitors (at least on HP-UX, Solaris, and IRIX). Linux and the *BSDs had some catching up to do, primarily in hardware support and autoconfiguration. But they did so quite rapidly.


Then again, was there even an alternative back then?


I think, based on the responses, that your comment is quite wrong. It seems X11 was completely functional at the time, which Wayland is definitely not at this time.


It’s not that hard to understand... Wayland is a protocol. If you have a few badly implemented browsers, is the web protocol bad?

Also, it is questionable at best to state that Wayland compositors don't work. GNOME is quite stable, and with PipeWire everything just works. Sway is similarly stable software.


In practice I do not care about the difference between wayland compositors and the wayland protocol. When I say “wayland sucks” what I mean is that every time I have tried any “wayland” desktop or WM, it has not worked for my workflow and has tons of bugs and missing features. I do not care whether this is because of the protocol or some other reason.

Random example: in Sway, Chrome and all Electron apps have blurry text on displays with scaling other than 1.0. Every time I try “wayland” (or whatever term you would prefer I use), there is some show stopping bug like this.

I feel like the conversations here usually go like this, with people just talking past each other:

A: “X11 is going away, time to switch to Wayland!”

B: “Okay, but last time I tried it, it didn’t work for me because <list of reasons>”.

A: “No no no, you don’t understand, that’s not Wayland’s fault, it’s a compositor issue.”

B: “Well, whatever the issue is, I’d like to keep using X.”

A: “But X is going away! It’s deprecated!”


> Random example: in Sway, Chrome and all Electron apps have blurry text on displays with scaling other than 1.0.

That's literally because of X. Both of them run under XWayland by default, but Chrome already has native Wayland support (it just has to be enabled), and Electron has it too since it's built on Chrome, though I believe most Electron apps out there aren't built against a version that supports it yet.
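
(If you want to try it: recent Chrome/Chromium builds can be switched to the native Wayland backend with command-line flags roughly like the ones below; the exact flags have shifted between versions, so treat this as a sketch.)

    # run Chrome on the Ozone Wayland backend instead of XWayland
    google-chrome --enable-features=UseOzonePlatform --ozone-platform=wayland

    # newer builds (and Electron apps built on them) also accept the auto hint
    google-chrome --ozone-platform-hint=auto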

Also, no one says you SHOULD change. Feel free to use whatever you want. But saying Wayland sucks when you didn't even use goddamn Wayland to evaluate it just "sucks"...


> That’s literally because of X

I don’t care — that was the whole point of my comment.

> Also, noone says you SHOULD change

Yes they do! People are saying all the time that X is deprecated and going away, and we need to switch to Wayland.

If other people want to use Wayland, it of course doesn’t bother me; I just hope it never becomes the standard and pushes out X, which works fine for me.


> Yes they do! People are saying all the time that X is deprecated and going away, and we need to switch to Wayland.

You can continue to run, e.g., Linux 3.x if you so wish. And X is stable software; it will continue to work reliably for the foreseeable future, even without active maintenance.


> It’s not that hard to understand... Wayland is a protocol. If you have a few badly implemented browsers, is the web protocol bad?

Did you read the comment I was responding to? It had already conflated the two. However, in the spirit of not being petty, I ignored the conflation and decided to address the real issue: that implementations of Wayland are not stable after 12 years, while X was actually stable after about 8.


The Wayland protocol was released 12 years ago. Compositors have only really been implementing it within the last few years.


In all fairness, my experience is that most of the issues are in the broader ecosystem at this point.

Wayland, itself, is just a protocol. The display servers are just the display servers. But then you have Gtk and Gnome, Qt and KDE, software like Firefox and Chrome, etc. All of those have to be updated as well, and while Wayland the protocol, the display servers, and a lot of the infrastructure around them are in pretty good shape, the toolkits and clients and so forth all still need a lot of work.
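
(As a rough illustration of where the seams are: the big toolkits already ship Wayland backends you can force at runtime, assuming the relevant platform plugins are installed; the application names below are just examples.)

    GDK_BACKEND=wayland gedit          # GTK
    QT_QPA_PLATFORM=wayland kcalc      # Qt, needs the qtwayland plugin
    MOZ_ENABLE_WAYLAND=1 firefox       # Firefox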

And you only need to look at the Python 3 transition to see how hard it is to shift a whole ecosystem.

So I'm willing to cut the Wayland guys some slack. But it does mean there's still a ways to go, yet, before things are ready for prime time.


The perspective that wayland is just a protocol and doesn't need to address real issues is why it has so many of them to start with.


That I completely agree with.

The folks involved in designing the Wayland protocol seemed to have shed a lot of key features in the name of simplicity or security, forgetting that the solution actually has to meet real, human user needs. Those needs include things like screenshots and screensharing, global hotkey binding, etc, etc.

To address that, we now have a range of extension protocols, which is creating fragmentation. The CSD-vs-SSD debate is a perfect example.

Things are slowly coalescing--PipeWire is maturing, for example, filling some key gaps--but it's taking time and meanwhile IMO the ecosystem continues to be too immature for broad adoption.


Screen sharing and screenshots weren't an afterthought. Weston had support for those for a long time, but gnome and kde decided to do something different, probably because they saw pipewire was maturing.

Allowing arbitrary clients to globally capture keys at will is impossible to do without opening up a hole for keyloggers. Maybe someone will figure out a good way to do this but I wouldn't hold my breath. You're better off writing a compositor extension to do what you want.

The CSD-vs-SSD debate isn't anything new, there were apps that used CSD before wayland, and there were X11 window managers that didn't draw any window decorations. There is fragmentation there but it's caused by the apps, you won't fix that one without rewriting all of them to have the same policy on decorations, probably that means redesigning all of them to use the same widget toolkit and designs.


I've heard all the excuses.

None of it changes the fact that Wayland was either intentionally or unintentionally designed to exclude extremely common software use cases, or worse, to make those use cases someone else's problem (e.g. screen sharing), thereby creating fragmentation in the ecosystem due to a lack of standardization.

After all, it's pretty rich to blame Gnome or KDE for "[deciding] to do something different" when Wayland was very deliberately designed to offer no standard for how to do the thing in the first place.

I'd actually prefer it was unintentional, as that would imply simple oversight. If it was intentional, that implies deliberately bad design choices that have gotten us to the semi-broken place we are right now; a place we're only finally getting out of as other people (e.g. the PipeWire folks) step in to cover up the spike-filled holes that Wayland has left behind.


> None of it changes the fact that Wayland was either intentionally or unintentionally designed to exclude extremely common software use cases

The Wayland protocol is multi-layered, with the core being deliberately limited to displaying content in rectangles properly. That's it. But it also allows for querying the capabilities of the compositor, with versioning, making extensions possible. There was a recent Show HN submission with this site: https://wayland.app/protocols/
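
To make that concrete, here's a minimal sketch of what that capability query looks like from the client side: plain libwayland-client C that just lists the advertised globals (build with -lwayland-client). Every core interface and extension shows up as a named, versioned global.

    #include <stdint.h>
    #include <stdio.h>
    #include <wayland-client.h>

    /* Print every global the compositor advertises: core interfaces and
     * extensions alike show up as (name, interface, version) triples. */
    static void handle_global(void *data, struct wl_registry *registry,
                              uint32_t name, const char *interface,
                              uint32_t version)
    {
        printf("%u: %s (version %u)\n", name, interface, version);
    }

    static void handle_global_remove(void *data, struct wl_registry *registry,
                                     uint32_t name)
    {
        /* Globals can also disappear at runtime, e.g. an output being unplugged. */
    }

    static const struct wl_registry_listener registry_listener = {
        .global = handle_global,
        .global_remove = handle_global_remove,
    };

    int main(void)
    {
        struct wl_display *display = wl_display_connect(NULL);
        if (!display) {
            fprintf(stderr, "no Wayland display found\n");
            return 1;
        }

        struct wl_registry *registry = wl_display_get_registry(display);
        wl_registry_add_listener(registry, &registry_listener, NULL);

        /* Block until the compositor has announced all initial globals. */
        wl_display_roundtrip(display);

        wl_display_disconnect(display);
        return 0;
    }

Clients then bind only to the globals they actually need, which is why a missing extension degrades a single feature rather than the whole session.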

The not-yet-core extensions don't create fragmentation; there is actually decent cooperation among the 3-4 major compositor "backends" on everything, and ultimately they all settle on the same thing. Also, it's a bit generous to say that fragmentation is somehow the fault of Wayland, when it has always been a problem on Linux desktops.

Also, why do you think baking screen recording into Wayland would have been great? It is a complex problem, involving audio syncing, not-necessarily-display-related programs accessing streams, and the like, so it only seems display-related at a surface level. PipeWire is the right layer to handle it, and prebaking some API before PipeWire was ready would have been just stupid. It is/will be supported everywhere (there is a portal frontend already for GNOME, Sway, and I believe Plasma as well).


I'm pretty sure the goal is Linux with a single, tightly coupled official UI, with no extensions or themes, where the single point of configurability will be an accent color.

This would be part of a monolithic, identical system where different distros differ only in the default software installed and the logo, and the entire OS is read-only and not user-serviceable.

Look up "rethinking how we put systems together" by Lennart Poettering, and the comments from GNOME devs about disabling theming to improve brand awareness, or arguing about the folly of letting users muck up their work with the horror of extensions.


These aren't excuses, there are no other parties at play here. Weston was the first implementation. GNOME and KDE were the next implementations. They could have chosen to copy Weston's implementation, thus making those parts "standard" but they didn't. That's the way it played out. The way you're talking about it makes it sound like there was some outside force designing Wayland and convincing the other designers to exclude screen sharing, when it wasn't like that at all. You also seem to be suggesting that they could have designed everything in hindsight perfectly before the implementations even existed, I hope I don't need to point out why it doesn't work like that.

You're also talking as if Pipewire is some outside thing that was developed in reaction to Wayland, when as far as I know, the plan with those implementations was always to delegate some tasks to Pipewire. The fragmentation here is because it's taken a lot of effort to redesign these core components. Ideally this would all be done already, but it takes time.


> You're also talking as if Pipewire is some outside thing that was developed in reaction to Wayland, when as far as I know, the plan with those implementations was always to delegate some tasks to Pipewire.

Given that Wayland was first released in 2008 and the first commit to the PipeWire project was in 2015, I'm quite confident at this point that you're rewriting history to support your position.


You misunderstand, it was only Weston that was started in 2008, and it had screenshots back then. I'm talking about those other implementations, they didn't really stabilize and start aiming to have feature parity until a few years ago, and they decided not to copy Weston's screenshooting mechanism.


Weston being the reference implementation for the protocol, my point stands and you're now picking nits.


I don't see how I am, and I don't understand your point. The reference implementation had screenshots. The other implementors decided not to copy that and did their own thing. What more could the Weston developers have done? They can't force the other implementations to write code that they weren't interested in writing.


Actually standardize the protocol and make the feature part of the spec, instead of delegating the implementation to compositor extensions and effectively giving everyone permission to do their own thing.

As they should've done with the many other features that are missing from the base protocol because some designer somewhere decided it was "beyond the scope of the project".

We even have a pattern for this in the way HTML5 was developed.

I swear, it's like the Wayland folks were absolutely hell-bent on repeating the mistakes of the browser world circa 2000. The only question, now, is which project will end up the IE5 of the Wayland compositor world...


Any implementor always has permission to do their own thing, that's the point of making a second implementation. Putting something in a spec somewhere doesn't make it mandatory or guarantee it will be implemented. They could have put the Weston screenshot protocol that was created in 2008 in the spec, but they didn't do it, probably because the other implementors said it wasn't good enough and they didn't want to implement it. So what more could they have done? The mailing lists around that time had a lot of suggestions that went nowhere. Trying to put pressure on open source developers to implement something they don't want to do doesn't work, unless you are their boss paying them a salary.

I'm being serious here, I legitimately don't understand what you're pointing out. Yeah I too wish everything I was planning on 13 years ago turned out perfectly, things don't work like that though. And if you ask me, the thing that's most comparable to IE5 is the Xorg server.


> Putting something in a spec somewhere doesn't make it mandatory or guarantee it will be implemented. They could have put the Weston screenshot protocol that was created in 2008 in the spec, but they didn't do it, probably because the other implementors said it wasn't good enough and they didn't want to implement it.

Amazing how nothing is ever the fault of the people leading the Wayland project.

> Any implementor always has permission to do their own thing, that's the point of making a second implementation. Putting something in a spec somewhere doesn't make it mandatory or guarantee it will be implemented.

Ahh, now I've got it!

So what you're saying is that, in essence, since (as you claim) no one follows it, one must conclude that in fact there is no spec!

And given that everyone I've come across who's involved with Wayland has said "Wayland is just a protocol", and given protocols are defined by specs, if the spec doesn't exist, then neither does Wayland!

Neo would be proud.

> I'm being serious here, I legitimately don't understand what you're pointing out.

I can't think of anything that more succinctly describes what's wrong with how Wayland has been developed over the last 13 years.

Well, except there is no spec, so I guess nothing was developed at all? I dunno...


I mean, no, it's not the fault of the Weston developers that other implementors decided to do their own thing. I asked this before, but what could they have done? Putting tons and tons of things in the spec wouldn't really have fixed the real problem, which is that the way they wanted things didn't exist at that time, and the only way forward for them was to write their own implementation. The spec is only meaningful if you can get other people to promise to implement it in the way it's supposed to be implemented. It's true that Wayland is just a protocol, but that protocol is also defined significantly by its implementations.

> I can't think of anything that more succinctly describes what's wrong with how Wayland has been developed over the last 13 years.

I don't understand why and I wish you wouldn't do this, this is leaning into flame war territory. If you can explain your point to me in a way I understand, then I'm ready to listen.


> The spec is only meaningful if you can get other people to promise to implement it in the way it's supposed to be implemented.

The entire point of a spec is to help drive interoperable implementations. If that's not the goal, then it has no purpose and it might as well not exist.

The core of Wayland is supposedly a standardized protocol and these various projects seemed to do just fine implementing against that core spec. There's a reason I can run a Qt Wayland app on Mutter or vice versa.

Evolution and development of that spec can be done in a collaborative way that takes into account the various needs of those projects, such that the standard can evolve in a way that furthers the whole ecosystem.

That the Wayland folks instead throw up their hands and just say "write a compositor extension" demonstrates their unwillingness to do the actual hard work of building an ecosystem, which is creating consensus and driving adoption of common features.

Is this hard?

Yes.

What they are doing is hard, and it's deeply naive bordering on irresponsible to engage in a project of rebuilding the entire display server ecosystem without recognizing the need for coordination and diplomacy across the open source world.

I look at the history of this and all I can think is that this is a group of people who have failed to learn the lessons of the past. Groups like the X11 and HTML5 standards bodies, the IETF, and so many more have demonstrated how to build a consensus-oriented specification that enables and encourages interoperability. Yet, to hear you speak of this, that must be a figment of my imagination because apparently that's impossible.


Everything you're saying is... mostly what's been happening? It's not impossible to have consensus. You're missing my point, which is that people have been trying various solutions since 2008, and there just wasn't any consensus on this particular feature until a few years ago. There's no reason to put it in a spec if there's no consensus. It turned out that the consensus was to not put this in a Wayland protocol, and to do it somewhere else. So it would have probably been a mistake if someone had tried to force this through before then.

If you ask me, people only notice the cases where it takes a while to reach agreement. Just look at the PR we're commenting on: it took Nvidia several years to come around and implement dma-bufs. Sucks, but it happens. No one ever seems to pay attention to all the other areas over the years where there was consensus.


If that's true, and consensus is only starting to come together now, how is the Wayland ecosystem considered ready for mainstream usage?

From the perspective of someone happily using X11 at the moment, Wayland (or whatever your preferred term for "the loose association of compositors, protocols, extensions, and nonstandard hacks making up the Wayland ecosystem" is) looks like a failed attempt at building an ecosystem with proponents who are now trying to push it on everyone else in an effort to get the rest of the open-source community to solve the problems they created.

Every compositor is doing their own thing, application and framework developers need to implement basic functionality in one of several different ways depending on which DEs/compositors/WMs they want to support, some stuff has no replacement at all, and we're going to have to throw out the entire X11 world in exchange for... smooth DPI scaling and vsync? Really?

I honestly want to switch to Wayland - some of the stuff I've read about the X11 codebase is terrifying - but the cost of doing that, throwing out the entire desktop world, and giving up legitimate use-cases as "you shouldn't want to do that" is just too high, and the benefits are minimal. I'd honestly be happy to switch, but the whole ecosystem feels like it's a decade or two from being ready to go.

A lot of the hate Wayland gets stems, in my view, from the way it's been pushed on people. Users who aren't invested in the ecosystem and just see people pressuring them to switch to a loose collection of half-finished software that doesn't properly replace what they already have.


I completely disagree with everything in your comment. Wayland is an attempt by some developers to fix some longstanding issues with X11. They know what the new issues are, and there is active work being done to preserve backward compatibility and prevent things from breaking, e.g. XWayland. I've been using it for a few years with no issues. I think it was bad up until around 2017-2018; that's when the major implementations started stabilizing and when consensus really started happening.


> It's true that Wayland is just a protocol but that protocol is also defined significantly by its implementations.

A protocol should never, never, ever be defined in any way by its implementations. The entire purpose of a protocol is to abstract away the common interface such that it is entirely implementation-agnostic.

Indeed, you might say that a protocol prescribes exactly the intersection of all implementations.


>a protocol prescribes exactly the intersection of all implementations.

That's a better way to put it and that's more what I was getting at.


One of the major complaints thrown at X11 in the 90s and early 00s was the inconsistent mix of UI conventions and behaviours. GNOME and KDE were still at interesting-novelty status, you had OpenLook and Motif apps on the distribution CD with distinctive styles, and every so often you'd load a libXaw program where all the scrollbars were weird and you suddenly used the right mouse button in strange and exotic ways.

How did they manage to get through the project without addressing this point, and even make it worse by offloading stuff like screenshots, which used to be taken as a given, to the nonstandardized compositor layer?

I'd also have wanted to see much more of a "one true widget library", so that Wayland!GTK or Wayland!Qt are just thin wrappers on top, which would ensure you get any native theming or customizability/accessibility tweaks across all your software for free.


> Allowing arbitrary clients to globally capture keys at will is impossible to do without opening up a hole for keyloggers

https://www.x.org/wiki/Development/Documentation/Security/

The X.Org Foundation released 7.2.0 (aka X11R7.2) on February 15th, 2007.

The Wayland devs themselves say "It's entirely possible to incorporate the buffer exchange and update models that Wayland is built on into X." https://wayland.freedesktop.org/faq.html#heading_toc_j_5

They just didn't want to.


XACE is basically an extension of SELinux into userspace for X11. Given that today Fedora/RHEL are the only distros that enable SELinux out of the box, such an approach would have been doomed to failure (or doomed to provide a product differentiator for RHEL in the best case) - not to mention the sheer joy and excitement that debugging SELinux AVC denials would produce for end users and desktop application developers.

https://www.freedesktop.org/software/XDevConf/x-security-wal...

(Wayland's "buffer exchange and update models" won't provide security on their own without any improvement on the input side)


> Allowing arbitrary clients to globally capture keys at will is impossible to do without opening up a hole for keyloggers.

Is this feature standard on every OS in use by humankind today? Is this feature requested by most users, and does not having it piss off most users? Is this feature available on the system that Wayland aims to replace?


All of those questions are really irrelevant. The point with Wayland is to make something that's more secure than the systems it aims to replace, not to make exactly the same thing. If you want to help, maybe show a way this can be done without poking a huge security hole?

Regarding pissed off users, in my experience most computer users are used to Microsoft Windows and get pissed off when things don't work exactly like that, so unless you work for Microsoft then it's a lost cause trying to please them all.


I'm working with i3 (on X, obviously). Perhaps I'm deluded, but my experience is that users of this setup are happy with it and love the system they use.

I don't know of any way an Ethernet device can be truly secured; perhaps it should be disabled until a solution is found. I couldn't care less about a system that is more secure if it massively interferes with day-to-day usability.


There are plenty of ways to securely multiplex Ethernet devices. You don't give your web server read access to all TCP ports being used by other services, for example; you only let it open the HTTP ports. There are of course ways for debuggers to escape this and intercept all TCP packets sent by the kernel, but those require elevated privileges.


Sway is in top form and is basically just i3 on Wayland. Even the config is largely compatible, so do give it a try if you haven't (recently)!
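
For example, an i3 snippet like the one below carries over unchanged, with only the Wayland-specific parts (outputs, input devices) expressed in sway's own commands; the device and program names are just placeholders:

    # identical syntax to i3
    set $mod Mod4
    bindsym $mod+Return exec alacritty
    bindsym $mod+Shift+q kill
    floating_modifier $mod

    # sway-specific: replaces xrandr/xinput configuration
    output DP-1 resolution 2560x1440 scale 1.5
    input type:touchpad tap enabled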


I tried it not long ago. It's OK; it took me some time to overcome some issues, but I got it to work. Then, when I started actually working, I hit the same issues regarding screenshots and remote control/screen sharing, and decided to stay with i3 while I can.


If you don't have any users, you will soon find yourself without developers, so it is in fact quite relevant.


That seems like the reverse of the way it's always been on Linux, where there are only developers and no users...


> Maybe someone will figure out a good way to do this but I wouldn't hold my breath.

Android, iOS, and browsers these days have already figured out something for those kinds of things: they ask for permission.


The usual place those permission dialogs have gone is the flatpak portal, which is separate from Wayland. Someone would have to implement it there.


So I can't log my own keystrokes on Wayland? Anti-feature.


If you have root access to your own system, or read access to /dev/input, then it's trivial to deploy a keylogger. No need for Wayland to get involved there. You can't log keys through the Wayland socket because that socket is intended to be accessible to sandboxed programs.


So, to be clear, you want such a program to be implemented running as root, in a way that the user can't individually control? It's just another root daemon that the user may or may not be aware of.


craftinator wanted a way to log his/her own keyboard events.

Of course a custom extension can also be created for this use-case, in which case the compositor can allow or deny access to keypresses for specific windows. But to be honest, a central hotkey manager where apps can register new bindings (with compositor-managed resolution of conflicting bindings, instead of the ad hoc way it's done under X) would be a great target.


That's one way to do it; it's probably not the best way, but it would work if you were planning to sidestep Wayland security completely. You could make it user-controllable with D-Bus and then make a GUI to talk to it.


> Wayland, itself, is just a protocol

That is the root of the problems IMO. The thing is, this design decision has made the ecosystem a lot harder to grow than X11/Xorg's.


Maybe it has made it slower to grow, but harder? Now we have several dominant Wayland implementations, including wlroots, Mutter, and KWin (the latter is still getting there, from what I've heard). This is already 3x the number of dominant implementations of X11. Compositor developers can choose to use an existing implementation that suits their needs, or to develop one from scratch (that is compatible by implementing the same Wayland protocol). I think making Wayland just a protocol has made it easier to grow.


Dominant compositor implementations – the dominant wayland implementation is libwayland :) About the only serious alternative is the Rust one – the crate has a native mode. (but you still have to use the libwayland backend if you want to do GL because of how Mesa's EGL impl is tied to… yeah, lol, hard problems here :D)

But seriously, the "competition" between these compositors is awesome. This is like web standards again. Testing your client across different compositors reveals bugs (either in them, or in your client), and everything evolves together in standard ways. With the single Xorg server, everything got completely ossified; if anyone wanted to rewrite a full "production ready" X11 server, they'd have to be bug-compatible with Xorg.



