I love this kind of deep dive into customizing the software/OS on a device you own. Glad that "Tivoization" isn't a concern for the Steam Deck.
The most interesting part of the article was the mention of a /nix partition, as I didn't realize the Steam Deck supports nixpkgs. After researching it more, it does indeed (not installed by default, but at least it is possible without having to fork an entire OS to get it onto the device).
Not if your / filesystem lacks the /nix directory to use as a mountpoint, and happens to be a read-only image. Something that needs a workaround on macOS.
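On the Deck the usual workaround is the same idea as on macOS: keep the store on a writable volume and mount it over /nix. A rough, untested sketch (steamos-readonly and the /home/nix location are assumptions about a stock SteamOS 3 install, and an OS update can wipe the /nix mountpoint again):

    # temporarily unlock the root image so the mountpoint can be created
    sudo steamos-readonly disable
    sudo mkdir -p /nix
    sudo steamos-readonly enable

    # keep the actual store on the writable /home partition and bind-mount it
    sudo mkdir -p /home/nix
    sudo mount --bind /home/nix /nix

    # persist across reboots (assumes /etc stays writable, which it is on SteamOS)
    echo '/home/nix /nix none bind 0 0' | sudo tee -a /etc/fstab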
What a thorough and interesting post. I would personally never do something like this. The most tinkering I've ever done with Linux was in my Raspberry Pi era, and that's 1% of this at most. So props to the author.
I was in a similar situation to the author: for quite a while I had to build my own Red Hat kernel for a very obscure case: bypassing the RMRR check so I could pass a GPU through to a Windows VM (similar to https://github.com/kiler129/relax-intel-rmrr ; not my repo).
The root issue can only be addressed by ROM updates from the manufacturer, but I'm running an old DL360 that's no longer supported by HPE.
The patch itself is only a one-line change, but updating the kernel is a pain since I have to (rough sketch after the list):
- get SRPM (there's no git repo)
- unpack SRPM, apply patch
- rebuild and install
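Roughly this dance every time a kernel update lands (sketch from memory; the dnf source-download plugin and the patch file name are placeholders):

    # grab the source RPM (no public git repo to pull from)
    dnf download --source kernel

    # unpack it into ~/rpmbuild and drop the one-line patch next to the other sources
    rpm -ivh kernel-*.src.rpm
    cp bypass-rmrr.patch ~/rpmbuild/SOURCES/
    # ...then add a PatchNNNN / %patch entry for it in ~/rpmbuild/SPECS/kernel.spec

    # rebuild and install the resulting packages
    rpmbuild -bb ~/rpmbuild/SPECS/kernel.spec
    sudo dnf install ~/rpmbuild/RPMS/x86_64/kernel-*.rpm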
There already are distributions based around elements of SteamOS, geared towards PCs and controller-based usage. ChimeraOS works for me quite flawlessly, including Steam Deck add-ons, like EmuDeck.
I actually just ordered a GPU for my unRaid NAS server just to be able to do Steam Headless via a nice docker image(1) and then use Moonlight (for example) as a client on my Windows laptop. If it works, it's much better than buying yet another piece of desktop hardware just to play games when my NAS is just sitting there idle most of the time. Just need to make sure I keep the power level setting on the Nvidia card to idle when not in use (hopefully a nvidia-persistenced call will do it).
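For the idle-power bit, the plan is basically this (wattage is a placeholder; I still need to verify it actually keeps the card in a low power state when no container is using it):

    # keep the driver initialized so the card can sit in its low-power state
    sudo nvidia-smi -pm 1      # persistence mode (running nvidia-persistenced as a service also works)

    # optionally cap the board power limit while nothing is streaming
    sudo nvidia-smi -pl 100    # watts; placeholder, check your card's supported range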
I spent some (too much) time trying to get pretty much the same thing running using GOW [1]. It was quite a bit harder than I thought, requiring an HDMI dummy plug to get the X server config right, etc.
Oh nice. I've been day dreaming of setting up a server with turn based, hot seat enabled games (like Civilization) and a browser based way to remote into them so that friends and I can play long turn games from anywhere at any time.
Interesting read! The A/B upgrade sounds a bit overkill, you can always just pop up a live distro or install a recovery system (on an old version) in a partition in case something goes wrong.
I recently moved to Arch after a few years of NixOS (preceded by years of Arch) and I think the fears of the author are misplaced.
Arch is definitely a very serious and mature distro and I'd trust them more than Valve.
The quality of the packages available for Arch is what made me move from NixOS.
The main repos are updated really fast and AUR has a lot of useful packages.
> The A/B upgrade sounds a bit overkill, you can always just pop up a live distro or install a recovery system (on an old version) in a partition in case something goes wrong.
You and I can, the overwhelming majority of computer users cannot. Valve clearly focuses on building for the average person, something that Linux distributions (as much as I love them) still don’t really do (well).
The system automatically recovering from a failed upgrade is essential in a low-maintenance OS at this point.
> The A/B upgrade sounds a bit overkill, you can always just pop up a live distro or install a recovery system (on an old version) in a partition in case something goes wrong.
Could, sure, but we have the technology to make it unnecessary and disk isn't that expensive, so why not?
I recently got my hands on a gaming handheld (the Legion Go) and have used it to get more exposure to Linux. I'd historically avoided it, because it seemed like a perpetual tinker timesink with limited compatibility with things I'd actually want to use. Reading about immutable filesystems and how traditional Linux gives root willy-nilly to all sorts of random software piqued my curiosity.
I'm using NixOS, which can indeed be a tinker timesink, but is good for exploration. You can easily try different components, and then completely remove them (aside from some ~/.config pollution) if you don't want to keep them. It's also trivial to patch things before you install them (such as adding some kernel patches to make Linux usable on esoteric hardware like a gaming handheld).
There's a NixOS community called Jovian that's reconstructing Valve's random SteamOS tarballs into tagged commits on GitHub, which you can browse as if you were a Valve employee. They've made it so you can install your own copy of SteamOS atop NixOS by adding a few lines to your Nix configuration. They're clearly Linux experts, and you can see from the source that you're getting Valve's packages unadulterated, save for simple adaptations like introspecting instead of hardcoding the power button location.
So, if you want a pure SteamOS experience without hosting your own mirror of Valve's update system (or if you want to be able to browse Valve's source without downloading a 3GB tarball), give Jovian a try.
> Bazzite is an OCI image that serves as an alternative operating system for the Steam Deck, and a ready-to-game SteamOS-like for desktop computers, living room home theater PCs, and numerous other handheld PCs.
Worth visiting the readme even if you're not interested. There's a huge list of included stuff, and a lot of it seems really cool and helpful (for gamers or streamers mostly).
Bazzite (and Immutable Linux as a whole) is fascinating.
I'm not deep enough in their weeds to perfectly explain it in a concise HN comment, but it's all about having a read-only known-good Linux distro at the root and then layering packages on top, taking much inspiration from server-side containers. It's supposed to be both more secure and more reliable/reproducible/customizable than traditional Linux. You just write in a container manifest which packages you want. When an upgrade comes out, it runs the upgrade, then reinstalls your packages on top.
Even more relevant is that you can "fork" Bazzite relatively simply, add any missing packages or configuration you need to your own custom image, and let GitHub Actions do most of the infra work for you.
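The whole customization more or less boils down to a Containerfile that CI builds for you. A minimal sketch (image tag and package names are placeholders; the ublue-os template repos wire up the GitHub Actions side):

    # layer your extra packages on top of the stock Bazzite image
    cat > Containerfile <<'EOF'
    FROM ghcr.io/ublue-os/bazzite:stable
    RUN rpm-ostree install htop syncthing
    EOF

    # build locally to test; in practice GitHub Actions builds and pushes it for you
    podman build -t my-bazzite .

Then you point the machine at your image with rpm-ostree rebase and every subsequent upgrade pulls your build instead of the stock one.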
2023 was the first year I gamed exclusively on Linux according to Steam's year in review, including some of this year's titles. Most of that was on the Steam Deck or on a virtual machine with GPU passthrough running Bazzite. It is really well made.
Valve might need some not-yet-upstreamed kernel features for the Steam Deck, but what is the upstream kernel missing otherwise for gaming? I use it without any issues.
As far as I know they also prefer to upstream things in general. I think AMD's amd-pstate / amd-pstate-epp and related work was kicked off because of the Steam Deck, but it all went upstream.
> No. Not even questionable. If you have an NVIDIA GPU, You're on your own. Latest Valve updates for Steam client including normal and Jupiter bootstraps have broken gamepadui on NVIDIA GPUs, and if so, no support will be provided for you.
Bummer. This rules out 76% of steam users, according to their hardware surveys.
NVidia on Linux is an unholy mess, and always has been (at the very least since 2004, which is my earliest memory of fighting it). It's true even on NVidia's own SoC (Jetson).
It almost feels like they're trying hard to make the experience worse for everyone: users, OS developers, app developers, hardware developers... I don't know what to make of it. If you want NVidia, you should pick an OS other than Linux (I've heard FreeBSD actually works fine); if you want Linux, pick a GPU other than NVidia.
When projects like Sway only enable NVidia support behind a flag named --my-next-gpu-wont-be-nvidia, I have sincere doubts that a different gift wrapping will change anything.
That description is pretty hyperbolic. The SteamOS UI (e.g. the Steam Deck-looking part) is very broken on Nvidia right now, but the actual gaming part (e.g. Proton and the Steam launcher) works fine. If you just want to play mouse-and-keyboard in desktop mode, recent Nvidia cards are generally pretty cooperative.
Well, I'm not smart enough to know if it's hyperbolic but it's a pretty damning statement right there in the README. Certainly enough to turn me away from ever trying it on one of my machines.
Tangential: anyone have experience with Unity and/or Unreal on Linux these days? Last I checked (2-3 years ago), they technically worked but were janky and buggy. Is it improved?
Unity is only a little more janky and buggy than it is on Windows.
I had a lot of trouble getting the unity editor working on my steam deck, but that may have been due to using an editor version from 2021 (for unrelated reasons). It seems to behave fine on a normal desktop environment though.
Unity has been quite solid for me on Linux lately. It’s mainly just minor annoyances like the project settings window being too small when you open it so you have to resize it, little stuff like that. Nothing that has prevented me from getting the job done. I still prefer to use it on Linux because the glitches annoy me less than, well, using windows.
Unreal works okay for me, but I’ve had to submit a few patches upstream to work out some Wayland issues. Other than that, it’s about as bloated/buggy/slow as it is on windows. Most of the time if I think there’s some Linux-specific issue I’ll open the same project on windows only to discover it was the same.
Quite good actually, when you use the Forward+ renderer (Vulkan) and do the little things properly (using MultiMeshInstances or shaders heavily instead of tons of direct meshes, using occlusion culling properly, making sure you have multi-threading enabled, running physics on a separate thread, etc.). I'm even running with full dynamic global illumination and loving it.
I'm not trying to prematurely optimize, but my game worlds are very large (I have a working to-scale Earth, for example), so I've been watching lots of GDC and other talks from AAA titles on how they do their rendering pipelines, and my conclusion is that shaders are really the powerhouse of gamedev these days; anyone interested would do themselves a favor to start learning the quirks of their engine's shader language now. (Oops, forgot to mention: to get good large-world behavior, compiling with double precision is a must! https://godotengine.org/article/emulating-double-precision-g...)
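(For reference, the double-precision part is just a build flag when you compile the editor and export templates yourself; SCons invocation from memory, so double-check against the docs:)

    # Godot 4 built with 64-bit floats for large worlds
    scons platform=linuxbsd target=editor precision=double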
I was totally blown away by how good Proton is in the post Steam Deck world. I now play Steam games on my Linux laptop almost daily because they “just work” even when the only listed supported platform is Windows
After hearing people be ecstatic, I thought I’d go full-in on Linux gaming. I have a pretty bog-standard gaming PC that is very Linux-compatible (Intel i5 + Radeon 6800XT) and on there Apex Legends has horrid frame pacing issues, Mirror’s Edge doesn’t work with wireless Xbox controllers. You lose out on a lot of GPU suite features that Windows has. Gnome doesn’t support VRR. Etc etc.
There’s so many small issues it’s held me back from deleting my Windows partition. Maybe in a year or two?
That said, these things work flawlessly on the Deck.
Yeah, except I prefer the cleanliness of Gnome over how scattershot and buggy KDE feels, so I’m SoL. I’ve even looked into launching games into their own little Gamescope instance, but if you don’t run Gamescope as your main window manager, you lose most of its benefits.
> For Mirror's Edge, were you using Steam Input?
Yes. The problem lies in the fact that only the Xone driver properly supports the Xbox wireless adapter, but it doesn’t play nice with Mirror’s Edge. Xpad and XpadNeo do work, but those require USB or Bluetooth.
And me having to tweak a million things tells why gaming on Linux still sucks, aside from Deck’s blessed config. I don’t want to deal with a thousand papercuts, I want to boot my system and play. Windows is still closer to that experience than Linux.
> Yeah, except I prefer the cleanliness of Gnome over how scattershot and buggy KDE feels
But if it's the difference between gaming working or not for you, wouldn't you rather use it? Surely you barely interact with it anyway while gaming, only to get into Steam?
If this is a machine you use for something else too, you could just have a gaming user that logs in to KDE and your normal user that uses Gnome?
> Surely you barely interact with it anyway while gaming, only to get into Steam?
I'm a Linux desktop user and I drop into a game once in a while while I'm waiting for another meeting or waiting for a build to finish or whatever. My work desktop doesn't use VRR (the just-for-games PC uses Windows), otherwise I'd be in the same boat as 'jorvi because it quite matters to me that games on my desktop integrate into everything else at a passable level. For me, GNOME does a better job of integrating my different activities than KDE (which wasn't always the case! I was a KDE3 user for a long time!), so I use GNOME. And it remains an unsolved pain in the ass that the Linux desktop experience isn't coherent enough to mean that we should only be thinking about desktop environments if we want to.
Coherent, holistic switching between tasks is a thing that people are allowed to want and attempting to convince people that they don't is a bad look.
> If this is a machine you use for something else too, you could just have a gaming user that logs in to KDE and your normal user that uses Gnome?
This is a really sad observation on the state of the Linux desktop. Still.
>> If this is a machine you use for something else too, you could just have a gaming user that logs in to KDE and your normal user that uses Gnome?
> This is a really sad observation on the state of the Linux desktop. Still.
It seems like a somewhat odd observation, is it really necessary to have another user to do this? I can easily switch between Gnome, i3, and Sway on my system, I mean that’s going between X and Wayland, no issues… maybe KDE and Gnome have some specific incompatibility though? Odd.
Anyway, at least there’s a workaround. If Gnome is a hard requirement, how is Windows even a candidate?
> maybe KDE and Gnome have some specific incompatibility though
It's a layer down from the DE itself: the window manager beneath it. GNOME ships Mutter and KDE ships KWin. GNOME is pretty tightly tied to Mutter; KDE is less tied to KWin, but KWin tends to support shinier features than Mutter anyway, so I don't know why you wouldn't use it.
> It seems like a somewhat odd observation, is it really necessary to have another user to do this?
Strictly no, but having to have another login session, period, is bonkers to me. It's reasonable to respond to that suggestion with incredulity.
> If Gnome is a hard requirement, how is Windows even a candidate?
For me, it's not. At the moment it's inertia, because Windows has legit become the best Linux dev environment I know of with WSL2. I originally switched back to a Linux desktop because I was working on some hardware stuff that benefited from being on a Linux platform, but I'm certainly not tied to it past that.
> It seems like a somewhat odd observation, is it really necessary to have another user to do this? I can easily switch between Gnome, i3, and Sway on my system
Ok, sure, it was just the first solution that came to mind. On mine logging in launches straight into sway. I think only in the first session (to allow recovery in case of some related issue) so I suppose I could switch tty and then manually launch whatever DE.
But to me I think the odd observation is that it doesn't need to be a different user if we're talking about still having to log in again anyway.
Unless you mean some kind of session saving, swaymsg exit, and then launch the other one? But then you have to maintain whatever session saving (probably different in each) solution and what have you really gained.
Honestly it's the reverse for me, but I guess that's down to personal preference. "Gnome" apps keep updating with the "new" GTK style, which means the title bar becomes a conglomeration of a bunch of weird controls, the familiar dropdown menus vanish, everything gets moved into a tiny little hamburger menu and, often, the layout breaks in subtle ways.
The calculator app just recently did this, and now I have to type and enter one line of numbers before the text control realizes it's too small and resizes itself. That first line of numbers is nearly invisible. Happens again every time it's opened.
I'm not sure who decided that desktop apps need to look and feel like touchscreen-first mobile apps, but I don't particularly like it. KDE still feels like a desktop environment, so it's my strong preference. I'll put up with a very slightly less polished experience if it means stuff stops rearranging itself just for the sake of change every couple of weeks.
(Aside from KDE, Cinnamon is pretty solid and less feature packed, maybe give it a whirl?)
Hamburger menus are among my greatest gripes with GNOME. In apps with any functionality at all they end up being poorly organized junk drawers filled with odds and ends, and because they have to be somewhat short to be effective, functions that don’t fit in them either get buried or cut.
What makes this all worse is that GNOME has acres of space reserved at the top of the screen with its statusbar, most of which is empty and doing absolutely nothing. It could house a macOS-style global menubar (as Unity did for fullscreened windows) with room to spare… Though global menubars aren’t everybody’s cup of tea I think many would agree they’re better than the alternative of oversimplified hamburger menus, and they would help achieve the clean look GNOME is going for without so dramatically impeding functionality.
> The calculator app just recently did this, and now I have to type and enter one line of numbers before the text control realizes it's too small and resizes itself. That first line of numbers is nearly invisible. Happens again every time it's opened.
OT, but I recently started using a Python REPL as a calculator, leaving it open full time in a window. It's pretty great. Haven't touched an actual calculator, or a calculator app, in weeks.
>Yeah, except I prefer the cleanliness of Gnome over how scattershot and buggy KDE feels, so I’m SoL.
You know that Linux distros are multi-user/multi-seat systems, right? You can perfectly well use Gnome as your default desktop and live-switch to a dedicated gaming user with a KDE Plasma desktop that is only used to launch games.
KDE Plasma shouldn't be buggy if it's only used as a game launcher, and you can disable Baloo file indexing if you want to keep KDE's memory usage to a minimum.
Windows users always find a reason not to switch to Linux because of some missing feature. In two years? There will be another new feature or game on Windows. I remember people insisting on using Windows because it supported their 3D shutter glasses or their Nvidia card.
Either you want to use Linux or you don't :)
Why are many features initially only available on Windows?
First: that is wrong. Important features like cgroups, namespaces and containers/Flatpak were first developed on Linux.
Second: MBAs only look at past numbers, so Windows often gets traditional Windows stuff first. You can guess the rest: innovative companies care about what will be possible in the future. Valve, for example.
That MBA-style thinking also lives in many consumers. Still buying Nvidia because they were faster on the spec sheet? I prefer the cards which work well with Linux, so AMD or Intel. Frames actually rendered are worth more than problems with proprietary drivers.
PS: Linux has maybe won the war over drivers. It seems like Nvidia is slowly opening most of their stuff, and features are landing in the nouveau module and Mesa. A decade too late. I'm already on Team AMD ;)
You're using the phrase "MBA thinking" to mean "making decisions based on your personal use case and identifying solutions which match".
I'm not sure how this is a bad thing. I don't run Linux to run games because Windows is a better supported platform for running games. I'm not "looking at past numbers", I'm looking at the situation in front of me as it exists, setting aside my personal feeling on what might have been and instead focusing on what actually exists, today, for the problem I am looking to solve today.
Why don’t people like Linux? Because it takes 8 bloody commands to do something as simple as add a new drive whereas Windows you can just open disk utility and format. I bounced off Linux a few weeks ago over this. It’s for people who want to tinker more than actually use the system.
That's untrue though. Linux has a disk utility (I use gparted personally). And you can surely do it on the command line in a single command.
On Linux you could automate that task. How would you propose automating "open disk utility and click a few buttons" on Windows?
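A sketch of what that automation could look like (the device name is a placeholder; double-check it before running anything destructive):

    DEV=/dev/sdX   # placeholder: the new drive

    # one GPT partition spanning the disk, formatted ext4, mounted at /mnt/data
    sudo parted -s "$DEV" mklabel gpt mkpart data ext4 1MiB 100%
    sudo mkfs.ext4 "${DEV}1"
    sudo mkdir -p /mnt/data
    sudo mount "${DEV}1" /mnt/data

    # optional: mount it on every boot
    echo "UUID=$(sudo blkid -s UUID -o value ${DEV}1) /mnt/data ext4 defaults 0 2" | sudo tee -a /etc/fstab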
This is less of a "Linux can't" issue and more of a "I quickly know how to do it on Windows after years of experience and I don't know how to do it on Linux." Linux not being identical to Windows isn't a flaw. No one blames you for not wanting to relearn, but pretending like Linux is bad because your Windows muscle memory doesn't apply is nonsense.
When I Googled how to complete this task I came across multiple results, all of which suggested using a string of CLI commands. GParted was suggested in some of the results but it wasn't installed by default on the distro I was recommended (Lubuntu), so I had to punch in even more commands to get it installed. Then after creating the partition it was still unusable until I mounted the drive (which wasn't clear until after Googling why I couldn't use it). Mounting required yet more commands. I did a cursory glance at the GUI buttons in GParted and didn't see a simple mounting option. If you can't mount in GParted, then my claim still stands that it's much more effort, and more obscure, than Windows, which automatically "mounts" the drive, so to speak, when you create the partition.
> GParted was suggested in some of the results but it wasn’t installed by default on the distro I was recommended (Lubuntu
You got a less-than-stellar recommendation given your desire for Windows-like ease of use. Lubuntu is a more niche distro aimed at lower resource usage, at the expense of the ease of use you are looking for.
If you had installed KDE, you'd likely have explored the start menu and found gparted or typed 'disk' into search and found gparted.
I swear the biggest problem with linux is the nerds pushing newbies towards esoteric garbage distros instead of established and widely supported ones like straight up Ubuntu with Gnome.
The biggest problem with linux is definitely finding the right distro. Ubuntu is awful. With their move towards "snaps everything" it just keeps getting worse. Canonical is basically Linux's Microsoft. A lot of the "established" stuff on Ubuntu is just duct tape that actually makes it worse overall. Ubuntu might have an easier getting started experience, but it's not a good long term experience.
They're generally worse than debs, and worse than alternative "all in one" package formats like Flatpak or AppImage. And being "not packaged by the original devs" leads to situations where people hit problems and raise bug reports with the wrong people; beginners don't know that the snap is packaged entirely by Canonical, not Valve, and it just results in a poorer experience.
There are still other things that are absurd to do under Linux, like turning off write caching for removable drives. Unless something has changed in the past couple of years, you need to either manually edit fstab per drive or set up udev rules.
(Not write-caching removable drives should just be the default. Windows hasn't used write caching on removable drives for 20 years. It also presents a toggle in the drive properties if you really do want it on, though.)
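For anyone hitting this, the per-drive fstab route is roughly one line (label, mountpoint and filesystem are placeholders; vfat's flush option pushes writes out promptly, and sync is the blunter choice for other filesystems):

    # make a FAT-formatted stick labelled USBSTICK mount without lazy write-back
    echo 'LABEL=USBSTICK /mnt/usb vfat noauto,user,flush 0 0' | sudo tee -a /etc/fstab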
On Gnome, you open the disk manager. You click format.
In fact a lot of things are easier. On Windows, you need a third-party tool to install an ISO onto a disk. On Gnome, you open the disk manager, right-click the disk, and click restore from image.
I was recommended a distro that doesn’t use Gnome (Lubuntu). The system I was working with is very old and some light research made it seem like Gnome is pretty resource heavy.
How old are we talking about? 10+ years ago I was running Gnome 3 on decent hardware of the time, and everything was snappy[0]. OS software has only gotten faster since then, so everything is still snappy on that thing despite the hardware now being old. Similarly, that laptop came with Windows 7 and that was snappy, and the Windows 2021 LTSC on it is also snappy[0].
0: I care about responsiveness, so I've always disabled animations on every device, so I have no experience if some animations can run at 60fps on some hardware and 30fps on others.
Looking back, that may have been my switching point: when setting up my distro of choice took less time than Windows after a fresh install. Life without package managers, even now that there is Chocolatey, is just unnecessary pain. And as a DE, Windows had no edge over something like KDE.
An operating system without a package manager is not maintainable.
It's hard to install, update or remove software; validating the system isn't possible; and that's aside from the missing security.
On Windows, changing a setting has become a horrible task for me. Where is it? Why doesn't the search find it? Should I try to find the Win32 dialog from Win2k, or is this setting the same in WinUI 3? Why do I have to sit in front of an installer wizard and "click through it"?
On Steam I can do a file check of a game and it verifies its integrity. My wish: all package managers on Linux should provide that :)
>Windows users always find a reason not to switch to Linux because of some missing feature.
Because the OS is a tool, not a religious/political statement.
Therefore I'll use it if it works the way I need it and it solves my problem, or not use it if it doesn't work the way I want it and ends up creating more problems for me than it solves. Simple.
>First: that is wrong. Important features like cgroups, namespaces and containers/Flatpak were first developed on Linux.
I get your overall point, but the first "process containers" code that later became cgroups was merged to the kernel in 2007. Windows came out with the Job Objects API in Windows 2000 (NT 5.0) in 2000.
IMO, the Job Objects API was not really suitable to use in production settings; it had many weird edge cases, so although it looked similar to cgroups it often broke in strange and unpredictable ways.
Steam Input is rapidly becoming the Google Play Services of the desktop linux world. On Steam Deck for a long time you couldn't even use the touchpads without the Steam client running.
Did you try out 'gamescope'? This is something you find on the Deck but not 'for free' with Steam on other Linux.
I find it helps with pacing. It also supports VRR with a commandline argument, '--adaptive-sync'.
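E.g. as a per-game launch option in Steam, something like this (the numbers are just my monitor; flags from memory, check gamescope --help):

    gamescope -W 2560 -H 1440 -r 144 --adaptive-sync -- %command%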
VRR may need support in the environment to work, I'm not sure. Sway/wlroots does it fine. Presumably KDE does/can too since that's what the Deck uses in 'desktop' mode (otherwise, gamescope).
edit: I see in another post - you have! Agreed on KDE being scattershot. I hope the Gnome people clear things up for you. I wouldn't go so far as to suggest i3/Sway, even though I'm happy with them
Could you provide details on how you got gamescope working with sway? What is the full command line you used? I believe I ran into problems with it conflicting with XWayland or something like that.
Cannot relate much. My 5800x3D and 6800XT deliver an outstanding Linux gaming experience. I don't play EA games, though. I do play some fast paced shooters that don't need VRR since you can manually cap fps to your liking.
Also, it was my understanding that gnome has support for adaptive sync.
May I ask what driver features are you missing? I only want some decent fan control instead of relying on random scripts off github. AMD has to release some sort of GUI panel for sure.
> I only want some decent fan control instead of relying on random scripts off github. AMD has to release some sort of GUI panel for sure.
Have you tried CoreCtrl [0]?
> My 5800x3D and 6800XT deliver an outstanding Linux gaming experience.
I have a 7900XTX and performance under Linux has been at least on par with Windows, sometimes better (though not by much).
> May I ask what driver features are you missing?
I'm not GP but I'd love to see frame gen and stuff like anti-lag and upscaling integrated into amdgpu with some sort of official way of setting it (though looking at Adrenaline it might actually be best if it's left up to the community to create the GUIs).
Similar specs but run Windows here, part of the reason being that I noticed that the ray tracing performance is just awful on Linux compared to Windows. I found I get slightly better framerates in most games in Linux, but anything that uses raytracing goes from "just about usable with FSR" on Windows to "totally unplayable" on Linux.
I'm told it's better in Mesa 23.3 though, haven't tested.
> After hearing people be ecstatic, I thought I’d go full-in on Linux gaming. I have a pretty bog-standard gaming PC that is very Linux-compatible (Intel i5 + Radeon 6800XT) and on there Apex Legends has horrid frame pacing issues
Apex Legends runs flawlessly for me, but only on KDE/X11 with Nvidia Reflex enabled[0].
If you are on Radeon though, I bet the problem is your window manager. I have the frame pacing issues on:
- hyprland/wayland (even with no_direct_scanout = true; and floating game windows)
- KDE/wayland
I also had a weird issue using gamescope as my DM where apex got resized into a tiny frame in the top left that was like 200 pixels or so wide.
> That said, these things work flawlessly on the Deck.
Likely due to running into these graphics driver -> WM and similar compatibility issues and fixing them. The other performance improvements from kernel changes probably don't hurt either.
I’m not sure why people are trying to convince you; Linux is free so there really isn’t any benefit to us Linux users or to the Linux developers if you switch…
Valve should be the only one that is worried about your opinion here. I think they develop SteamOS as a backup plan, though, in case Microsoft ever starts to take their own App Store seriously.
That is surely part of the consideration, but certainly not all. Some engineers at Valve (especially the head honcho Gabe Newell) are legit Linux people (Debian IIRC). They believe in it, and I love them for it
I don't dispute your claims, but I remember very clearly that back then it seemed obvious that SteamOS was a response to the Microsoft Store and a fear that Microsoft would mandate that all software on Windows come from the Microsoft Store.
While that was obviously speculation, at least the dates match up (October 26, 2012 for the Microsoft Store launch and December 13, 2013 for the SteamOS launch, according to Wikipedia).
Agree, based on my memory. I think the Microsoft Store threat is what finally tipped the scale. It took it from "we kind of support Linux because we like it" to "we support Linux because it's important business insurance for us in case Microsoft goes Apple (or Xbox or whatever example you want) and monopolizes app distribution on Windows".
And it left them well-positioned for the steam deck, I wonder if they were thinking about that when they started steamOS, or if it is just an example of the natural advantage that openness gives you.
Anyway, agree—I wasn’t trying to belittle Valve’s motivations, just wanted to include a thought about why they seem to be happy serving both platforms.
Yep. DRM’d online stuff and VR mainstays (Beat Saber, primarily) are the two sets of games that are keeping me tethered to Windows at the moment. VR games can be played via a Windows VM with GPU passthrough but for DRM’d online games you don’t really have any other option, at least if you don’t want to get banned.
I also run a Windows VM for gaming. One thing to note is that some games have (robust!) VM detection checks on launch, so you can’t even run them in the first place. Valorant is one example.
What do you use for the VM? Last time I checked, I couldn't find any free/FOSS VM tooling that allows me to do GPU pass-through on a Linux Host to Windows Guest.
It seems like you haven't looked into it much, since it's been feasible for the last 7-8 years.
Linux hosts had GPU passthrough working well before commercial software had such options. Nowadays it just works out of the box with virt-manager, which runs QEMU under KVM.
It's been working for years for 99.9% of games, excluding some invasive anti-cheats that ban you for VMs; there are literally only a few games that have issues with virtualization.
I use Proxmox and GPU passthrough works just fine (via QEMU). Note that Nvidia GPUs have fewer issues with passthrough, at least last I checked. See this guide: https://pve.proxmox.com/wiki/PCI_Passthrough
But if you're running a standard distro, there are guides for most of them.
Last thing to note is that your motherboard can make the process easier if it has good IOMMU support. Basically, you want a board that puts your PCI slot in a separate IOMMU group. You can find examples by searching for "(MB name) IOMMU groups".
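You can also just check the machine you already have; the usual sysfs walk looks like this:

    # list every IOMMU group and the PCI devices in it
    for g in /sys/kernel/iommu_groups/*; do
        echo "IOMMU group ${g##*/}:"
        for d in "$g"/devices/*; do
            echo "    $(lspci -nns "${d##*/}")"
        done
    done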
Do you mean the presence of anti-cheat software makes them anti-player? Because I’d disagree. It’s a lot of work and expense to combat cheats, but is very much appreciated by many players (when it works)
Or they could just not trust the clients, instead of throwing the problem over the wall. In a lot of these games with fancy anti-cheat protection, the cheat tools basically just tell the server "spawn me a vehicle right here" and the server just does it. Garbage.
> In a lot of these games with fancy anti-cheat protection, the cheat tools basically just tell the server "spawn me a vehicle right here" and the server just does it.
Citation needed. I'd be quite surprised if it were common for servers of professional games to trust the client in that sense (i.e. allowing it to decide game logic like what gets spawned where).
As far as I'm aware the most common types of multiplayer cheats are
* wall hacks, which you could probably prevent by not sending the client any information about objects that the player can't see, but that would require the server to calculate the line of sight for every player/object,
* and aim bots, which I don't think you could prevent at all on the server side since they don't rely on the bot having access to any information that the player isn't supposed to have. They just rely on the bot being better at aiming. I suppose if you did all rendering server side and only sent the rendered graphics to the client (i.e. streaming), that would make it harder for the bot because it'd now have to do image recognition to find the target, but that just makes it harder, not impossible. Plus, game streaming wasn't well received for a reason and anyway, I don't think that's what you had in mind when you talked about "not trusting the client".
Look up BF2. Cheat tools would just disable limits locally on ammo requests, vehicle requests, artillery strikes, and so on. The server didn't check anything. It had fancy anti-cheat tech, which was bypassed by just writing and restoring executable memory changes faster than the anti-cheat could detect them.
Things are certainly not always as professional as they appear to be.
A visibility test is definitely feasible against wallhacks; it's not that expensive.
Aimbot is an assist cheat, which technically does not violate the physical rules of the game, so you are right that it's more difficult to detect. One solution to detect this class of cheating is to record the player's movement, and rely on a combination of outlier scores and outlier movement behavior to detect abuse. It's not watertight, but neither are any of these client side anti-cheat detection schemes.
Wow, it’s awesome you’ve solved the entirety of multiplayer gaming. Here I was thinking anti-cheating measures was a complex topic but it’s great you’ve elucidated me.
"just" is (tongue in cheek) a forbidden word in HN. Next thing you might find yourself claiming is that Dropbox is a worthless idea because it's "just" FTP.
Btw, tell me exactly how an aimbot that takes the visuals from the player's screen and tilts the player's cursor ever so (or not so) slightly towards identified moving targets is to be prevented from the server side. Modern cheating is already a hard-ass problem to solve, much more so if no client-level monitoring is desired.
> Btw, tell me exactly how an aimbot that takes the visuals from the player's screen and tilts the player's cursor ever so (or not so) slightly towards identified moving targets is to be prevented from the server side. Modern cheating is already a hard-ass problem to solve, much more so if no client-level monitoring is desired.
The very same way that you'd do it on the client. If I run an aimbot on an nvidia jetson devkit, using HDMI in to get the screen image and USB emulation to send inputs, your anticheat has to do the same work regardless if it's on the client or the server.
I think that makes sense; but doing it on the client means that your computer has to do the work for you, thus distributing the load among all clients. Doing it on the server would mean that their machine has to do the work for all players.
If we complain about companies being too quick closing up their servers when games are not as successful as they hoped... imagine if those servers were x10 or more expensive, due to that kind of analysis for all players. Companies would be much quicker to pull the plug, I guess.
Dropbox is a worthless idea (long term) because it's not running on my own server. :')
And exactly. You cannot detect that with client-side anti-cheat nonsense either. Record on the HDMI and output a fake USB mouse, why not? Botting doesn't break the physical rules of the game, so you're right that it's hard to detect. One "solution" is to record player movement on the server and detect outliers in behavior and scores. Not perfect (and also very difficult), but just as unreliable as client-side anti cheat nonsense.
it's anti-player when it is security theatre, which in 95% of cases it is.
When I start a game and I see an Easy Anti-Cheat banner I think to myself "Great now I can be killed by an aimbot while simultaneously hosting a root-kit voluntarily."
Why do you think these systems are advertised like that, at the forefront of the game load? It's so that the developers create a false trust in the playerbase that they're doing their damnedest to prevent cheaters, when the reality is that they paid a small amount of cash to a third party to use a system that does a piss-poor job at everything aside from being a symbol of effort and adding incompatibilities where there shouldn't be.
EAC bypassing is trivial even for a layman; that doesn't bode well as a defense against people who have made cheating their hobby.
And to be clear: I use EAC as the example because to me it symbolizes the 'security theatre' side of the effort. Real anti-cheat efforts exist, and those should be applauded. EAC ain't it, but it's the industry standard... worrisome.
I personally would far rather have the occasional cheater than have the game install literal rootkits. It's absolutely bonkers that people are willing to accept that.
There's nothing "esports" about wanting to avoid wallhacks/aimbots in games like Tarkov, Rust, or Destiny, which completely ruin the entire game for every player in the lobby in an instant. It has nothing to do with "esports" and everything to do with actually being able to play the game. Do you also think it's because of "esports" when you're forbidden from cheating at a game of chess in person? When my friend plays Rust and gets upset because a flying aimbot hacker raids his base, gets banned, and comes back 1 hour later (buying a hot key off some shady 3rd party site), is he thinking "Damn, esports is really ruining this game"? No. The players are expected to fundamentally abide by the same rules. That's what a game is.
Realistically these days, with how expensive most of these games are to run and make, if you do not keep cheaters away it can tank the entire project. E.g. The Cycle: Frontier basically had to shut down because they couldn't keep cheaters at bay, in a system that heavily relies on player count to remain healthy and fun. Once the cheating gets bad enough, people stop playing the game, which leads to a death spiral: it starts with bad queue times, which leads to people playing other games, and that further diminishes the playerbase beyond a point of no return. The Cycle barely made it 12 months, and the result was a multi-million dollar project getting flushed down the drain.
A kernel level invasion of privacy is required to stop flying players? That doesn't sound right to me. Not to mention that apparently it isn't working if your friend is witnessing it.
So players of those games are sacrificing privacy for no security at all by the sounds of it.
I am glad that Bungie is going with fog of war for Marathon. And heck, given the features Marathon is getting, maybe someday Destiny can have those nice things too. We'll see...
I assumed that cheating is way more widespread amongst multiplayer gamers? There is a lot less anonymity in esports and if you get caught and blacklisted.. well you just wasted thousands or tens of thousands of hours.
It's pretty hard to have fun when the server is full of cheaters.
> I assumed that cheating is way more widespread amongst multiplayer gamers?
I mean, hard to call cheating in a multiplayer game the same as cheating in a singleplayer game. The former ruins the experience of others, the latter just affects your own session. Hard to be against cheating in a singleplayer context.
I was thinking about casual and professional online gamers (yet somehow managed to leave out a word in comment...). Of course "cheating" in single player games isn't even a real thing
The Valorant community is incredibly in favor of the Vanguard anti-cheat that loads as an early kernel-mode driver, and the pro/pro-am Counter-Strike scene plays on FACEIT because they have a strong kernel-based anti-cheat. VAC and server-side VACnet just don't cut it.
The only thing incredible is how upset people get when you point out that it's hostile to tell someone the game they enjoy playing is garbage and not worth playing because it has anticheat.
You are conflating ideas. I don’t think it will be a productive discussion to go down the road of anticheat systems and DRM. We can all have opinions that are different.
What is productive is calling out hostile behavior and comments that do nothing but hurt the ecosystem. I see these type of strong negative opinions in a lot of areas of the Linux community. “Oh you do X, that’s stupid you should not be using the product like that”
The best possible, most correct, most defensible, most world-improving advice to give for dealing with a user-hostile product or service, is to have the strength of will to reject it and live without it, and live the example to show that it's possible and you won't die.
Or at the very least, it is AT LEAST as defensible a stance as "The more pragmatic/adult approach is to give the bully whatever they want than to go without their product or service".
That philosophy is not remotely automatically more correct or more adult or nuanced or any of the self-serving words anyone typically uses to try to grant their idea more legitimacy than it deserves.
Calling the principled stance "hostile" is itself hostile.
You can phrase it in a way that sounds emotional and shortsighted and juvenile, and certainly there are many juveniles who are guilty of that.
Nevertheless, rejecting a bad deal is still fundamentally a reaction, not an action; a defense, not an offense.
The publisher promulgating a user-hostile deal is inarguably the offender, the initial hostile actor.
You can decide that the bad deal is tolerable for yourself, but that is entirely your weakness and does not make that policy smarter or more correct than that of those that decline.
I am not here debating DRM or anticheat. I am simply pointing out that telling someone the game they play is garbage because it uses anticheat does nothing but hurt the Linux ecosystem.
You can come up with another essay but I don’t think it disproves what I am saying. Telling someone the game they play is garbage is not increasing the Linux user base. I am sure there will be a retort here, “we don’t want those kind of users or related software”.
Who said "we don't want those kind of users"? The game publishers are saying that!
The people you're trying to criticize are themselves only rejecting the software and the publishers that use it, and for a completely explicable and defensible reason, not because it's the wrong tribal colors or religion.
You are consistently neglecting to acknowledge the basic order of operations and ignoring the initial act and offense in order to focus on a reaction that you don't like and to excuse the initial act that you personally don't have a problem with.
I am saying that you only have the right to say that the deal proposed by DRM and anticheat systems is acceptable to you, not to go one mm further and say that anyone else is obligated to feel the same, or that they are in any way hostile or harming the ecosystem or anything like that if they don't.
Sorry I am not going down this low brow path. We can agree to disagree. I just don’t think it helps an ecosystem to tell people the software they want to run is garbage.
I do. I think it helps the ecosystem more than any other reaction. I'm not sure we can agree to disagree. I don't think you are allowing it, and certainly I am not.
Actually I think "agree to disagree" was exactly right.
It doesn't mean we accept each other's opinions as different yet valid; we still call each other wrong. But the point is it ends there, with recognizing and accepting an impasse rather than progressing to pistolas at dawn, right?
Except wait, my whole problem was never that someone was ok with the deal (pay for a product or service that doesn't serve you), it was only with trying to say that everyone else also has to be ok with it.
Except wait again, that might have been your original point too. Not to put words in your mouth but would it be fair to say that you originally had no problem with someone declining to use some software, but only with telling someone else they should do the same?
> What is productive is calling out hostile behavior
Okay; anti-cheat is user-hostile.
> “Oh you do X, that’s stupid you should not be using the product like that”
Okay, the thing I want is to use a game that I paid for, play it on the machine I own, and run it without giving it any special privileges (certainly not modifying my kernel). I trust that you will support that and not be negative about the way I want to use it?
What are you even arguing? I am not here debating if drm/anticheat is good or bad.
I am saying it’s hostile to tell someone who wants to run software but cannot because of a limitation in the OS that it does not matter because it’s garbage anyway.
The players of the game are willing to put up with DRM and anticheat in order to get the game. By taking a hardline stance against these, the Linux community is being user-hostile.
They are not, but both are symptoms of a consumer-disrespecting mindset.
- DRM does not serve the consumer, but the producer.
- Anti-cheat only serves the consumer if it is well-designed. However, if someone is able to design a game (technically) well, anti-cheat is unnecessary. And if someone cannot design a game, their anti-cheat is often a disservice to the consumer.
I don't like either DRM or anti-cheat solutions, not because I am not willing to pay the producers, but because I have been burned too many times by dysfunctional solutions.
> Anti-cheat only serves the consumer if it is well-designed. However, if someone is able to design a game (technically) well, anti-cheat is unnecessary.
That silly "speed of light" thing? Just design better.
And some cheats happen on a different device. There is no way anti-cheat software can defeat those (even eye trackers are not perfect).
The design question is about software that abuses the game state, which is sent to the client, but not displayed to the player (e.g. wall-hacks), and software that sends impossible input (e.g. speed hacks). Anything that manipulates mouse input is very hard to counter.
In the end, all the technical solutions have limits and you need other means to solve the issue (e.g. play with friends/live events). However, anti-cheat software tries to counter many cases that can be solved by better implementations (e.g. servers that send very limited information to the client).
> However, anti-cheat software tries to counter many cases that can be solved by better implementations (e.g. servers that send very limited information to the client).
And now you can't provide client-side prediction between packets, so you get movement stutter all over the place instead of occasional updates and you get somebody popping into existence because they were behind a wall occlusion on your last packet and you've now strafed into line-of-sight. And they got theirs before you did, so you're dead.
Winning, winning result.
Consider perhaps that the people making this stuff aren't stupid and would try such obvious things if they were practical.
> However, if someone is able to design a game (technically) well, anti-cheat is unnecessary.
Nonsense. It's completely impossible to stop cheaters these days, but anti-cheat technology definitely raises the bar. It's only "unnecessary" if you're willing to accept a large number of cheaters.
Some anti-cheat stuff definitely goes too far, but to dismiss the idea entirely is just naïve.
Back in the day we had admins and communities of people. You'd get to know people more and establish trust. You could have registered brackets and independent tournaments with manual administration and banning for cheaters.
It worked pretty good, but all of that was taken away.
> It's completely impossible to stop cheaters these days
On that part we can agree, and if you think I want to 'dismiss the idea' you completely misunderstood the point. My point is that the cases anti-cheat software tries to solve are cases that a well-designed game has solved from the beginning (e.g. sending limited game state to clients, discarding impossible input, etc.).
On the other side of the coin, I have seen players who cheated even with anti-cheat in place (like you said), there were games I was unable to play via Proton because the anti-cheat didn't work, and games I was unable to play at all because the developers messed up their anti-cheat implementation. So there are drawbacks to a feature that has limited use and for which many cases can be solved by other means.
In the end, there are many cheat cases that anti-cheat software can't solve (e.g. using a secondary device) and which have to be solved by other means (e.g. spectator delays, live events, private servers).
> Nonsense. It's completely impossible to stop cheaters these days
on the user side, it's perfectly possible if you only play online with your friends.
The whole idea that we should be able to play with random people if we all accept to have a kernel rootkit needs to die. Ultimately that's exactly what the NSA and other agencies want you to support.
Sure but most people prefer not to have to spend their lives finding enough friends that some are always available when they have 5 minutes free to play one game of Rocket League.
I suppose you could argue that games could offer an "anti-cheat free" version that can only be used in private matches. But I think you can imagine how many downloads that would get.
Well if cheating is going to make the game almost unplayable the outcome is pretty much the same as you deciding to never install it in the first place due to disliking anticheat systems. So I don't really see the problem.
Obviously, you haven't been in a position where you had to patch the anti-cheat solution yourself in order to play the game you paid for.
Well-designed games offer limited potential for cheaters by design. An anti-cheat software can help to eliminate the little potential that is left, but often games are designed without cheating in mind and some anti-cheat software is put in place to solve all the issues that were produced by the bad design.
I think that there are very few tasks in competitive multiplayer games that humans perform better than machines [1], so I don't think your statement holds true unless you exclude a huge number of game genres or take all the fun out of them. (E.g. no FPSs, or... FPSs with no aiming?)
[1] Unless we're talking about captcha solving competitions, for now, maybe. :)
You're right in that, if your server rejects inputs that are too fast, too precise, too robotic to be human, bots will emulate the top-playing humans ever more closely.
But the question I want to ask is: Is that a problem?
If all the bots and cheaters are playing indistinguishable from high-level real humans, where's the harm?
Or, to quote Westworld: If you can't tell the difference, does it matter?
> If you can't tell the difference, does it matter?
There is a difference in skill-level distribution. If everyone is playing at a highly skilled level, then it's simply not fun and doesn't provide an opportunity to get better.
Anyway, playing with cheaters isn't fun, and if you want to play without them then you need anti-cheat and/or a game that isn't free.
But not everyone is cheating. There will always be enough players that even if you just match players based on their skill level, you'll always have someone at your own level to play with.
In fact, I'd like to see the same bots developed by cheaters be used for NPCs as well.
I don't think it would be very hard to develop AI bots which can "see" through walls or one shot snipe you from 200 meters away while you're running. Why would anyone want that, though?
You missed 90% of my comment, and I'm not entirely sure why.
1. It's easily possible to limit cheaters to the same skill level as the top human players. Send no information to the client that they don't need, prevent super-human reaction times.
2. If all cheaters can do is play at the level of the top human players, matchmaking will automatically balance the game for you without requiring any further anticheat.
3. If cheaters have bots that play at the same level as the top human players, you could use the same bots as NPCs and have much better NPC teammates and enemies in singleplayer.
It's irrelevant. I play a lot of Destiny 2, and Trials was extremely annoying before anti-cheat.
Imagine you're one win away from going flawless (7 consecutive wins) and some asshole jumps in the air and headshots your entire team as fast as the gun allows.
That's not fun. That means you have to start over. You're lucky if you only run into cheater(s) in your first game, so there isn't any progress lost.
This resulted in a very shallow matchmaking pool with large skill gaps because casuals and mid-tiers didn't want to deal with this bullshit.
You're complaining about superhuman cheaters. Again, that's something that can be easily prevented.
And if the cheaters can't play any better than the top human players, there's no harm done. At that point it doesn't matter if it's a cheater that's breaking your streak or a top human player that's doing the same.
> Again, that's something that can be easily prevented
Yeah, with anti-cheat.
I think you have never seen destiny 2 PVP maps. They are small. The majority of times, you see the cheater and the cheater sees you. The difference, between cheaters and not-cheaters - you're dead by the time you ADS.
Uhm, yes, I think it is a problem because unfairly losing isn't as fun as fairly losing or fairly winning. Ignorance about the fairness of a game may work in a few instances but would not scale.
You don't have to reach pro levels, it often only takes small assists to turn a balanced game on its head, ruining someone's experience with a game. Repeat often enough and the userbase will leave, feeling cheated or at least demoralised for being unable to compete or improve.
And allowing machine assists, thus leveling the playing field, turns the game into a completely different one that is (imho) drastically less fun for anyone who isn't interested in (or isn't able to) running/coding their own bot.
Why would you be unable to compete? The matchmaking system will still put you against users on a similar level to yourself. Whether they're your level through cheats or natively doesn't matter.
A player playing CS:GO at 1280×720 at 30 fps with a ball mouse will always lose to one playing at 2560×1440 at 240 fps with a high-quality mouse.
Now there's one more dimension of unfairness. But who cares? You're still going to be winning ~51% of the time, that's why matchmaking systems exist.
> Now there's one more dimension of unfairness. But who cares? You're still going to be winning ~51% of the time, that's why matchmaking systems exist
No. That's not how it's going to work. You'll lose 100% of games against cheaters and then win 80% (or similar) against the lower-level players you get matched with, because your Elo goes down due to the cheaters. Overall, yeah, you might end up with a 50% win rate, but that doesn't really matter.
Of course that would be more pronounced in RTS or other 1v1 games, or team games with a small number of players (then again, nobody would play them anymore because it would just be a waste of time: when you're matched against a cheater, you're forced to waste X minutes before you even figure that out).
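To make that concrete, here's a toy Elo calculation (K = 32, everyone nominally rated 1500; the numbers are purely illustrative) showing how guaranteed losses to cheaters drag a rating down until matchmaking feeds you weaker honest opponents:

    # Toy Elo update after ten straight losses to opponents rated 1500.
    def expected(ra: float, rb: float) -> float:
        return 1 / (1 + 10 ** ((rb - ra) / 400))

    def update(ra: float, rb: float, score: float, k: float = 32) -> float:
        return ra + k * (score - expected(ra, rb))

    rating = 1500.0
    for _ in range(10):              # ten unavoidable losses to cheaters
        rating = update(rating, 1500, 0)
    print(round(rating))             # roughly 1369: well below your real skill

That deflated rating is exactly where the lopsided wins against weaker honest players come from, so the headline 50% win rate hides two bad experiences instead of one fair one.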
You don't seem to understand the situation, and I'm unsure why.
If your anticheat prevents any superhuman reactions, you'll have cheaters that will be indistinguishable from the top human players.
How often do the top human players ruin your gameplay experience as an average player today?
Why would it be any different with cheaters indistinguishable from the top humans?
Matchmaking will just give cheaters a relatively high Elo, so the highest-ranked matches will be cheaters playing against each other with a few of the top human players thrown in, competing at the same level.
While for the average player, nothing will change.
It varies by game. https://areweanticheatyet.com/ is an interesting resource for that because they also track announcements by developers about whether or not linux support is eventually planned.
You're missing the bigger picture. Yes, developers really appreciate that their games work seamlessly on the Steam Deck and Linux with no effort on their part. But there are a couple of knock-on effects.
One is that developers now have a specific hardware + software combo to test their games with. Even if it's the same build they're sending out, they're still testing their game on the Deck and fixing issues, leading to a better (but not perfect) experience for Linux gamers. Here's a video of Swen Vincke, CEO of Larian Studios, playing a game released by his studio on the Steam Deck - https://youtu.be/kzfEkSGa45k. He's very pleased and promises to test future games released by his studio on the Deck. And he stuck to that promise - Larian released several fixes specifically for the Steam Deck to make Baldur's Gate III run better. Linux gamers benefit from that.
Second, this increases the % of gamers using Linux. After the Deck’s success in the last couple of years Linux is at 1.91% of the respondents of the Steam Hardware Survey for Nov 2023. Linux was at 1.15% 18 months ago. Doesn’t sound impressive, but if that growth continues and it reaches 3-4%, at that point developers will find shipping native Linux builds more attractive.
Valve advocates are the ones failing to learn from OS/2's history: "it does Windows better than Windows".
Studios don't care about native GNU/Linux, despite their games already shipping on Android/NDK, the PlayStation POSIX environment, and the APIs available on the Switch OS.
All of those are much easier than porting from Windows/Xbox, almost straight ports if coming from Android/NDK.
Having a desktop OS was a big thing 30 years ago, but now nobody cares anymore. Who interacts with their OS other than launching browsers or apps based on browsers? Not even most coders these days.
OSes are irrelevant these days, and having what is basically a libwindows.so only underlines that.
Pity that Khronos never got the support they needed to make cross-platform raster APIs a reality. I mean really, what an enormous and crying shame that a successor to a highly-demanded API like OpenGL never emerged. It's really quite sad that users never had a corporate champion to resist the allure of a proprietary graphics API. The stage was set for every modern OS to be unified under a new raster library, but the setting was dashed for a petty buck. Quite a tragedy.
Ah well, it's funny to see people complaining, because it really singles out the OS you're using. Windows users have native DirectX, Linux users have near-flawless DXVK, and Mac users... well, Mac users get what Apple gives them, and they have to learn to be happy with it.
Exactly because of my past history with the games industry 20 years ago, and some contacts I still have, I know why studios don't care much better than a random HN commenter ranting about it.
Food for thought: not even the studios targeting Android/NDK care about GNU/Linux, despite both platforms having the same 3D, audio, and device APIs relevant to games.
It isn't the APIs that make them not care about GNU/Linux.
Totally, duly noted. Do try out Digital Combat Simulator when Mac figures out the whole DirectX thing though, it's a must-see on recent machines. Ciao!
Valve inventing a portable game runtime that just works on all Linux distros, without game studios needing an entire department to handle the dependency hell of Linux NIHism, would solve that issue.
Does it actually work, though? Ironically, my experience is that the Windows API + Proton is a more stable target than anything Linux-native. Even Valve doesn't always get it right when shipping Linux versions of their own games. See https://steamcommunity.com/sharedfiles/filedetails/?id=30358... for example.
I wanted to use SteamOS for our LR PC, our kitchen ambiance PC, and our MBR PC, but instead installed Ubuntu (upgraded to Kubuntu) and then disabled Snap, because SteamOS, which runs KDE (a great call by Valve), is built on Arch (a bad call IMHO).
I'd be curious to hear why. Arch deserves its reputation for poor stability, but for Valve's application, with OSTree and an immutable root, it should work fine. Users who don't want to tinker get a quality first-party experience with smooth upgrades. Users who do want to tinker are largely funneled into using Flatpak or AppImage, which are much more stable than AUR packages.
Can we please stop with the FUD around Arch and poor stability? It's an old meme that will never die, but it has no basis in reality. I've been using Arch on my personal and work laptops for probably 7 years now, and the only times it has been a problem have been due to layer-8 issues, i.e. me doing something stupid. I certainly wouldn't be using it for work if it were unstable.
I used to run Arch on my desktop for a while because it was the closest thing to FreeBSD. I hadn't used Linux on the desktop before (well, I went through a few distros within a month).
Never had any issues with updating or using it, but that's because I set up everything myself and never touched the AUR. Still nuked it in favor of FreeBSD and then NixOS eventually, though.
This is true; I guess it's very dependent on the use you have for the machine. In my case I migrated from Ubuntu to Arch on my work machine when I had a week off - obviously I needed it to work, otherwise I would have been in some trouble the following Monday.
It's not FUD. If you stay very light then it is very stable, but the more stuff you add, the worse it gets (GNOME extensions, anybody?).
I love Arch, but it is a demanding mistress. If you get behind on updates, you're asking for pain. It can also be very disruptive to suddenly get a new major version of GNOME that breaks extensions you used, or applications, etc.
What we should say instead is not that Arch is "unstable" (I agree it's not), but rather that Arch requires a lot more care and feeding, and if you don't provide that, it can lead to instability.
I used Arch for years, and left it due to poor stability. Every time I tried to use an AUR app it would be broken and need reinstalling. Sure, the non-AUR stuff was mostly fine, but a lot of necessary applications are in the AUR, and the AUR is touted as a major selling point of Arch. When there was an issue during a system update, recovering the system was a mess. I also can't call it stable when you can't update one application without updating the rest of the system.
I switched to Gentoo and it fixed all the issues I was encountering with Arch, and was more stable. Now I'm on NixOS, which is far more stable than Arch or Gentoo were.
Now, that said, the way SteamOS uses it, I don't see any issues. With an immutable system, A/B updates, and tested images, the compatibility and update issues are solved. Using flatpak for user applications solves the rest of the noted issues. Would be ideal if I could install with Nix instead of Flatpak, but ran into some trouble there.
Counterpoint to this: I have many packages from the AUR and I've never had any issues like you describe with them. Our viewpoints are polar opposites, but each is only a single data point.