Hacker News
Valve is paying open-source developers to work on Proton, Mesa, and more (reddit.com)
630 points by raybb on Dec 17, 2022 | 278 comments



Linus at DebConf 2014 https://www.youtube.com/watch?t=310&v=Pzl1B7nB9Kc&feature=yo...

>I'm on record saying that maybe Valve will actually save the Linux desktop. And it's actually not because I think games are important! I don't care, I don't play games. I think some people do, so games may be important. But the really important issue is I guarantee you Valve will not make 15 different binaries. And I also guarantee you that every single desktop distribution will care about Valve binaries. The problem, though, is Valve builds everything statically linked and it creates huge binaries. And that's kind of sad, but that's what you have to do right now.


I'd rather have static binaries than a flatpak/snap for "universal distribution", personally.


Given the choice I prefer native package -> AUR package -> AppImage -> Flatpak, in that order. (snap, you'll notice, is absent)

Arch is pretty good at making the native packages work, so that's always ideal. The community is generally pretty good at competently packaging AURs, and they'll do it for slightly less free things, so that fills a lot of gaps. AppImages and Flatpaks are great when they work, but opaque and difficult to diagnose when they don't :/ And snap is just... unpleasant. Its insistence on auto-updating separately from my package manager has really turned me off of its whole ecosystem. No thanks.


Why AppImage over Flatpak? I like Flatpak's ability to easily restrict binaries' network access. Probably possible in all of the packaging formats, but limiting permissions feels like a core piece of Flatpak.


Because AppImages need no daemons or plumbing. They are self-contained binaries which include everything they need to run.

Running a big infra to run a single binary once in a while feels like a burden to some of us.


Flatpak also doesn't need daemons beyond what the app is using.


Yet you need to install the flatpak package and its dependencies to run Flatpak applications. Not to mention the flatpak package contains four .service files, too.

They might not be triggered and launched all the time, but this mode of operation is different from .appimage files, which are just ordinary binary files you run directly.


flatpak > appimage for security and updates.


Well, you can statically link stuff in a flatpak… a distribution package, however, gets automatically rebuilt when a dependency changes, if it's statically linked.


sorry, is this an argument that appimage has advantages over flatpak for security and updates? could you elaborate?


No, I'm saying that Flatpak can be as bad as AppImage, because nobody checks.


Snap really sucks because it forces you to have a /home/user/snap folder on the same filesystem as /. If you don't have that (e.g. home folder symlinked to another filesystem), then snap starts breaking in very weird ways.

https://forum.snapcraft.io/t/limitations-in-snapd/9718

This actually means that at some point I will have to move away from Ubuntu.


Debian is very nice these days if you don't mind enabling non-free repos. Or Pop!_OS for a more It Just Works approach, if their desktop suits your fancy.


Right now I'm stuck with Nvidia's Jetson line of products, which is vendor-locked to Ubuntu. The problem is that while you can extend them with a large SSD, these products have a really small disk used for /. I could run another OS inside a Docker container, all to avoid this limitation in snap, but of course things would not become prettier.


You've been able to run the root on (NVMe) SSD for a while now with Dusty's scripts, and more recently with the official installer.

I have great sympathy though, I am also using Ubuntu on Jetson and resent snap, especially since the new releases are 20.04 which have it more tightly integrated.


You can change the root parameter in extlinux.conf to point to whatever block storage you want it to.

Or reconfigure U-Boot to load extlinux.conf (and the kernel image, initrd, etc) entirely from other type of media. IIRC NVMe, USB and external SDs are all supported
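For an idea of what that looks like, here's a hypothetical Jetson-style extlinux.conf entry with root= pointed at an NVMe partition. The device names and image paths are assumptions; check your board's boot docs before copying anything:

```
# /boot/extlinux/extlinux.conf (hypothetical entry)
# root= changed from the internal eMMC to the first NVMe partition
LABEL primary
      MENU LABEL primary kernel
      LINUX /boot/Image
      INITRD /boot/initrd
      APPEND ${cbootargs} root=/dev/nvme0n1p1 rw rootwait
```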


Thanks, I'll have to look into it.


I don't see anything there about needing to be on the same filesystem, just that the path is of the form /home/$user/snap. It says that having a home directory that isn't mounted at boot doesn't work, but that's different. Am I missing something?

I'm not sure why you'd need to symlink /home/ to another filesystem instead of just mounting that on /home/ directly; that's what I'm doing with root on ZFS and a separate dataset for /home/ with its own mountpoint.


Snap also stores stuff in /var/snap, and that probably also has to live on the same filesystem.

Anyway, now I'm working around limitations in Snap instead of Snap working around its own limitations in certain circumstances. The fact that Snap shows weird behavior and illogical error messages in these cases tells me that Snap is not (yet) up to the task of fulfilling this rather fundamental role in my OS.


Why don't you just mount your big disk on /home, like we used to do in the olden days?


Because I suspect that snap uses both /home/user/snap and /var/.../snap, and needs them to be on the same disk (to allow hardlinking, I suppose).


Have you tried it?

Sounds like something a bindmount might fix.

Back in the olden days before simple and convenient packaging systems like snap had been invented, we had to get pretty creative so that stuff didn't freak out because it was split across filesystems.
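For the record, a bind mount like that is a one-liner in fstab. A hypothetical entry (all paths assumed, untested against snapd itself) would keep the real directory on the root filesystem and bind it into the home directory:

```
# /etc/fstab: real dir lives on /, bound to where snapd expects it
# (paths are hypothetical; adjust to your layout)
/snap-home/user   /home/user/snap   none   bind   0 0
```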


I haven't tried it, but I can remember people on forums having mixed results with a bindmount. I guess there is no guarantee that it works until Snap has been running for a while without problems.


Either static or dynamic is fine; package management has to be solved at another layer, a la Nix.


Nix is effectively transforming dynamic binaries to static ones. Might as well skip the step, stop pretending that drive space is expensive, and link everything statically.


No. The Nix approach lets you retain the reuse, and it can be extended to allow quick, unified security patching as well (which has been done already in the Guix project).

And if you really want a single redistributable binary, any existing package can easily be bundled into one without recompiling it.

Traditional static linking is strictly worse.


Huh, never heard it put this way. I guess it's true though.

Anyone disagree?


Farid Zakaria recently gave a talk¹ (at NixCon 2022) where I think he put it better: Nix (and Guix) blur the line between dynamic and static linking.

I think this is true, because with Nix and Guix, you actually have an end result with the positive characteristics of both kinds of linking.

His talk, based on his work on shrinkwrap² and nix-harden-needed³, called for the creation of a new executable format for the Nix world. Seems apropos as a reminder that there are other possibilities!

(I think these kinds of solutions are definitely within reach for Valve from a technical perspective. Other tools are using them today.)

--

1a: https://youtu.be/HZKFe4mCkr4

1b: https://youtu.be/lzk4sldexX4

2: https://fzakaria.com/2022/03/15/shrinkwrap-taming-dynamic-sh...

3: https://fzakaria.com/2022/09/12/making-runpath-redundant-for...


What about completely optional dependencies which might get loaded at runtime based on some dynamic criteria? Dynamic libs are not primarily for space saving. Also, there are thousands of other packaging problems not solved by static binaries, so yes, I do disagree with grandparent commenter :D


Conceptually that's kind of true, if you only consider the end result for a given binary, (since it would load only shared libs as if they were statically linked).

But multiple derivations can share dependencies, which does save space and also makes it easy to identify dependencies of a given derivation, where statically linking would make that opaque.

Also, the story goes way beyond binaries, as dependencies often carry additional resources not bundled into the binary.


RAM is still expensive…


appimage > flatpak > snap


I agree: static binaries, or at the very least something akin to Nix, where every binary has a pinned dynamic lib.


Please, $DEITY, deliver us from this hell wrought by Canonical.


Unless we’re talking about embedded systems (and most of the time, not even then I’d guess for portable gaming devices for example), what is the actual drawback of these “huge” binaries?

Wasn’t the reason to separate binaries and libraries in the first place to conserve disk space? They are then clearly versioned to signal that you can’t expect just any version of the library to work with the program you wrote.

The trade-off in using more disk to contain “an application” to “a binary” seems fine to me, if you’re not sacrificing anything else.

The only drawback I can think of is that it makes very minor patches to the library, like security fixes, impossible, which is a nice thing to have I guess. But on the other hand, you're now giving the library maintainer the capability to crash your application, and since such a patch would presumably go through some kind of review and rebuild of the application anyway to avoid crashes against the new library code, why does it matter if the lib is a part of the actual binary or not?

Is it really down to nothing but bandwidth and disk usage, or am I missing something here? Because if it is, and “blob binaries” becomes the norm, I’m sure you could still figure out partial patches somehow to conserve bandwidth and IO, while still keeping the binary essentially a built monolith.

Maybe one argument is that abandoned programs can still be “patched” in a way because a library it depends on fixes an issue that library was responsible for, but doesn’t that seem pretty risky to begin with to run software that is no longer being patched?

Indeed what constitutes “software” just seems to be multiple pieces in one case, and one piece in the other, no?

The biggest case against “vendoring”, which in my mind is similar to “huge” binaries, that I can think of at least is that it gives developers a false sense of security to not have to think about keeping libraries up to date: but aren’t they already equally able to be sloppy about this when the libs are separate?


If we're talking about binaries for game titles, meh. Load time for the executable is utterly insignificant compared to the load time of a title's assets. Heck, shared libraries probably slow things down.

Typical games I have experience with were in the 5–20 MB range for executables, with tens of GB for assets. Most titles I looked at the performance of were terrible at scheduling I/O, often leaving 80% of their available disk bandwidth unused.


Shared libraries can reduce load time as they might already be in page cache. There’s also potential for a small impact in reduced instruction cache misses.

On the other side, link time optimization on static libraries allows deletion of unused functions and more aggressive inlining, so you might get that performance back and more.


People say this, but every test case I've seen has proven otherwise. Static binaries load much faster. Maybe it depends on the size of the libraries involved and the quality of the OS's dynamic linking machinery (supposedly OSX has better runtime dynamic linking startup times than Linux), but at the very least it is conditional, not a pure win.


Not only that, but executable images are demand-paged anyway. You can link 1 GB of static library code if you like; you won't pay the penalty unless and until those functions are actually called.

The whole debate is just stupid. If it spares even one user an inconvenience, then static linking is the way to go, because of the sheer number of drawbacks that shared libraries bring. Nobody cares how big the executable is. If you do, well... the best time to stop caring was 10 years ago, and the second best time is now.


But the called functions need to be loaded into memory for every process using them, instead of once.

Also there is value in being able to fix a library without having to recompile everything.


> Also there is value in being able to fix a library without having to recompile everything.

Number of times I have benefited from this process: 0 (AFAIK)

Number of times this process has hosed previously-working applications and components: too many to count

On Windows, this "feature" was called "DLL Hell," and was generally considered a bad thing. Nothing has fundamentally changed since it was a common practice for every installer to have its way with c:\windows, IMHO. (Admittedly shared MSVC runtimes aren't as big a problem as the Winsock-provider-du-jour was, but still... there is simply no point.)


> Number of times I have benefited from this process: 0 (AFAIK)

Every day on my distribution, when I have to download 5kb of library instead of 3GB of everything recompiled…


Everything is getting recompiled all the time anyway. At least with FreeBSD, any time a dependency is updated, all the packages that depend on that package get rebuilt. There's no concept of "this was a small update and its dynamically linked", its just "the version string changed, so rebuild everything that lists it as a dependency". This is particularly "fun" for web browsers because they have a lot of dependencies, ensuring they're rebuilt often, and they take a long time to build.


> Everything is getting recompiled all the time anyway.

Source? I know when stuff on my distribution gets recompiled because then I have to download and unpack it, so I notice.

I can assure you that it isn't as you say.

> with FreeBSD

Despite the monorepo thing, I assure you that most FreeBSD users do use software coming from other places as well.


If nothing else, you probably see several megabytes' worth of JIT compilation whenever you visit a JavaScript-heavy web page. Same effect... namely, none at all.


Lol sure… JIT only compiles the hot paths, not the whole thing. It certainly will not produce several MB of output without my computer sounding like an Ariane 5 launch


> Unless we’re talking about embedded systems (and most of the time, not even then I’d guess for portable gaming devices for example), what is the actual drawback of these “huge” binaries?

The functional package management/build system design instantiated by Guix and Nix shows that you can have the best of both worlds when it comes to static and dynamic linking, even without any form of containerization. Guix in particular has a feature called 'grafts' which implements a kind of hot patching for libraries in programs which are nominally dynamically linked but which in fact never search for libraries other than the exact versions they were built with.

So there's no need to choose between the portability of static linking or the efficiency and flexibility of dynamic linking. You can get all of those things together, today, for many thousands of packages.

If you're not familiar with the basic outline of these build systems, the idea is basically this:

  - every package gets installed to a unique, quasi-content-addressed location
  - each such location has its own FHS-like tree which gets treated as a prefix for that package at build time
  - every package gets manually pointed to each of its dependencies' unique, full paths at build time
This ensures that each package always tries to load the libraries it was built with, and allows multiple versions of libraries to safely exist side-by-side, even when everything is dynamically linked.

To ship a package, you then just ship the whole dependency closure, which gives you the portability that motivates people to distribute static binaries.
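As a toy illustration of that layout (this is just the shape of the idea; Nix's actual hashing is over full build instructions, not this simplified scheme), a sketch in Python:

```python
import hashlib

def store_path(name, version, inputs):
    """Toy Nix-style store path: a hash over the package's name,
    version, and the exact store paths of its build inputs."""
    h = hashlib.sha256()
    h.update(f"{name}-{version}".encode())
    for dep in sorted(inputs):  # inputs are themselves store paths
        h.update(dep.encode())
    return f"/nix/store/{h.hexdigest()[:16]}-{name}-{version}"

glibc = store_path("glibc", "2.36", [])
zlib_old = store_path("zlib", "1.2.13", [glibc])
zlib_new = store_path("zlib", "1.2.13.1", [glibc])  # patched release

# Two versions coexist side by side: different version -> different path.
assert zlib_old != zlib_new
# Same name, version, and inputs -> the same path, reproducibly.
assert zlib_old == store_path("zlib", "1.2.13", [glibc])
```

Because a package refers to its dependencies by these exact paths, upgrading a library produces a new path and rebuilds (or grafts) its dependents, rather than silently swapping code under them.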

Here's some info on Guix grafts, for an idea of how security updates can still be managed in such a system.

The official docs for grafts: https://guix.gnu.org/manual/en/html_node/Security-Updates.ht...

Original blog post/announcement for the feature: https://guix.gnu.org/blog/2016/timely-delivery-of-security-u...

Detailed writeup of further implementation details: https://guix.gnu.org/en/blog/2020/grafts-continued/


> what is the actual drawback of these “huge” binaries?

The problem isn’t size. The problem is that with static linking it gets significantly more difficult to patch security vulnerabilities, because with dynamic linking, the vuln is gone once the underlying library is updated, however this wouldn’t happen with static linking.


You need to be regularly updating the higher level packages anyway though, as they might contain vulnerabilities that are not due to underlying libraries. At which point, we might as well update them for library issues too.


But then you're relying on every single higher level package owner to pull in patches for all their dependencies and update their binaries in a timely fashion. That works for, say, Debian packages (my distro of choice), but as soon as you move outside of that circle it'll make it harder to guarantee quality.

In contrast, on (for example) Debian, I know that the Debian Security Team is looking after the libraries and that gives me a lot of peace of mind. Instead, I'd be wondering if every single third-party apt repository package owner is doing a professional job. At least if they're shipping binaries that are dynamically linked to Debian packages, I can have confidence that those library issues are being handled. (XML parser update, anybody?)


Many games rarely update, though. They put out the original version, and then maybe a few patches to fix game-breaking bugs, and then that's how they exist from then on. If you're expecting them to update to address vulnerabilities, you're going to be very disappointed.

That said, if the library changes the API or functionality at all, dynamic linking is going to break the game, and that's not good either. So gamedevs are going to opt for static linking and call it a day rather than deal with support requests.


For single-player games, I doubt it matters if there are unpatched vulnerabilities.


Many single player games do still phone home or otherwise cause network activity, so it’s not like they are guaranteed a safe bubble in which to run.


That's true, sadly.


This is only a problem if the tooling to track dependencies doesn't exist.

Many programming models (e.g. monomorphization as done in C++ and Rust) are fundamentally incompatible with dynamic linking. The tooling needs to handle those cases anyway.


I just checked, and I have 5600 binaries in my /usr/bin. They average 100 KB each. If statically compiling them means an average of 50 MB each, that comes to 280 GB.


You're arguing against a strawman saying that no binaries should ever be anything but statically compiled, including system binaries bundled with the operating system. I certainly did not say that.


A tiny bit off topic, but for the record: video games are, for me, extremely important. They are my favorite art form and an oasis in a daily nightmare; if I were unable to play my favorite games, the system I am running would be completely useless. Fortunately, the kind of games I like run mostly fine on Linux.


Why would I use Linux Desktop over Windows Desktop? Because it's FOSS, because of price, UX? I don't get it.


I keep Windows around for Serif products and (currently, due to a Mesa bug) AOE 4. I use it for nothing else.

The amount of popups, nags, helpful hints, setup your system (which is nothing more than an interstitial to reset your browser to Edge+Bing) and other bullshit I get when logging in is mindbendingly infuriating.

This is after only being mostly away from Windows for a year. It's a shit product. I was merely a happy toad in some hot water. Things are much nicer once you actually climb out of the pot.


All of the above.

It's free (along with everything in the "app store"), it's stable, it isn't infested with ads, and the browser and word processor work exactly the same for all normal users. The OS shouldn't matter much to regular people who just want a web browser and maybe a word processor.

It's really easy to use now. The installation process is far less arduous than Windows and you don't have to sign up for a Microsoft account or get nagged about not doing so.

I would really struggle to imagine someone having trouble doing all their normally-Windows-based (filesystem, office suite, web browser) work on a mainstream distro now, even for the most technically illiterate user.

If you're a power user though, it's like moving from a Corolla to an F-35.


> Why would I use Linux Desktop over Windows Desktop?

For example it doesn't keep installing "apps" that are just ads for netflix/spotify/tiktok/whatever without me ever having wanted them to appear.


It's like me asking you why I would drive a Mercedes over my BMW. How would you know? All you could do is list a million things that are different on a Mercedes compared to a BMW and then claim every one of those differences makes a Merc better.

Is that what you expect to get as replies to your question?


For a casual user who uses a computer for work and gaming, I would go with Windows. People say Linux is free and open source, more secure, and you get privacy, but is it really worth it? Idk.


To me, Linux means having the option of a system that lacks distractions. That's a drastic UX improvement for me personally; IMO Windows UX peaked with 2000, and XP was nearly as good.

I haven’t played much multiplayer anything (a little MTGa, among us, tabletop simulator and beat saber) but none of the single player games I’ve tried on Ubuntu have had any issues; just open steam and press play.

Windows started pushing F2P games in the start menu, changed UI toolkits twice in two generations, etc; it's become unpleasant for me to use. Apple has similarly dropped the ball on UX in the past 5 years. IMO Linux UX has caught up thanks to those companies' failures.


What are some DEs / window managers you think are quite simple and let you focus?


I’m using classic i3, with a nearly stock config. I have a key binding to start a launcher (in lieu of a start button).

I3bar to indicate which workspaces have new chat messages etc.

Default black background, all non-critical notifications disabled; I get distracted easily, so I craft an environment that avoids distractions.


simple window manager - how can you say i3?! most people wouldn't know how to do anything in there


Most window managers are easy, but not simple. i3 is simple, but not easy.

In general, I'd prefer to have something both simple and easy, but in most cases that's not an option.


Xfce is nice.


I was a long-time Windows user, since the 3.11 days. Windows was originally very much "you do you". Outside of domain-joined machines, updating was much more of a suggestion. Things would update when you restarted/shut down.

A basic Windows install had some basic tools and utilities, but they largely didn't push things on you - and once you had set preferences, that was that.

Then Windows 8/10/11 came around.

Ads through the UI. A half-dozen or more sponsored bits (both 3rd party and their own other products). You'd log in one day and find they'd installed some other piece of crap. Features were disappearing behind the Microsoft Store. You'd try and set up a machine, and it'd play stupid games about "Oh, you need a Microsoft Account to log in", hiding/removing options to skip if it detected an internet connection.

You'd install Chrome or Firefox, and it would beg and whine. They changed up how you set browser preferences twice, to deliberately make it harder to switch. It would "helpfully" reset browser preferences back to Edge "for security".

Updates were force-installed, regardless of whether you wanted them. I remember being at a conference and hearing from maybe a half-dozen presenters who'd all had Windows push an update during the conference. For some it was just before they were about to present. Some had their machine sitting in its various phases of Updating for an hour or more. Some had Windows reset a bunch of other preferences/configuration, and fuck up demos they had planned.

For me, the security and privacy benefits are secondary to "Let me use the machine that I'm paying for".

Ubuntu/Pop_OS might prompt me to install updates, but I've never had it tell me I have 15 minutes before it is going to force applying updates, 5 minutes before I'm supposed to join a meeting. Ubuntu might have some sponsored bits, but they're gone when you remove them.

The end of Windows 7 was the end of me using Windows, because I just can't trust the OS to do what I want, when I want.

This isn't just me, either; I've converted three other non-techie folks over to Ubuntu too. All they did was stuff in Chrome and the occasional bit of printing, and they had similar issues with Windows forcing changes on them that they didn't want. Multiple times I've had to reset browser settings because Windows decided that Edge was better. So Edge imported their contacts, but not the Chrome apps, and couldn't sync passwords to the Google Passwords service.


didn't valve mostly mitigate the fat binary problem with their soldier runtime set of shared libs?


Yes, and then they realized that even that wasn't good enough because Linux Desktop ABIs are a shitshow, so they refocused on providing Windows ABIs instead to great success.


huh? When I run a Linux native game via steam it uses Windows dependencies? What is this dark magic.


I think they meant Valve switched their focus to emulating Windows through WINE, practically adopting Windows' APIs, instead of solving Linux's dependency problems and pushing developers to compile for Linux natively (something they've always been reluctant to do).


I'm going to go out on a limb and say that, excluding work-related stuff, gaming is very important to those who own a Windows PC.


Space is cheap for now, so I'll accept that


"I don't play games, I think some people do."

Multi-billion dollar industry, raking in higher profits than Hollywood, and at this point an unregulated online casino for teens.

Yeah, some people do...


Well, yeah. His comic timing's not the best, but I think that was meant to be the joke.


Yes, very dry, if not awkward, humor. I actually think Linus is quite funny in most of his interviews.


Would have never guessed it was meant to be a joke.


To be clear, you're joking right now, right? I think you must be, but given your previous comment I'm not sure. Dry sarcasm in text is hard...


No, I am not. Linus is not exactly an average person; it seems perfectly reasonable to me that among his many eccentricities would be the belief that only some people play games on a PC.

Maybe I have been reading too many of his emails.


You're not alone. I wasn't able to discern that quote as humor either.


He's aware that video games exist, so obviously he knows some people play video games. I don't really see how you could confuse this.


I'm one of the developers contracted by Valve to work on gamescope, wlroots, Mesa, the kernel, Wayland, etc. Really happy that my contracted work helps not only SteamOS, but very often the whole ecosystem. Examples include radv/amdgpu fixes, tearing page-flips, a re-usable library to make use of KMS planes, and the list goes on.


Can I ask how it's decided what you are working on?

Given the variety of the projects you've mentioned it could seem that you have some degree of freedom, but I might be mistaken.


Yeah, there is a lot of freedom, it's pretty similar to how an open-source project works. Pierre-Loup indicates what features would be nice to have (in gamescope or other projects), I can pick from that list. Or I can pick my own ideas and work on these. Or I can discuss with the other contractors, come up with an idea and coordinate our work. Sometimes I work on short-term bugfixes, sometimes I work on long-term plumbing efforts.


I assume they're referring to Pierre-Loup Griffais, a Valve engineer dedicated to the Steam Deck and Proton.

https://twitter.com/Plagman2


That all sounds great. I must say I'm quite envious of your job. Getting paid to work on something that benefits the whole community, and with a great level of freedom!

Unfortunately my skillset as a senior full stack dev seems rather irrelevant for jobs in and around OSS. It appears those jobs are mostly C and C++.


is it public what they're paying these contractors?


I recently interviewed for one of these open source dev positions (Proton). From what I could gather from my conversations with the hiring manager, it's somewhere between 120K and 160K for C devs.


No, but why would it be. I'd expect a competitive pay.


Thanks to Valve, Linux is the single OS on my home computer. Most of the games from my medium-sized Steam library are working, and the few that do not work today will probably be fixed in the future. I should mention that I do not play new AAA titles or multiplayer games, but I enjoy older games, and some of those are broken on modern versions of Windows.


Single player gaming, indie gaming, and retro gaming are all gaming niches where I'd say that a smooth Linux experience with widespread support is 'already there'.

It's really only the (crucial) segment of big, trendy multiplayer titles where support is still really hit-and-miss. If gaming is essential to your social network, Linux may not be ready for you to entirely switch over.

But if you're someone whose social hobbies lie elsewhere who enjoys solo gaming, there are way more excellent Linux games ready for you to enjoy than there is time for you to play them.


> It's really only the (crucial) segment of big, trendy multiplayer titles where support is still really hit-and-miss.

I was disappointed when, post-purchase, Psyonix dropped official Linux support for Rocket League. However, the Proton support has been rock solid, and I've yet to encounter any crashes, even after updates. The Linux binaries occasionally had regressions after an update.


Yes, the only reason I keep a windows install around at all is for MW2, Tarkov and Hunt: Showdown with the boys.

From memory, Valve has also been supporting anti-cheat providers to port to Linux (or perhaps I have misunderstood and they will also run under Proton?), so there’s always a chance this will be solved in the next couple of years. The Steam Deck has been a great catalyst for that within Valve, it seems.


I think some publishers are reluctant because on Windows, those anti-cheat systems involve out-and-out rootkits, so allowing Proton support for them is seen as a relaxation of their 'security'— the Linux ports of those systems are userspace only, Proton and Wine don't have that kind of kernel access anyway, etc.

I think if, over time, it looks like those fears are unfounded and there are few practical problems with anti-cheat for competitive games on Linux, then probably some currently-hesitant publishers will relent and enable their games on Linux. But only time will tell, I suppose.


AFAIK, from what I have seen with the Steam Deck, the problems come mainly from old titles that are probably not a priority, or big new titles with anti-cheat or Denuvo.


I played all of Cyberpunk shortly after release on a 3950X and 3060 Ti on Steam+Proton+Ubuntu with very few problems. It did get very laggy in busy areas on occasion, but that was the exception, not the rule. Obviously newer hardware helped, but it seemed Valve hurt the gaming experience less than CD Projekt Red did!

Not really a gamer. Cyberpunk is probably my high water mark at 90 hours. It's so nice to not to have to make major concessions to run Linux.

Valve you have earned a customer with me.


That's really impressive, considering Cyberpunk had so many issues even on officially-supported platforms


Yeah same here, it was sort of satisfying watching all my nvidia/windows buddies whine about bugs while it seemed to run buttery smooth on my amd/linux machine after they all shilled windows/nvidia "ok man good luck gaming on that setup, i hope it works out". It did lol, i gained frames!


For a while Elden Ring ran better on Linux, due to game-specific patches on DXVK that fixed the stuttering seen on Windows.


I really wish they would take some time to make their VR hardware work properly on Linux. Right now it is pretty much just that holding me back. It is sad that they are the company that gives a shit about Linux gaming and the only one that has a working VR headset for Linux, but only barely.


Same, I got interested in trying it a couple years back, and Steam's support for games is what made it seem so accessible. I enjoy a mix of smaller games and AAA gaming, but even its AAA game support is impressive: I can play a good chunk of the CoD games and have been playing the recent Battlefield and Battlefront games with no issues. I've also been playing RDR2, GTA, DOOM, Borderlands, Cyberpunk, Shadow of War, Apex Legends, and many more with only a couple issues every once in a while, which is a tradeoff I'm willing to make. The only games I miss are Fortnite, Dead by Daylight, and Rainbow 6 Siege, which I end up playing with friends using Nvidia Streaming.


I guess I struggle to understand why Linux and Windows are always pitted against each other. At this point, they're both good for different things and dual booting is relatively easy. Storage is cheap and Linux is small. I've been dual booting for the last 5 years and I'm totally happy. I use Windows for gaming/media and Linux for development. I wish everyone would stop trying to fight this war as if only one operating system is allowed to exist.


I had good luck even with newer titles. Right now I'm playing Red Dead Redemption 2, and it works like a charm, out of the box. Same flow as in Windows, even.


If anyone from Valve is reading this: I'm not a gamer at all, but I bought a (fully loaded) Steam Deck largely to support your efforts, and secondly to have a cool portable computer. It is (in about a week) going to become a game console hooked up to my TV. I could have gotten an Xbox for much less money, but the openness, power, and Linuxy-ness of the Steam Deck was too attractive to pass up :-D

Please keep it up! you're the heroes we need in this fight.

P.S. I'll be buying games from your store also as my kids are gamers


Has anyone successfully run the Steam Deck's version of Arch on a PC for gaming+dev? I'm looking to do this, but I'd love to hear anecdotes.


You can simply use the Steam Deck's UI on any Linux system, and it works great on AMD. Hardware acceleration on Nvidia is somehow still super messed up, so not recommended for those just yet.


HoloISO is what you're looking for: https://old.reddit.com/r/holoiso/


I hope your kids are not expecting high end gaming experiences and are happy with simple indie games (which usually are the better games anyways), because the SteamDeck cannot and never will replace something like an XBox.


The Steam Deck comfortably runs high-end AAA games like God of War, Horizon Zero Dawn, and Cyberpunk 2077, in many ways with performance comparable to the base Xbox model.

Which makes sense since the hardware is similar.

So I have no idea what you're talking about, it can easily replace Xbox.


Eh, for me it’s largely replaced my extremely beefy gaming computer. I use the Steam Deck for so many games, including AAA titles like the Final Fantasy VII remake. It’s a shockingly good little device.


This is a small selection of the games I’ve recently played on the Steam Deck and been surprised by how flawlessly they performed and played:

- God of War

- Elden Ring

- Final Fantasy VII Remake

- Forza Horizon 5

- Jedi: Fallen Order

- Spider-Man Remastered


This is one of those statements that are going to sound hilariously wrong in a few years, when third-party OEMs will be able to put SteamOS on their custom hardware. The hard part was making Linux run games, and Valve has done it.

Xbox, PS and Nintendo won't die, but there will be a fourth player, and if anyone has to die first, it'll be Xbox, due to shrinking mind share and lack of third party titles.


EDIT: I meant first-party titles. The list of exclusives on Sony and Nintendo consoles demolishes Xbox right now.


Hate to say this, but if the Steam Deck becomes too popular compared to desktop Linux, that's an incentive for the company to make the software more closed again (not anyone's fault, just how the world works). Better to buy only the games if you want to support them.


I bought Steam games and a Steam Deck just to support Valve... and I'm not really a gamer. However, the Deck has turned out to be a pretty good computing device.


I was thinking of doing this - how do you like it? How's coding on it?


Doable on the device itself (although I wouldn't bother without at least an external keyboard).

Full KDE desktop based on Arch (/ is read-only by default, but this can be disabled)

It really shines when plugged into external monitors/docks. In particular, I have a couple of USB-C displays that provide USB hubs - I plug a single USB-C cable into the top of the thing and have a great little mini-workstation machine.

Does 4K displays just fine and is very responsive for the size. It's basically replaced the Linux laptop I would drag along on trips. If I need a workstation it suffices, and I can plug it into hotel room TVs really easily to watch whatever I want.

Basically - hard to argue with the value proposition given it only costs $500 and I get a great little portable workstation. The fact that it does an excellent job with games is just cream on top.


Not op, but I bought one. It's essentially a Linux laptop with a control interface built in. For emulators and steam games, it's really nice to have an option to play on the tv (with the dock). I don't see it being an especially nice coding experience; you'd need an external keyboard, mouse, and monitor. And without a dock, all it has is a single USBC port. Though it has a full KDE desktop, so it should be feasible.


yeah without a keyboard it sucks. it's like trying to code via SMS lol.

But the dockability makes it pretty damn awesome. You can code in first-class style by docking it with a monitor and keyboard, and when you leave you can take it with you and have a great portable game player, movie watcher, or web browser with good battery life. The built-in screen is pretty good. I was impressed.


I'm using one from time to time to do some programming on minor projects. Works pretty well, although it's obviously not as fast as my laptop or desktop.

But it's pretty lightweight and handles games well, so if I go on trips somewhere, sometimes I just bring the Deck + small keyboard to do emergency fixes on, and for that, it works very well.


The deck plus the OEM dock with mouse, keyboard and monitor makes for a nice, cheap Linux desktop that you can grab-and-go to play games. The dock is really seamless, always works right away. If you're programming, I wouldn't recommend Deck without the dock.


If you're going to properly sit down with it somewhere it works fine. But you're going to want external keyboard/mouse, and most likely a bigger display. You'll need a hub/dock if you want to plug in more than one USB-C device.

Honestly a laptop makes a lot more sense in general for that still. You just pull it out and everything you need is there. But if the versatility of the device is appealing to you then it might make more sense. I have used it for traditional computing with a keyboard/mouse and portable monitor, but it is quite cumbersome to do so. But I value it much more and primarily as a portable games system. As an aside, the trackpads are great and for me the thing that sets it apart from any of the similar systems, but in desktop mode they aren't nearly as good or useful by comparison. Though it might be possible to change the way they work in desktop mode but the basic way it hooks input by default makes them pretty dumb trackpads.


It's definitely great to support Valve in their efforts, but you all might be interested in GPD's lineup of small computers, because many of them have built-in keyboards:

https://gpd.hk/product


I keep it docked (mostly) and it works great.

My ONLY beef is that the AMD drivers do not work 100% with Blender so you don't get the accelerated Cycles rendering. Otherwise, it is great.


That's interesting. I'm no expert on the Steam Deck, but I would have expected it to work. Do you have any more details about the problem?


Valve is using the 'wrong' AMD gpu driver.


That's not enough information for me to understand what's wrong, but I suppose maybe I should find a Steam Deck and try things out myself. If you want to follow up privately, my email is in my profile.


There are actually three drivers for Vulkan: amdgpu-pro, amdvlk, and radv.

In any case, ROCm works with the completely open source stack, and is open source itself. amdgpu, the kernel component, is the only thing needed for that, and modern upstream works fine. The integrated steam deck gpu simply can't work with it, though.

The issue that has recently been resolved was that only the amdgpu-pro userspace had working OpenCL 2.0+ support. Mesa 22.3 has better OpenCL now, but that's actually become less relevant with the new HIP/ROCm stuff.
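For anyone experimenting with which of those three Vulkan drivers an app actually uses, the Vulkan loader lets you force one by pointing it at a specific ICD manifest via an environment variable. A sketch; the manifest paths below are typical install locations and may differ on your distro:

```shell
# Pick a specific Vulkan driver by pointing the loader at its ICD manifest.
# The paths are common Linux install locations; adjust for your distro.
export VK_ICD_FILENAMES=/usr/share/vulkan/icd.d/radeon_icd.x86_64.json   # radv (Mesa)
# export VK_ICD_FILENAMES=/usr/share/vulkan/icd.d/amd_icd64.json         # amdvlk
# Then verify which driver is active, e.g.: vulkaninfo | grep driverName
```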


> The integrated steam deck gpu simply can't work with [the kernel amdgpu], though.

That's a shame. I was hoping it was something I could help with, but that's a bit beyond the scope of what I can fix myself.


not sure why you added that "[the kernel amdgpu]" bit -- pretty sure that isn't what gp was saying.

the open source kernel driver is the only one in use and sits under both the open source and closed source userland driver stacks.

i'm pretty sure what they're saying is that rocm doesn't support running on the apu in the steam deck at all -- i'm dealing with a similar issue in bringing up a product on a different amd apu that has a beefy enough gpu for my application


If so, that's more promising. The HIP runtime sits on the ROCm Runtime (ROCr), which sits on hsa-kmt (ROCt). I was under the impression that the driver handles pretty much all of the stuff that is hardware-specific.

It's an RDNA2 GPU and AFAIK, all the RDNA2 GPUs can be coerced into working with ROCm (even if they are not 'supported'). Maybe the Steam Deck is an exception. I don't know. I'll give it a spin over the holidays. Win, lose or draw, it sounds like I'll learn something interesting.


I borrowed a friend's Steam Deck and created a Debian userland to check if there were any issues using the hardware for ROCm. As far as I can tell, the only reason ROCm doesn't work on the Steam Deck is that the necessary software has not been packaged for SteamOS yet.

When I check rocminfo, the hardware is reported as gfx1033, which is an identical instruction set to gfx1030, so export HSA_OVERRIDE_GFX_VERSION=10.3.0 should work perfectly if you're using the AMD binaries. Or you can build from source for gfx1033, but that's probably more painful than necessary.

The one catch is that the Steam Deck is currently using Linux 5.13, which predates a lot of bug fixes. I was noticing bugs while using the upstream kernel when packaging ROCm for Debian until something like Linux 5.19. My test case for the Steam Deck was rocRAND, which passed most of its tests. I would expect the rest to pass with a newer kernel, though I haven't verified that.

tl;dr: ROCm is not packaged for easy installation on all the platforms you might want to use it on, but it appears that the hardware works and the software exists.
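For reference, the override described above is just an environment variable read by the ROCm runtime; a minimal sketch (assuming AMD's prebuilt ROCm packages are installed) would be:

```shell
# Tell the ROCm runtime to treat the Deck's gfx1033 GPU as gfx1030, which
# has an identical instruction set, so AMD's prebuilt binaries will load.
export HSA_OVERRIDE_GFX_VERSION=10.3.0
# On real hardware with ROCm installed, verify with: rocminfo | grep gfx
```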


AMD does have two different drivers (which I find maddening). I bought a monstrous GPU to use for hashcat, but the open source/built-in drivers don't support the openCL that hashcat (and crypto mining) need. The Pro driver doesn't get updated fast enough to work on Fedora, which is also a travesty for me. I really wish that weren't the case.

I can't speak for steam deck though, just my experience on a self build.


Have you heard about the just-released Mesa 22.3, which now has rusticl for modern OpenCL support? Previously Mesa used Clover for OpenCL, which was stuck somewhere just below OpenCL 1.2 (but only reports OpenCL 1.1 support).
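Worth noting that rusticl is opt-in per driver in Mesa 22.3, via the RUSTICL_ENABLE environment variable. A sketch, assuming Mesa was built with rusticl support:

```shell
# Expose Mesa's rusticl OpenCL implementation for the AMD (radeonsi) driver.
# rusticl devices are hidden by default in Mesa 22.3 unless opted into.
export RUSTICL_ENABLE=radeonsi
# With an OpenCL ICD loader installed, check the platform list with: clinfo
```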


There are two drivers for AMD on Linux. There's the open source driver, Mesa, and there's the proprietary AMD driver, called AMDGPU. I'm not certain which driver Valve is using on Steam Deck (I've never had to look), but if they're funding Mesa, I'm guessing they're using Mesa.

Unfortunately, some of AMD's GP-compute infrastructure is only supported on Linux through the proprietary driver. I believe these are ROCm and HIP. I'm not sure about OpenCL. I'm not 100% sure what Blender uses for AMD acceleration, but if it's ROCm, you'd need the AMDGPU driver instead of Mesa.

Could have some of these details wrong, but I believe that's the gist of it.

You can, of course, install other OS on the Deck.


this is sorta true, but not quite

amdgpu is the underlying open source kernel module -- it's in the mainline kernel.

mesa is the open source userland driver, and runs atop the amdgpu kernel driver

amdgpu pro is the closed source userland driver, which also runs atop the open amdgpu kernel driver.

what's really nice about this is that you can run a fully open source stack in your host os while running amdgpu pro/rocm inside a container.


I believe they're misrepresenting the issue. Blender's Flatpak is possibly broken, but I'd imagine the Steam Runtime version works flawlessly.


Proton is way better than it has any right to be and single-handedly made Linux a viable gaming platform. The odds of any particular game working on Linux are higher than on Mac now.


Proton is really remarkable, considering there have been several commercial gaming-focused products based on Wine going back many years, and none of them are nearly as good.

Apple has made a bunch of decisions that really hurt gaming on macOS, which is frustrating. Starting with the $99/year fee just for app notarization, which deters tiny indie games from being released on macOS. This has a persistent long-term effect as those same developers make bigger games.

And on the other end of the scale, they're still using Metal as their own modern graphics API, so everyone has to either use a third party compatibility layer (MoltenVK), or use their engine's least-mature, least-well-tested rendering backend. Either way performance is pretty much guaranteed to be worse than it could be.


Yeah one of the reasons I decided to take the plunge and install Linux on my macbook is the compatibility with 32 bit games (risk of rain in particular).


As I have a M1 MBP I am probably going to have to have a separate gaming laptop, or possibly Steam Deck in the future, for quite some time.


Or AsahiLinux might let you keep using your M1.


Work bought me an M1 laptop. I bought a Steam Deck. Paired with my PS5 and Switch, I haven't found a game I _want_ to play that I can't. For AAA titles, I can play them on PS5, indies on the SD, and Zelda on the Switch.


Well, it's standing on the shoulders of a giant. Wine has been going strong for 29 years now, counting from its initial release. What I really appreciate in Valve's contribution is the integration and UX work they have put into it. It's just a breeze to use Steam and Proton; I can hardly imagine it being more simple. Which, of course, was not the case with the original Wine, although Winetricks and other projects like Lutris made it a lot easier.


There is no competition. Linux now runs 95% of games, macOS probably not even half that.


Direct link to the interview where this was said: https://www.theverge.com/23499215/valve-steam-deck-interview...


Is Valve the only company in the world that is not universally hated by the HN and reddit crowd?


Another company is Fastmail. Usually loved on HN, not really talked about on Reddit (at least where I see it)


There are many. Mozilla isn’t universally hated. Apple is another.

I’d say Valve comes close to being universally loved here, though, and that is pretty unique.


Apple is a weird one from my perspective. I dislike Safari, but overall I'm pretty ambivalent about Apple in general. But most devs I've met IRL seem to despise them, while HN seems to mostly have a positive view.


Most devs don't develop on/for Apple platforms. HN has a huge amount of SV developers though and FAANG which lean Apple anyway. If you work at Google, you need to make sure your products work on Apple devices, so you probably use a Mac since the dev environment is locked to the platform.

If you work for a shop in Omaha Nebraska making software for surrounding businesses, you probably use Windows.


JetBrains maybe?


Every now and then I see some brain-dead hand-wringing about them being an agent of the Russian state. I saw a lot soon after the war started. The other time I saw a lot was right after that hack that involved TeamCity.

HN and Reddit aren't hive minds, and both hold a lot of diverse opinions. Nothing is universally loved on either. Today's society can no longer agree on fundamentals like whether the earth is round or whether democracy is good. So it isn't a huge surprise that not everyone has accepted The Infallible Glory that is the JetBrains product roster. (A little joking in that last bit... but just a little. All praise CLion.)


also probably Tailscale


Cloudflare?


I personally dislike cloudflare for not dropping KF, not sure if that's a majority opinion.


Yeah, even though I'd agree that KF is detestable, the way Cloudflare cracked under pressure was a public embarrassment. It's hard to give them the benefit of the doubt these days.


> Yeah, even though I'd agree that KF is detestable

I am very curious about this, tbh. Has anyone here gone to that site? I read that they just report publicly available info that people have volunteered online. While I understand that doxxing is terrible, and so is targeted harassment, are they that much worse than the Daily Mail or any other gossip magazine?


Not an expert; only checked a few threads shortly before its very end. From what I can tell...

- gossip magazines usually target "public figures", and many jurisdictions have relaxed privacy rules around those. KF targeted whomever they found... "interesting"

- I haven't really read gossip magazines, but I assume they have a higher bar. For example, did any of those print the Hunter Biden dick pics requested to be deleted in the "Twitter files"? Because you can be sure they would have been on KF, if it had a thread on him

- "publicly available info" is stretching it quite a bit. In the case of @elonjet, that is info required by law/court to be public. KF, on the other hand, is more like "stuff that was ever published somewhere" (as opposed to getting new stuff), so again your nudes are fair game for them, no matter any distribution rights


the difference between KF and a tabloid is that KF members go out of their way to interfere in their targets' lives


No, they actually don't. That is against the rules to even talk about. You're thinking about /b/ or something from back in the day, or the people attacking the site.


The larger difference is that KiwiFarms members don't get paid for what they do. That being said, you can be disgusted by something while still accepting it as a valid form of self-expression. Real-life crimes deserve prosecution to the fullest extent of the law, but people being mean online is a shitty justification for deciding to void an entire website. It's entirely Cloudflare's choice to make, but its value as a virtue signal is vastly outweighed by how many CF customers are wringing their hands now.


Anyone that brags about harrassing people gets banned immediately.


Funnily enough, it sounds like we have opposite opinions.

I'm glad they "cracked" and wish they'd initially dropped KF.


It’s the difference between awful and systemically-awful.


I dislike them for dropping the site a week after saying that they wouldn't drop websites that are simply detestable.

Personally I don't care, they are free to have whatever policy they want, but doing exactly what they did one week after writing up a blog post and explicitly stating that they most likely won't drop any websites like that anymore was a bit disappointing.


+1 for Cloudflare


Not universal, but I'm very much anti-Valve, as they are among the few who continue to do business in Russia.


What would be the point in cutting regular Russian citizens off from video games? It may actually be a good way to communicate with them.


Video games are the modern 'circus' in 'bread and circuses': they provide an opiate for the masses to escape from confronting their current situation, and give the government an escape valve for young male angst. As for communication, have you seen Russians on the Steam forums? It's all Zs, Russian flags, and ethnic slurs against Ukrainians.


I'm sure you can spot the nationalists, of which there are many. But how do you differentiate them from the normal folk, who just use a non-descriptive nickname and just play the game? Zooming out a bit, in the group of "gamers" as in people who participate in gaming related discussions, you can find many toxic comments, sentiments and people, of which the ethnic slurs, Zs and everything doesn't even stand out.


I can see that, but it's a complex situation and I have counterpoints.

- Young male angst has no shortage of outlets. Sports, both extreme and normal, tons of existing media, pirated media, blowing stuff up, etc. Taking away Steam accounts likely isn't going to turn them into revolutionaries.

- In response to the existing comments from Russians: that could be selection bias. Many of them could be commenting normally or not at all. After all, most of what you read on the internet is written by insane people.

- Part of the problem in Russia is extreme media bias/propaganda. Further isolating them from foreign contact and influences can only make that worse and reinforce the narrative that everyone else is out to get them.


So, the users of Steam, and what they post on their public pages, are selection bias when judging the usefulness of communicating with... the users of Steam?


Yes, selection bias. Loud people are easy to spot. Lurkers are hard to spot. A loud minority, therefore, can create the illusion of a larger mass of similar folk. Take, for example, the gamergate situation. There was just SO much toxic communication, it was a complete shitstorm, where many were attacked online in a horrible manner. So, gamers sure are a toxic bunch, right? Well, it turns out there are 1.5 billion gamers on PC alone. I'm sure 95 percent of them never even heard of such a thing as gamergate.

https://gamerant.com/3-billion-gamers-report/


So what positive discussion comes from the people that aren't involved in discussions on Steam? None; it's still just a circus for lurkers. If the reason to keep it open to Russia is communication, that is ONLY happening for those that are visible, not for the lurkers.


Communication is not just the forums; I literally meant everything offered by Steam as part of the communication: the selection of games, the content and presentation of the games, the fact that people play together and communicate in game, either explicitly by voice or text, or indirectly by gameplay. It's all a communication not just of words and concepts, but of ways of living, ways of relating to everyday life and the world. And that's very useful in not leaving people alone in their bubble: it broadens their perspective, makes them more open to new ideas, more tolerant of strange things. How do we know that this works? It's everything that an oppressive regime (and abusive people) works against. Their goal is to isolate, so that they can control the narrative, so that they can solidify their power. So, to weaken their hold over people, you need ways in. Culture, like movies, music, novels, internet content. And like Steam.


Also, don't forget it's not the best idea to be publicly against Russia when you are living in it.


Keeping Russian teens in a sedentary lifestyle with video games is good for Ukraine. It weakens the Russian military. If anything, they should be giving free games to young Russians. Give them lots of pornography, liquor and drugs as well. These things weaken nations.


>they should be giving free games to young Russians. Give them lots of pornography, liquor and drugs as well. These things weaken nations.

Wow man cool it with the antisemitic remarks.


The best pirated-game scenes are Russian; Valve exiting the market would mean very little. It's also a huge benefit to the rest of the world that Russian/ex-Soviet culture has both very little respect for Western intellectual property laws and the technological ability to make software and digital media easily available.


If Russia is where you draw the line boy do I have news for you! If your standards were universal, you wouldn't be doing business with any company at all.


As a hippie that lives in the Rocky mountains, why do you assume I do business with large corporations? And I definitely feel let down by Steam's take to continue to provide entertainment for the masses so that they don't have to contemplate what their country is doing. They are the circus in bread and circuses. But yes you are right, I should just be paralyzed from basing my opinion of any company because 'other companies, other countries'. What a productive take that would be.


I just need to point out these 'it's pointless to do anything' people are wrong. My small mountain town's 1Gb fiber ISP is local. Our power company is a co-op. The gas station I use is a co-op. My limited meat consumption comes from an organic Mennonite family farm. Summer months my produce comes from a local CSA and I'm lucky to have a great local flower CSA as well. You can actually live in the modern world in comfort AND choose how you live, even in the USA.


Local co-ops are the opiates of the hippie masses. It keeps them plump and happy in their little faux communes and away from the fight against corporations, state and federal government. As someone who lives 5000 “miles” away I feel really strongly about judging these hillbillies/rockabillies that I’ve never met and I for some reason feel that they should be inconvenienced so that they can take up the fight against their government, which after all is the biggest imperial power on the planet.


Where do they get their gas?


Especially with entertainment, I haven't understood the anti-Putin sentiment. Is it really in the West's interest to stop exporting soft power there? Sanctions make sense to me, but this particular industry we should make an exception for, I'd say. Authoritarian regimes try to isolate their populace anyways, it'd be a shame to co-opt that.


Yes, let's ensure we continue to give young Russian males an outlet to escape from thinking about the current realities in their country. Steam is providing the modern circus in 'bread and circuses'. If anything, this is the exact soft power we should leverage to show those young, angst-filled males that their current reality is isolating them. Or we could just give them an opium of the masses to keep them indifferent to reality in their rooms because 'soft power'.


You can be sure that if we don't give them outlets, China will. Do you think nobody would fill the vacuum? If they banned every media import, that'd just be an opportunity for the regime to further solidify its own goals. For example, look at what happened when McDonald's and Coca-Cola left the country - they have the same shit, but now without the influence.

Whatever there is, there will be abuses of it, so yeah, many would just use the "opium" part of Steam. But there's another aspect of it that would still get imported, and that's any of the narratives that differ from the propaganda they are subjected to. With Steam specifically, they'd be exposed to titles like This War of Mine, and international communities where they'd experience that nobody gives a fuck about them being Russian, or even find it cool, despite the prevalent anti-Western propaganda in their own media.

And that all, I think, is worth more than sanctioning it.


>Steam is providing the modern circus in 'bread and circuses'

Also, you know what really provides bread and circuses? Populist bullshit such as "Putin banning LGBT propaganda". Nothing better to take the minds off of pressing issues like further harming a marginalized group.

https://news.yahoo.com/vladimir-putin-bans-lgbt-propaganda-1...


As a whole, Russian nationalists were popping champagne corks when Western media and other influence operations left Russia. It was basically everything they wanted. Given to them for free. By their enemies. Everything that cuts a dependency also cuts an influence vector and increases sovereignty.

The world is moving to regional blocks -- a Western block of North America and Europe, a Eurasian block of Russia/India/Iran/Middle East/China, and the loyalties of Africa/Latin America are where the wars for influence will be fought, with Africa tilting toward Eurasia and Latin America tilting toward the West. Other battlegrounds would be marginal areas like Southeast Asia/Pacific Islands and, in Europe, the Balkans. In that context, it makes no sense for the Eurasian block to allow a business to operate domestically from an enemy block. Why on earth would you let them influence your population? For the same reason, Eurasian influence operations are being banned in the West, from Confucius Institutes to Sputnik News.

The main entrypoints to a block will be those nations that have sufficiently rigid control over their domestic populations to feel free with enemy dependencies -- that would be the US and China, both with impressive soft power and sophisticated information control regimes. That is why US made films that are shown in China often have to have different scenes in them - sometimes shot with different endings -- and there are strict limits. China has their own system of deboosting content and their own troll farms. Same for any kind of internet presence and also video games and other media. Likewise, the US has a sophisticated system of banning and other soft "deboosting" of views that run counter to official regime narratives. There is a whole industry to identify counter-narrative views and brand them as "misinformation" that must be rooted out. Russia has a much less sophisticated information space defense system for this, and much less soft power, so for them outright bans and blocks are the only viable option, and this was given to them by the West on a platter. Amazing self-own.


Punishing people living under totalitarian rule might not be a good idea.


Reminds me of that joke where the world's entire computing infrastructure is perched upon a tool maintained by a guy who lives in the middle of Nebraska.

Seems the field of people that could get this stuff working (some really awesome hacking) is so small... or rather, breakthroughs come from a few individuals.


And as a result they have an amazing cross platform product


i'm a big fan of companies paying and/or hiring OSS devs to continue working on OSS code.

(100% biased :D)


There's nothing wrong with liking things. Go open source!


This is one of the main reasons why I buy all my games on Steam.


I continue to be terrified that some day in the future, in a moment of weakness, Valve will allow itself to be swallowed by Microsoft…


I find it pretty unlikely, both because of Gabe, but also Valve prints money and is private. They don't have anything to gain by being bought by microsoft.


Understandable concern, as documented here: https://www.youtube.com/watch?v=H27rfr59RiE


There's 0 chance of this happening as long as Gabe's around.


Yeah, problem with that is that Gabe is human and won't live forever. I worry about what will happen to one of the greatest forces in keeping PC gaming not only alive, but relatively open and free from bullshit, when Gabe isn't around.


Strong chance of moving to 100% piracy forever if Gabe, or his replacement, bends the knee.


One of the reasons I love Valve.


Somebody needs to step up to save Clang now that Google has more or less abandoned it.


Clang will be fine. It's based on LLVM, which will be actively maintained for as long as Rust exists, and Apple relies on Clang's functionality for pretty much all of their modern tooling. There's enough stakeholders to keep things moving smoothly, at least as far as I can tell.


I don’t have any worries on LLVM, since it has become probably the biggest compiler backend with so many languages and tools depending on it. The problem lies within the Clang frontend, where people are worried that it might not catch up to the latest C++ standards since Apple and Google are now looking elsewhere to replace C++ (Swift and Carbon, respectively).

Clang not supporting new C++ features actually has some quite far-reaching consequences, since editors nowadays rely heavily on clangd for autocompletion and static analysis, and they too will be stuck in the past.


Google uses Clang for much of its production code and employs people to work on it. I don’t think saying that they’ve abandoned it is accurate.


It's been a (very) long time since I worked in Android OS land but at the time I could see effects of them actively moving projects from GCC to Clang. I always picked up the outsider impression that it was their preferred C++ toolchain. First I've heard of them abandoning it


There is a big push for their proprietary language "Carbon". Clang is way behind on implementing most C++20 features.


That's an early stage, pre-production, experimental language. They are still a couple years out from a 1.0. They aren't moving away from clang for that anytime soon and they sure haven't abandoned clang for it.

Also it isn't even proprietary[1]

1. https://github.com/carbon-language/carbon-lang/blob/trunk/LI...


Well, but I’ve heard that many of the Clang people at Google have shifted gears to work on Carbon, so that might be quite an issue for future Clang development (even if Carbon is in its early stages).

I think the likely outcome is that the Google engineers will do bugfixes for features up to C++17 but will minimize work on catching up with the latest standards (since it really takes too much engineering effort, and Google seems conservative about using newer C++ features anyway and has opinions that differ from the standards).


> Clang is way behind on implementing most C++20 features.

https://en.cppreference.com/w/cpp/compiler_support#cpp20 begs to differ. The heck, even C++23 core language support is almost fully done.


Well, let this be a lesson to anyone who tries competing with a GPL product. Permissive licenses are cool up until the moment a FAANG company embraces your project, extends it and extinguishes it.


I hope you are right.


Apple and FreeBSD use clang as their main C compiler, no?


I assume you're referring to Google pushing the Carbon language? That uses LLVM as the backend, so they're basically adding more resources.


GTA5 still crashes on an i7 with Iris Xe graphics, latest Ubuntu, and latest Proton or Experimental. Hope they manage to fix it.


That's not news. We have known that for years.


News flash. This is not a news site.


Weird, I could have sworn I saw a news flash on it somewhere.


Hacker News…


You should check out The Economist, you’d be surprised it’s not all economics! Also, those gosh darn New York Times newspapers don’t solely focus on NYC. Wtf!

Don’t judge a book by the literal interpretation of its title?


Strange, I could have sworn that 99% of the things posted here are news and commentary on news.


I don't know... i frequently see dang linking to old posts of the "new" repost.


Lol most posts here are certainly not news.


HN usually suppresses real "news" because it is often divisive. They are cool with informative stuff like tech blog posts, but not "News" as in recent events.


They would do better by sponsoring native GNU/Linux games, or even ports from other POSIX-like platforms like PlayStation/Android, instead of playing OS/2 with Windows games.


Nah. Win32 is the only stable userspace ABI on Linux, so I'm glad they're supporting it. There's no point in supporting native ports of games that simply won't work anymore 4 years from now because everything else changed.


I try to purchase games on GOG where I can. You can purchase games with a Linux build that fail to run on a modern system. Meanwhile the Windows version will continue to be runnable in a VM/Wine. Frustrating when there is a Linux release available that I cannot run without some kernel kung-fu.


In other words, there is no money in GNU/Linux games.


Exactly. I played Life is Strange 2 recently. It has a native Linux version. On startup it complained that I should set my CPU governor to performance. Then it said it only supports AMD and Nvidia cards. I pressed continue and the system hard-locked. This machine is about a year old today, and this is the only crash I have ever had. After that I configured Steam to use the Windows version instead, and I could play through the entire game flawlessly.

So, at least from personal experience: yes, it makes absolutely no sense to port to Linux if the result is still worse than Proton.

Regularly test under Proton during development, and you get a Linux version of your game for free. From my experience, Epic has already taken good care to make sure Unreal Engine works well under Proton, so if you just pick that and don't add a ton of additional stuff that uses esoteric WinAPI calls, you're already set.
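For reference, the governor the game complained about can be inspected and changed from a terminal. A minimal sketch, assuming a cpufreq-capable kernel and the `cpupower` utility (shipped in most distros' linux-tools package); the exact sysfs path and available governors vary by hardware and driver:

```shell
# Show the current scaling governor for CPU 0 (e.g. "powersave" or "ondemand"):
cat /sys/devices/system/cpu/cpu0/cpufreq/scaling_governor

# Switch all CPUs to the "performance" governor (requires root):
sudo cpupower frequency-set -g performance
```

Note the change does not persist across reboots unless you wire it into a service or your distro's power-management config.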


Source ports to Linux are actually pretty rare. Most big games with a 'native' Linux version are non-native in exactly the same way as Proton ports— they run using a custom compatibility layer based on WINE or something similar. Often the port is of such high quality that you wouldn't be able to tell (Feral Interactive has done a ton of great ports this way), but this is still how most are done, especially for AAA games.

(In that sense, developers (or Linux porting studios) switching from their in-house compatibility layers to Proton isn't really a change in terms of those ports being 'truly native' or whatever.)

I wonder if what we're starting to see, rather than the inferiority of source ports, is actually that some of these alternative compatibility layers are just no longer keeping up with Proton, since Proton/WINE has such wide use/testing and rapid development these days.


Hence "OS/2 runs Windows better", all over again.

We all know how that ended.


That was in an era of rapid change for Windows APIs, these days games target Windows from ~5 years ago.

Another comparison that I think is more apt: "IBM PC Compatibles run IBM PC software better"


I don't see too many parallels there tbh, so genuinely no idea how this will play out in the long run. But it's pretty exciting to watch.


Why do you care? You really need to get over the "Linus said something bad about my mother 20 years ago" or whatever your issue is. It's just weird at this point.


You are the ones that need to get over the fact that these are Windows games, not Linux.


These games are targeting an ABI that happens to have come from MS.

My fucking linux machine runs on intel hardware that is targeting an arch made by AMD (x86_64).

There is no problem with sharing stable interfaces. The idea that we should not treat these as linux games because they consume an interface developed by MS, that is implemented just fine by proton on linux is... very very backwards.

Who gives a flying fuck who made the interface, as it can be implemented across platforms, I'm a-ok with the result.


Nah, backwards is jumping for joy at only having games via "emulation" of other platforms, regardless of how it is implemented.


This is almost intentionally obtuse at this point - and you're getting downvoted for good reason.

You don't have a compelling argument here - step back, pause, and consider whether you're having an emotional reaction.

Your argument is basically "Developers should do things I want, for my benefit, with no regards to the success or consequences of those actions".

And that's a pretty fucked up argument.

A much better argument is: Proton makes linux a compelling platform for gaming. There is a decent chunk of market segment that keeps Windows around only for gaming. Allowing those users to move entirely to linux increases the appeal of developing native gnu/linux games, because there is a much larger native market.

This is how interfaces work - they allow you to develop new products that are compatible with the existing userbase. It's why they're incredibly important as stepping stones.

---

Side note - Proton came out of codeweavers, codeweavers develop a custom version of Wine. Do you happen to know what Wine fucking stands for? hint:

Wine Is Not an Emulator.


What does 'getting over' even mean here? These games are coded to APIs provided by Windows; but there's nothing stopping another platform from implementing these same APIs and happily running these games on not-Windows.


They are not Linux games...


They run on Linux, which is close enough for me.


What is there to get over? They run better on Linux most of the time, plus I don't need to groan when Microsoft asks me to enable the Windows™ Gaming Hub™.


Great, I guess we can forget about SDL and friends, and start adding a note "Want this on Linux? Ask Valve".


They wrap to SDL, so it's still being used. Proton is mostly an assemblage of FOSS projects, direct interaction with Valve isn't needed to use those any more than direct interaction with, say, a major contributor to Mesa like Intel when using that.


Why? Native SDL apps work fine on Linux too.


Nobody. Cares.


Indeed, nobody cares about 2% market share.


They actually did in the past, but they've realised that many game devs can't be trusted to maintain Linux builds as a second platform. They're often left outdated, require dependencies they didn't ship with or simply have worse performance than the Windows builds running on Proton.

These days they check both builds where available and suggest to the devs which one to use by default; often the Windows builds running over Proton are simply better than mediocre Linux support.


The same devs that most likely are also shipping Playstation or Android versions of their game....


What does this have to do with anything?

Shipping something on Android does not guarantee functional builds for regular Linux, and PlayStation uses a heavily customized FreeBSD fork. Neither of which are helpful for much of anything.


They are closer to Linux APIs than Windows ever will be.

Android even more so, assuming those that rely only on AGK/NDK.


But they don't rely only on the NDK, because there are quite a few basic APIs that are Java-only.


Orders of magnitude bigger markets.


And actually stable ABIs with good documentation.


That's not really true, native Linux games are a much harder sell for publishers because the proportion of users is just too low.


Hence why Valve should do the needful to change that, instead of "emulating" Windows.


They tried earlier on before they realized this approach made way, way more sense. Even the native linux versions of games that have them often run worse than the windows version wrapped in Proton. Developers just aren't going to commit to making native linux versions for the most part and that isn't going to change. This is reality.

The magical thing is that they've done so much work with Proton that a device like the Steam Deck is more than a curiosity. It is a viable product. It is viable to play games on Linux in a way it has never been before. Even better, they aren't keeping it to themselves.

A purist ideology about linux isn't going to change reality. But Proton is. The Deck is a wildly popular device and the knock-on effect is going to be more systems running linux for regular people. The games may not be native but if they're indistinguishable from native games, then who cares? Meanwhile adoption of linux as a viable platform still goes up. Maybe on a long enough timeline things swing around, though I still very much doubt that.

The Year of Linux on the Desktop is still a fantasy, but reading the steam hardware survey, it is more true than it has ever been before, thanks to the Deck and Proton.


Yeah, a whopping change from 1% to 2%; see you again in another 20 years.


Think embrace extend and extinguish...in reverse!

1. Make wine really really good.

2. Start adding Linux-only features.

hehehe


This would only have a chance of working if they dramatically improved performance over native I think.


Those Linux only features don't matter because games are targeting Win32.


The DXVK project out of all this has been a godsend for older DX11 and DX9 games on modern hardware; it's even being used to increase FPS in older titles on Windows.


Isn't Intel even shipping DXVK on Windows for their shiny new GPUs, as it often works better than their DX9 drivers?


Yep. Their driver switches DX9 implementations based on which game is running, according to which one they've observed has better performance.
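For anyone curious how DXVK is used on Windows by hand: a release ships plain `d3d*.dll` files that you drop next to a game's executable, where Windows' DLL search order picks them up before the system DirectX copies. A sketch, with the version number and game path as example placeholders:

```shell
# Unpack a DXVK release (version shown is an example):
tar -xzf dxvk-2.1.tar.gz

# For a 64-bit DX9 title, place the translation DLL beside the game's .exe
# so it shadows the system d3d9.dll (use the x32/ directory for 32-bit games):
cp dxvk-2.1/x64/d3d9.dll "/path/to/Game/"
```

Removing the copied DLL reverts the game to the native DirectX runtime.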


The promise is that they start targeting Proton, or devices like the Steam Deck in particular.


They already do that by targeting Windows, and letting Valve do the work, no need for more development costs.


That's exactly what proton is for: bootstrap a linux gaming ecosystem without having to port thousands of games


That only works if studios ever bother to write native Linux games; otherwise it's like being happy that MAME supports Linux.


Proton is a platform.

If Valve gets gaming studios to target Proton and guarantee compatibility, it's a win.

If that's equivalent to gaming studios explicitly targeting MAME and QAing it, I'll take it.


They already target Proton by writing Windows games, no need to bother with Steam.

Anyway it hardly matches the Switch's sales, a platform which studios actually do bother to actively support.


It looks like this is getting a lot of hate, but I think there's clearly some truth to it. As an initial effort to bring a huge library of games to Linux, Proton is brilliant and a huge achievement. But it does feel less secure, in the long run, than having developers writing games for Linux as a first-class platform from the start.

Idk what the transition plan could look like, but eventually we should hope to see the Steam Deck treated like any other console with highly optimized, native ports, and then hopefully some of that can be reused for 'more native' general Linux ports.


It isn't less secure; win32 programs have just as much access as native binaries do. Valve also runs Proton in a container.

Not to say the current implementation is secure though, only that the use of Wine/Proton makes no difference.



