From a performance and technical perspective this is incredible. Well done!
It will never happen, but my dream is for the Asahi devs, Valve, and Apple to all get together to build out a cross-platform Proton to emulate and play games built for Windows on both x86 and ARM hardware running Linux.
A Steam Deck with the performance and power efficiency of an M-series ARM chip and the entire library of games that run on Proton is just...dreamy.
I hope they don’t ignore macOS. They could port Proton to run on Metal + Rosetta, and then Steam would support running Windows titles on both macOS and Linux.
I recognize it’s a hell of an ask, but I think Alyssa could pull it off.
I'm a huge macOS fan but even I have to admit that gaming deserves to have a home on an open-source OS controlled by no one (which would be Linux in this case) and that Steam devoting effort to bring first-class gaming from Windows to macOS is like working to escape from the frying pan into the fire.
I see cats writing software _with hardly any documentation_ like this and my brain melts. Power level strictly monotonic, possibly exponential, and very very large. Bravissima!
I hope that she was working on Asahi full time at Valve, because if she did all that while also holding down a day job then I might as well be soylent green.
A lot of stuff like this shows up; they also have forks of Waydroid and Box64. I think some of these are official projects, and a lot are just devs with a lot of agency who share the dream.
Steam Deck was made possible by their ongoing efforts to enable the play of most of their games catalog on any hardware platform that is computationally capable of running them, regardless of OS or architecture.
The end game for Valve isn't Steam Deck 2 or 3 (which is statistically impossible for Valve to produce), but for Steam to be on everything.
Steam Deck was made possible by the plethora of the Windows games developer market and Proton.
Most of the studios that own those games, and target POSIX like OSes on mobile phones and game consoles, are yet to bother with GNU/Linux versions for SteamOS.
Wine and DXVK are already running on Android and they play Windows games with the rendering and computational complexity of Fallout 4 at playable framerates on many of the latest smartphone SoCs. It's still WIP, but it's already gone beyond proof of concept, people are using them. Valve don't need the developers to be on-board in order to run their games on anything else, that's why Proton exists.
What Valve want is to dissolve the boundary between platform/architecture and store. By my eye, that's the driving force of their efforts, more so than selling hardware or being the open source good guys. Not to undervalue their work in helping make Linux a first class citizen for gaming, but the core of their business model is getting people to engage with their store, full stop, and being able to sell their games on Android (and elsewhere) would massively extend their reach.
This may go both ways too; there have also been indications that Valve have been tinkering with Waydroid, meaning Steam could also become a store for Android-native games.
That's a small part of it, I think. They've almost certainly spent a lot more on pouring time and effort into Linux than they ever would have saved on license fees. It seems like Valve doesn't want to be beholden to Microsoft in any way. They support Windows because that's where the users and the games are, but they don't want Microsoft to be able to rug-pull them either.
As far as I am aware, neither Valve nor the independent Wine/Proton developers are bound by any licensing agreements with Microsoft. They are clean-room implementing the same technologies, but they are not beholden to Microsoft in any legal way. Of course, drastic changes in laws or policy regimes could alter this dynamic, but those are out-of-context risks.
In order for Microsoft to rug-pull the technology (which is quite different from rug-pulling the business model), they'd have to break compatibility on Windows itself. Video games remain a major reason for home users to run Windows. Making ABI-breaking changes to Win32 or DirectX is just not very likely to happen. And if it did happen, it would be a boon to Valve and not a harm.
The biggest risk (and this would be a classic Microsoft move, to be fair) I can foresee is aggressive API changes that make it hard for Valve/Wine/Proton to keep up but also make it hard for game developers not to. I'm not exactly sure what this would look like, and a lot of the core technologies are pretty stable by now, but it's a possibility. It's not, however, going to harm anything that already exists.
They can restrict .NET and the C++ stdlibs so that you can only run them under Windows (through a license change or by introducing a code check), but if they didn't do that in the Ballmer days, I don't think they will now.
I don't remember how officially it was stated, but the original push by Valve for Steam on Linux with Proton was to remove their dependency on Microsoft: a hedge against possible future ecosystem-impacting decisions in Redmond.
Making the Steam Deck use Windows wouldn't have impacted the price much; Microsoft offers OEMs very friendly terms for bundling Windows. It could even have run a modified build that behaves like the current Steam Deck.
Instead, the Steam Deck exists to drive testing on Proton, or outright porting to Linux, in a way that mere availability of Steam on Linux and the earlier Steam Machines never did.
Close, but the root is more nuanced. Once upon a time, Microsoft was talking about regulating "apps" on Windows the way Apple does for the iPhone. Valve saw the writing on the wall: a potential ban on violent or otherwise adult-themed games. So Valve started the Steam Box project. Get the games running on Linux/Wine and they could tell Microsoft to push off. Years later, we have the Steam Deck as a revolutionary product and Linux is the go-to OS for portable gaming.
If future versions of Proton break compatibility with older Windows apps, you can use different old versions of Proton for individual games. Steam makes this very easy on Linux, but rarely is it necessary.
I don't foresee many Linux distros breaking compatibility with Wine, which is good, as some devs argue Win32 is the only stable ABI on Linux. [1]
I don't foresee legal issues either, as Wine has been around for 31 years, and its corporate sponsors have included Google in the past. I've seen no indication that the project is on shaky legal grounds.
Microsoft could always create a new API that Wine doesn't yet support, but good luck getting developers to use it -- they've tried many times, but not much has stuck, and most devs just stick with Win32. [2]
Linux/Unix has been used as a base for pretty much every new consumer OS for decades. Not to mention Microsoft certainly cuts deals with manufacturers when it comes to Windows on portable devices (I think at one point they offered free licenses for devices with screens under 8 inches).
The Steam Deck is 100% usable without leaving 'game mode' even a single time, something that is genuinely impossible using Windows as a base. That's the important part.
The amount of money Valve has pumped into Linux would have far exceeded the money they saved through licensing Windows. Like probably by an order of magnitude or more. For someone as smart as you seem to be, your points don't make a whole lot of sense.
I am a strong Linux supporter and I too do not like what Proton is doing to games. A few years ago there were many significant games coming out with native Linux support (Minecraft, KSP, Factorio). Then Proton dropped. Now, rather than support Linux natively, even the most pro-Linux developers just expect that their Windows version will run under Proton. And those who run games under Proton are essentially cut off from customer support. I've had a few games where a patch suddenly stopped them working under Proton. I have no recourse in such situations. That is not a good trend.
Genuinely what is the practical difference in this for 99% of users? They just want to play x game. Proton performance is pretty great, what else would be a problem for those people?
Also, when it comes to breaking Proton support (which does happen), Valve + GloriousEggroll give you access to plenty of older and special versions. Surely that's better than rolling back entire software?
My game doesn't work ->
I go to ProtonDB ->
Users say use Proton version X or ProtonGE version Y ->
I switch the compatibility layer used in Steam
Linux is not a stable runtime in the first place. Unless you are isolating, redistributing and sandboxing most of the libraries used to run your game, it's almost guaranteed to break when the dependencies are updated. Windows apps don't have that problem, natively or when run through emulation.
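One common way to isolate a game from distro library churn, sketched here as a hypothetical Python launcher (the paths and binary name are illustrative, not from any real game), is to ship the libraries alongside the binary and point the dynamic loader at them before exec'ing the game:

```python
import os
import sys

# Hypothetical launcher: prefer the libraries bundled with the game
# over whatever versions the distro currently ships, by prepending
# the bundled lib/ directory to LD_LIBRARY_PATH.
def build_env(game_dir: str) -> dict:
    env = dict(os.environ)
    bundled = os.path.join(game_dir, "lib")
    existing = env.get("LD_LIBRARY_PATH", "")
    env["LD_LIBRARY_PATH"] = bundled + (":" + existing if existing else "")
    return env

if __name__ == "__main__":
    game_dir = os.path.dirname(os.path.abspath(__file__))
    env = build_env(game_dir)
    # Replace this process with the game binary (illustrative path).
    os.execve(os.path.join(game_dir, "bin", "game"), sys.argv, env)
```

This is essentially what Steam's own runtime containers do in a much more thorough way: the game sees a pinned set of libraries regardless of what the host distro updates underneath it.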
Just ignore the guy, it's essentially the reverse of "I run Arch BTW". Not just about Valve or Proton, but about pretty much every FOSS technology that's celebrated here.
It doesn't have to do anything for GNU/Linux games; that's been an option for years and it's a ghost town, à la Metal-native games. Valve (and the community) are doing the right thing by ignoring the Apple strategy of enforcing distribution terms they will abandon within the decade. Developers that want to program for Linux still can. It's just as stupid an idea as it was when the first Steam Machine rolled out.
By supporting Proton, they are guaranteeing that modern and retro Windows games will be playable on Linux far into the future. Trying to get the next Call of Duty to support Linux natively is, quite literally, a waste of everyone's time that could possibly be involved in the process. I cannot see a single salient reason why Linux users would want developers to release a proprietary, undersupported and easily broken native build when translation can be updated and modified to support practically any runtime.
Yea. You either have to pump a ton of money into it like Apple tries to do to get devs to target your OS, or you can take matters into your own hands and do the unthinkable with Wine and Proton. It's unironically a silver bullet solution. Otherwise we'd all be waiting for years to make 1/1000th the progress.
We don't have to imagine what Linux gaming would be like without Proton.
- CD Projekt Red: released Witcher 2 on Linux, didn't for Witcher 3.
- id Software: released Doom 3 on Linux, didn't for Doom (2016) or Doom Eternal.
- Epic Games: released Unreal Tournament 2004 on Linux, but didn't for Unreal Tournament 3 or Fortnite. (A Linux port was being worked on for UT3, but it ended up getting cancelled.)
- Larian Studios: released a Linux version of Divinity: Original Sin, didn't for Divinity: Original Sin 2 or Baldur's Gate 3.
Many studios over the years have made native Linux versions, and many studios stopped because the cost of porting exceeded the revenue it generated. Proton didn't exist when Unreal Tournament 3, Witcher 3, Doom (2016), or Divinity: Original Sin 2 released, so Proton wasn't the reason those studios stopped developing Linux titles -- they stopped because it made no financial sense to continue to make them.
Now, with Proton, 79% of the top 1000 games on Steam are gold or platinum rated on ProtonDB. If you're fine with minor issues, 88% are silver rated or better. For the Steam Deck in particular, there are 5,500 verified games, and 16,526 verified or playable games. So I'd argue Proton is doing quite a lot for people gaming on GNU/Linux machines, since they now have access to a solid majority of the top 1000 games on Steam, both on a Linux desktop and on a handheld.
The practical implication is that one can click one button and buy, install, and play thousands of games on Linux. Only MS stockholders are liable to care about the implications for Windows.
OS/2's Windows compatibility was born in the midst of Windows' rapid ascension, amid rapid progress and change in the home PC industry.
We aren't in the 90s anymore. Win32 has stalled, Microsoft has a regulatory gun to their head and Wine's compatibility (at least in the domain of games) is extremely good, good enough to allow for a commercial product to be a success while being entirely reliant on it. In what way is any of this comparable to what happened with OS/2 outside of "it runs Windows applications"?
You are right that many developers don't care and haven't bothered with Linux, but one reason for optimism is that this seems to be changing. Just looking through the list of native Linux games today compared to a year or two ago, there are a lot more options. The list of Linux games on GOG is likewise in a much better position than it was before. I think there is much reason for optimism!
It would have to be, CS2 is the fourth major installment in the Counter-Strike franchise.
Then again, all kinds of companies take liberties with naming including numbers. Look at Windows 7 (12th major release), Windows 10 (successor to Windows 8), the game Battlefield 2 (third in the series), Battlefield 3 (three games after BF2), Battlefield 1 (after the release of BF4), etc.
It is a joke about Valve only rarely making the third installment in any of their series (e.g. Half-Life 3 not existing).
Edit: specifically including "3" in the title - the actual number of games tends to be higher but with different names.
I believe Valve dropped official Windows 7 support in Steam because Chromium did and they weren't going to fork it.
I empathize if you don't like any version of Windows newer than 7 or XP, but it's time to let the dream of running them forever go. It's not weird when software doesn't support the 2009 version of an operating system anymore in 2024. If they never dropped support, it would be difficult to take advantage of improvements from the last 10 years, because we'd forever be stuck carrying that baggage.
Of course when it's feasible everybody loves software that really never does drop support, like 7-zip, which I think happily still works on Win9x without KernelEx... but I'd rather 7-zip stopped having serious security issues than continued to work on old Windows versions.
Windows 7 was great and I'd love to go back. If I really had my druthers, Windows 2000 was peak and XP was just a vulgarized version of 2000.
However, it is Microsoft more than anyone else that has decided to stop supporting those operating systems. Windows XP does not have support for any modern version of TLS (only TLS 1.0). There's no good way to support a browser-based app like Steam on a platform that cannot natively provide a secure connection to a modern web server.
There is not such a hard reason to drop Windows 7 support (again, except that Microsoft no longer supports it) but there are security-relevant APIs that are only available starting in Windows 10 which means special patches would have to be maintained just for Steam on Windows 7 to continue working securely.
Apple is going to do the same thing they did with BSD, WebKit, etc. They will wait until Proton is mature enough, fork it, then release it as their own. Why put in the effort this early on?
Which is exactly what I described. Looks like they took crossover/wine and added some custom patches. What are the chances they upstream anything? Probably 0.
Not sure what agreement they have there, but at the end of the day it’s Wine which has decades of open source development behind it at this point. Plus a bunch of other libraries (gstreamer being a notable inclusion) that are all FOSS. This still fits the pattern of Apple profiting off of OSS projects while contributing back as little as they can get away with.
A non-trivial number of contributions to Wine come from CodeWeavers (30%+ of all commits), which in turn is funded by its work on Crossover, Proton, and commercial agreements with other businesses. Wine would not be the project it is today without the contributions of CodeWeavers. Contributing cash to the companies contributing code is a perfectly adequate form of giving back.
CodeWeavers released an announcement when the Game Porting Toolkit was announced.
The announcement says practically nothing except “we did not work with Apple on this project.” And then a bunch of comments about the license Apple gave their version of the source code.
You sure there was any kind of commercial agreement? Doesn’t look like it.
I think it's still hilarious and crazy that Safari/Chrome/WebKit/Blink exploded out of the cute little KHTML browser called Konqueror in KDE from back in the day.
And the root of the whole browser wars thing was Microsoft making an absolute dog of a browser for Mac OS X when it came out and then refusing to support it. lmao.
Haaa Konqueror. It was THE shit back in the day. I loved this software. It really was at the core of the KDE experience. Too bad it disappeared, I miss it. (well it’s not technically dead but it’s not moving either)
I came here to say this! Konqueror may have served a small community but it was excellent.
It was the file manager as well as the browser and it was incredibly capable. By far the most advanced GUI file manager of its time. And a pretty fast and pleasant browser, although the compatibility was hit and miss. (Those were Flash and IE-dominated days as I recall them.)
A lot of what I loved about Konqueror is captured in Dolphin. I don't think I need my web browser to be a file manager... maybe that concept was just a 90s fever dream. But I miss Rekonq. Maybe I should revisit Konqueror.
I miss Dolphin badly whenever I'm on non-free operating systems, even though I generally enjoy file management via the terminal as a fallback. In Plasma, I'm much more likely to do a bit of GUI file management than under any other circumstances.
'Default' KDE apps are often so well thought-out and complete that I never feel the need to deviate from them, and it's not unusual for me to install them on other operating systems when possible. I feel this way about Dolphin, Okular, Ark, Kate, Gwenview, Klipper, and Konsole/Yakuake, too (even though there are several great new terminal emulators out nowadays). And KWin! God, KWin's configurability is so good and it has some really killer compositor effects for productivity that are still unmatched.
IE 5.5 for OS X was by far the most standard compliant browser of its time. It supported more CSS than either NN 4.x or IE 5 for Windows. Nothing came to surpass it until Mozilla 1.0 and even then it wasn't a slam dunk.
Was gonna say IE5 on OS X was the opposite of “a dog of a browser”. It was the gold standard against which every other browser was compared because it was by far the most standards compliant browser of its day.
Also a quick correction: there was no IE 5.5 for OS X. That was for Windows and used a different rendering engine.
Usually I would be as optimistic as you are about this, because that would be the dream (although it would be nicer for them to contribute to the project.) However, given Proton's primary use case is gaming, such an effort will almost certainly be kneecapped by Apple's historic half-hearted commitment to anything other than microtransaction-powered mobile games.
No but it looks like they’ll add Metal support to Wine, do the bare minimum to comply with the license and release it as “Apple game toolkit”. Textbook.
An ARM CPU only emulating x86 isn't going to be more efficient than straight x86. ARM is barely more efficient as it is at those performance levels.
The real reason Apple is ahead is that they pay for more expensive, more advanced nodes for their CPUs. If you compare CPUs on similar node sizes, you'll see that AMD and Intel have basically caught up architecturally in perf/W metrics.
The last I checked, low-wattage AMD and high-wattage Apple had similar performance and power draw, so AMD was the right choice for raw performance, and Apple won for portable devices.
Intel was losing badly to one or the other at all TDPs. I don't get the impression that's changed much. Even if it has, I can't remember the last time I encountered a non-Xeon Intel machine with working hardware and drivers for any OS; I tried Windows, Linux, and macOS.
AMD does not win over Apple for raw performance. AMD does give you a lot of MT performance for a decent price; you have to buy an M3 Pro to match an AMD HX 370. However, you're sacrificing battery life, quietness, portability, and ST performance with AMD.
Yes, the M3 is still 2-3x more efficient than Lunar Lake's GPU, as long as it's not running games that were optimized for x86 and/or DirectX. For example, the M3 generally wins in compute with a lot less power needed.
That's ideally what Vulkan was for: build for one common open standard, and then Apple/Microsoft/Google/Linux can each implement support for it.
But I guess there was never a time when an open graphics standard stood as the leader. Maybe during a brief stint in the Windows Vista era at best.
Arm has had a performance-to-power edge over x86 since inception.
AIUI, if you want the most flops per die, you'll buy x86 - probably the 128-core Xeon for enterprise money. But that's not what's best for hand-held gaming.
AAA titles are typically GPU-bound anyway. More CPU flops may not offer much benefit.
> AAA titles are typically GPU-bound anyway. More CPU flops may not offer much benefit.
Yes, but actually no. The Steam Deck is playing at extremely low resolutions. Rendering at 720p and 30fps is (on paper) 8x less demanding on the GPU than rendering a native 1440p60 experience. You can fully get by without having a powerful dGPU, which is why the Steam Deck is really able to play so many titles on a weak iGPU.
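That "8x on paper" figure is just raw pixel throughput; a quick sketch of the arithmetic (illustrative targets, ignoring everything but resolution and frame rate):

```python
# Back-of-the-envelope pixel throughput for two rendering targets.
# Real GPU load doesn't scale perfectly linearly, but this shows the
# magnitude of the gap the Steam Deck exploits.
def pixels_per_second(width: int, height: int, fps: int) -> int:
    return width * height * fps

deck = pixels_per_second(1280, 720, 30)      # Steam Deck-style target
desktop = pixels_per_second(2560, 1440, 60)  # native 1440p60

print(desktop // deck)  # → 8
```

So a weak iGPU pushing 720p30 is being asked for roughly an eighth of the pixels of a 1440p60 desktop experience, which is why so many titles remain playable.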
The problem is translation. Cyberpunk 2077 runs fine on a 25 watt mobile chip that uses x86, which is why the Deck even costs less than $1000 in the first place. If you try to put a mobile ARM CPU in that same position and wattage, it's not going to translate game code fast enough unless you have custom silicon speeding it up. There's really no reason for Valve to charge extra for a custom ARM design when COTS x86 chips like AMD's would outperform it.
For x86 PC games (which pretty much all games are, today), ARM is at a substantial disadvantage, period. The IPC and efficiency advantages are entirely lost when you have to spend extra CPU cycles emulating AVX with NEON constantly. If there were ARM-native games on Windows then things might be different, but for today's landscape I just don't see how ISA translation is better than native.
Yes, there would have to be a push in the industry to port games to ARM, otherwise the gains in architecture will indeed be lost in instruction translation.
The reason why this isn't more prevalent is twofold:
1) everything standardized, like it or not (note: I do not like it), on the Windows API, and it has remained relatively stable, which is important because
2) the Linux-native games I've had have become un-executable over time without semi-regular maintenance, while Windows games running on whatever version of Proton they best work with do not have that drawback.
> Tessellation enables games like The Witcher 3 to generate geometry. The M1 has hardware tessellation, but it is too limited for DirectX, Vulkan, or OpenGL. We must instead tessellate with arcane compute shaders
> Geometry shaders are an older, cruder method to generate geometry. Like tessellation, the M1 lacks geometry shader hardware so we emulate with compute.
Is this potentially a part of why Apple doesn't want to support Vulkan themselves? Because they don't want to implement common Vulkan features in hardware, which leads to less than ideal performance?
(I realize performance is still relatively fast in practice, which is awesome!)
> Is this potentially a part of why Apple doesn't want to support Vulkan? Because they don't want to implement common Vulkan features in hardware, which leads to less than ideal performance?
Yes, it's a big reason.
I tried to port the yuzu Switch emulator to macOS a few years ago, and you end up having to write compute shaders that emulate the geometry shaders to make that work.
Even fairly modern games like Mario Odyssey use geometry shaders.
Needless to say, I was not enough of a wizard to make this happen!
Are you talking about Vulkan or about geometry shaders? The latter is simple: geometry shaders are a badly designed feature that sucks on modern GPUs. Apple has designed Metal to only support things that are actually fast. Their solution for geometry generation is mesh shaders, which is a modern and scalable feature that actually works.
If you are talking about Vulkan, that is much more complicated. My guess is that they want to maintain their independence as a hardware and software innovator. Hard to do that if you are locked into a design-by-committee API. Apple has had some bad experience with these things in the past (e.g. they donated OpenCL to Khronos only to see it sabotaged by Nvidia). Also, Apple wanted a lean and easy to learn GPU API for their platform, and Vulkan is neither.
While their stance can be annoying to both developers and users, I think it can be understood at some level. My feelings about Vulkan are mixed at best. I don't think it is a very good API, and I think it makes too many unnecessary compromises. Compare, for example, VK_EXT_descriptor_buffer and Apple's argument buffers. Vulkan's approach is extremely convoluted: you are required to query descriptor sizes at runtime and perform manual offset computation. Apple's implementation is just 64-bit handles/pointers and memcpy, extremely lean and immediately understandable to anyone with basic C experience. I understand that Vulkan needs to support different types of hardware where these details can differ. However, I do not understand why they have to penalize developer experience in order to support some crazy hardware with 256-byte descriptors.
I’m not a game programmer, so I just sort of watch all this with a slightly interested eye.
I honestly wonder how much the rallying around Vulkan is just that it is a) newer than OpenGL and b) not DirectX.
I understand it’s good to have a graphics API that isn’t owned by one company and is cross platform. But I get the impression that that’s kind of Vulkan‘s main strong suit. That technically there’s a lot of stuff people aren’t thrilled with, but it has points A and B above so that makes it their preference.
(This is only in regard to how it's talked about; I'm not suggesting people stop using it or switch to something else.)
Nothing stops them from providing both their own API and Vulkan. So your arguments only explain why they might want their own API; they don't explain completely denying Vulkan support alongside it. There is no good reason for that, and the apparent reason is lock-in.
Geometry shaders have almost always sucked in all fairness. I'm surprised a game newer than 2015 bothered with them. It's been pretty common knowledge that geometry shaders really only work better on Intel hardware (and I'm not sure how long that lasted).
Tessellation falling short is just classic Apple, though. Shows how much they prioritize games in their decision making, despite every other year deciding they need a AAA game to showcase their hardware.
(apologies for the crude answer. I would genuinely be interested in a technical perspective defending the decision. My only conclusion is that the kind of software their customers need, like art or editing, does not need that much tessellation).
Geometry shaders have long been disfavored by all vendors, not just Apple. It's just that most include the software path.
If you’re using geometry shaders, you’re almost always going to get better performance with compute shaders and indirect draws or mesh shaders.
A lot of hardware vendors will handle them in software which tanks performance. Metal decided to do away with them rather than carry the baggage of something that all vendors agree is bad.
It takes up valuable die space for very little benefit.
Because they don't care. They've decided that Metal is The One True Way to write 3D-accelerated apps on macOS, so they only implement the things in hardware that Metal requires.
There are definitely some features omitted from Apple's GPU, but fairly early in the reverse engineering process, Alyssa Rosenzweig provided several examples of hardware features present in Apple's GPU that are not exposed by Metal: https://rosenzweig.io/blog/asahi-gpu-part-4.html
Maybe, but we got here because I asked "is it possible that Apple doesn't want to support Vulkan (in software) because they don't want to support the features it needs (in hardware)."
If the reason they don't support it in hardware is because they don't want to support it in software, then the logic gets a bit circular.
I'm interested in which came first, or if it's a little of both.
Vulkan very much is designed to give flexibility to hardware vendors. Where its abstractions do paper over differences, it's generally in ways that keep the abstraction cheap at runtime, even if that means writing more code yourself rather than less code that depends on an otherwise-optional feature (for example, some of the complex pipeline manipulation vs. bindless).
Perhaps, but geometry shaders are also generally losing popularity and on their way out. Per a Google AI search result (for what it is worth):
Geometry shaders are generally considered less necessary in modern graphics pipelines due to the rise of more flexible and efficient alternatives like mesh shaders, which can perform similar geometry manipulation tasks with often better performance and more streamlined workflows.
We would still like the changes to be upstream but they may need more polish, and it spawned a conversation about the division of responsibilities between SPIRV-Cross and MoltenVK:
Apple not supporting Vulkan is a business decision. They wanted a lean and easy to learn API that they can quickly iterate upon, and they want you to optimize for their hardware. Vulkan does not cater to either of these goals.
Interestingly, Apple was on the list of the initial Vulkan backers — but they pulled out at some point before the first version was released. I suppose they saw the API moving in the direction they were not interested in. So far, their strategy has been a mixed bag. They failed to attract substantial developer interest, at the same time they delivered what I consider to be the best general-purpose GPU API around.
Regarding programmable tessellation, Apple's approach is mesh shaders. As far as I am aware, they are the only platform that offers standard mesh shader functionality across all devices.
That might be an excuse, but that's hardly a reason. They are simply extreme lock-in proponents and don't want to support cross platform graphics API. That's the real reason.
They don't seem to care. I'm sure that will bite them in the long term, but for now they are very intent on complete NIH and lock-in anywhere they can push it.
I’m just blown away at all the work they’re able to do with a platform that they basically reverse engineered. I’m glad to be contributing to their efforts. I’m also waiting for when M3 support comes! Such a cool group of engineers and hackers. I love it.
I played through the entirety of Elden Ring and its DLC, a pretty modern and demanding game, on my MacBook Pro through GPTK. I don't think that would be possible on Asahi.
That’s true, but many titles are 32-bit and Apple removed 32-bit support, so those titles no longer run. It’s a real bummer, because there is no technical limitation.
Yes. I haven’t tried those games, but on Apple Silicon the Whisky app runs them with Game Porting Toolkit + Wine/Proton. I think it was also possible on Intel Macs, but I'm not sure.
> FTA: "While many games are playable, newer AAA titles don’t hit 60fps yet."
You’re lucky to get 60fps playing a fairly undemanding game on macOS, even on hardware that is otherwise a dream.
For example, Baldur’s Gate 3 is barely playable on my M3 MacBook Pro at well below native resolution with all settings turned down. It’s a brilliant game but hardly cutting edge graphically.
> most emulated games will need at least 16 Gigs of RAM at minimum
That's because the RAM is shared with the GPU and most of these games would require a GPU with at least 2-4GB on top of the normal system requirement to have at least 8GB. So, 8GB of RAM would be cutting it close on a mac since part of that would have to be sacrificed for the GPU.
Alyssa lays out the reasoning in the linked blog post.
They are running emulated games in their own separate virtual machine, because Intel games expect a 4k page size and the OS is running with a 16k page size.
Virtual machines require their own chunk of memory overhead, so the resource usage can't help being higher than a native macOS game's would be.
No, you’ll still get better performance, more features supported and lower overhead running with Game Porting Toolkit currently.
That includes raytracing support and heterogeneous paging support which are two things Alyssa calls out explicitly herself. Not to mention the VM overhead.
That’s not to say Alyssa’s work is not very impressive. It is. But GPTk is still ahead.
That’s not even including the other aspects of Mac support that Asahi still needs to get to. Again, very impressive work, but the answer to your question is No.
i haven't tested it extensively but i tried dark souls 2 on parallels and there were vertex explosions making it unplayable, using crossover and whisky it was a jittery laggy mess. after seeing alyssa's talk i decided to load up asahi and it ran perfectly max resolution 60 fps locked. gaming on macos in my experience has been unplayable to the point where i gave up even trying. after my experience with ds2 i think that it's going to be significantly better.
i never tried to change between dx9 and 11 in parallels or crossover/whisky since i didn't know that was possible, so i was using whatever is default. that said i tried messing with all of the wine settings and it didn't seem to make a difference. i even messed with stuff like esync and msync (or whatever they were).
it works on asahi without issue. that is with x86 to arm translation, directx to vulkan translation, windows to linux translation and 4k page to 16k page translation running in a virtual machine.
as for how well fromsoft games run on windows you might have been right 12 years ago when dark souls 1 came out initially. it was a mess at the time, but souls games have been running just fine on windows(and linux for that matter) for years. it's only on macos that it is a mess. this has nothing to do with fromsoft and everything to do with macos.
> you might have been right 12 years ago when dark souls 1 came out initially
When it came out initially for windows I had already done two playthroughs so just did ... not ... care. I just read it's a crap port in the news.
> souls games have been running just fine on windows(and linux for that matter) for years
Maybe. For like 4 years I ran my PCs without a dedicated video card because crypto and chip crisis. The whole PS5 with an extra controller cost less than an equivalent PC video card at the time :)
[I do have a video card now, but only because someone paid me to write neural network code.]
Nothing prevents you from running games under emulation on MacOS.
Apple and Wine provide the tools, and apps like Whisky make them easy to use.
> Essentially, this app combines multiple translation layers into a single translation tool. It uses Wine, a translation layer that allows Windows apps and games to run on POSIX-based operating systems, like macOS and Linux. It also uses Rosetta and the Game Porting Toolkit, which are two official Apple tools that allow x86 programs to run on Apple Silicon and serve as a framework for porting Windows games to macOS, respectively.
Normally, this sort of process would require users to manually port games to Mac. But by combining Wine, Rosetta, and the Game Porting Toolkit, this can all happen automatically.
You’d need to compare with what can be done on macOS, including with things like Crossover and the GPTK. AFAICT, the Linux side is making progress, but still more games can be run from macOS.
> AFAICT, the Linux side is making progress, but still more games can be run from macOS.
I don't believe that's true. According to ProtonDB, 80% of the top-1000 most-played games on Steam are confirmed working on Linux: https://www.protondb.com/dashboard
I haven't seen any source documenting nearly similar success rates with Mac but I also haven't seriously tried gaming on Apple Silicon.
The Mac efforts rely on MoltenVK for any Vulkan needs, which itself relies on the underlying Metal API. As I understand it, the Asahi/Honeykrisp Vulkan driver does not rely on the Metal API, so it can actually do more conformant Vulkan than Crossover/Whisky can. For example, transform feedback and other geometry shader stuff will work on Asahi. MoltenVK is working on it, but not there yet.
On many games I have tried, DX => DXVK => MoltenVK => Metal is significantly faster than DX => D3DMetal. For example, XCOM 2 gets about twice the frames per second (yeah, it has an official Mac version, but that is even slower).
I will have to check this newest development out, but as someone who dual-boots Asahi and macOS, up until now macOS with Crossover has definitely been the best experience, if you are willing to pay.
The M-series chips from Apple have some special hardware to help emulate x86 with near-native performance, right? I wonder if they take advantage of those features (actually I forget exactly what the features were).
I mean this is an incredible achievement either way. Everything is emulated, but they are still running AAA games. Wow.
Other than the page size issue, FEX and Rosetta are comparable technologies (both are emulators, despite what Apple marketing might have you believe). Both FEX and Rosetta use the unique Apple Silicon CPU feature that is most important for x86/x86_64 emulation performance: TSO mode. Thanks to this feature, FEX can offer fast and accurate x86/x86_64 emulation on Apple Silicon systems.
If every layer of abstraction and emulation is set up to allow it to pass through. It still seems really impressive to me, like lining up a bunch of targets and shooting an arrow through to get multiple bullseyes or something, haha.
Maybe if you happen to own the control stack from hardware, OS, drivers, to client software and are willing to never change any of it once you get things lined up.
I think the comment was saying “there isn’t any reason they can’t.” Which is true in theory, but in the practice it seems to be a lot of stuff to line up.
Fantastic! A great proof of concept on Linux - lots of AAA gaming is already possible on Mac with Crossover and/or Parallels or VMWare Personal, which is free! While I have a Steam Deck, gaming on Mac works for me - I refuse to play Baldurs Gate 3 on a controller.
I know it's an extremely un-Apple-like thing to do, but I really wish Apple would team up with Valve to work on Proton, and bring full Proton support to MacOS.
Bringing Proton to Mac would involve either Apple making amends with Khronos and supporting Vulkan, or Valve making the substantial effort to port Proton to Metal natively, or doing DirectX-to-Vulkan-to-Metal translation with MoltenVK. None of those sound very likely or optimal to me.
Besides, the main reason Valve is investing so heavily in Linux and Proton is so their destiny isn't tied to someone else's platform. MacOS is just another someone else's platform like Windows is, with the same threat of getting rug-pulled by a first-party app store that spooked Gabe Newell[1] into investing in Linux in the first place.
Apple already provides their Game Porting Toolkit, which includes a D3D12-to-Metal translation layer for Wine, and it has been integrated into user-friendly Wine distributions like Crossover since last year. There's not much Proton has to offer over what's already available.
My understanding of the Game Porting Toolkit is that it requires developers to specifically modify their game to make it compatible.
The magic of Proton from a consumer point of view is that it just works for basically every game, sans those with kernel-level anticheat. This means thousands of old games that haven't been updated in years will work, even games that don't have active developers.
So Apple's solution works for new games but isn't a practical option for compatibility with existing games.
The stated intended purpose of the game porting toolkit is to enable developers to modify their games. But the software actually being shipped includes what is literally a Wine GPU backend, which is usable by (and already used and bundled by) consumer-facing Wine applications like Crossover. If you go to Codeweavers, download any Crossover for Mac from the past year (Sep. 27, 2023 according to their release notes), you're getting a tool that includes the D3D to Metal layer from Apple's Game Porting Toolkit.
Also note that Codeweavers, Crossover's developer, is a major contributor to both Wine and Proton, so there's a great deal of, um, crossover between these projects.
Don’t forget Apple’s Game Porting Toolkit, based on Crossover/Wine, and Whisky, the open-source client for it. I think it supports most games Linux Proton does now.
BG3 is the only RPG I would play on a controller, it's very well done. You can also connect a keyboard and monitor to the Steam Deck, BG3 runs at 1080p high locked to 30FPS
>While I have a Steam Deck, gaming on Mac works for me - I refuse to play Baldurs Gate 3 on a controller.
Personally 99% of my Steam Deck usage is with it docked. I do mostly use a controller, but also have it hooked to the same USB switch as my PC so I can hit a button to move my keyboard and mouse over.
Baldur's Gate 3 is the first game I ever ran on my Deck that did not run very well, though. Most stuff I've played runs at 60fps at my external monitor's 1920x1200 resolution. That in addition to not liking the gameplay on BG3 much made me not continue with the game, though I may revisit it someday.
I'm slightly confused after reading about page alignment. Why would a 16k page be less aligned than a 4k page causing assumptions about pointers within those pages to break? The 4k pages on x86 are aligned on 4k boundaries, are the 16k pages on M1 aligned on <4k boundaries?
I'm missing something here. Assuming there are pages at 0k, 16k, 32k, etc., all of those pages are aligned on 4k boundaries, since 16k is a multiple of 4k. So code written with the assumption that its pages are 4k-aligned should have that assumption met when running with 16k pages. It is still early here and I have only had one cup of coffee. Am I misunderstanding something really obvious?
Depends what kind of computer you have and what you want to do with it. M3 does not work at all. M1 is the best supported, but even there some important things like microphones and Thunderbolt still don’t work.
If you expect to be able to install it on any random Mac computer without thinking and have everything work, then no, it's not there yet. But I think just saying the answer to the OP's question is "no" would have been an oversimplification.
From what I understand, one of the factors to not focus on Asahi Linux on M3 for now is the lack of an M3 Mac Mini which supposedly makes the development easier.
The M3 GPU added a bunch of features including ray tracing. The "dynamic caching" sounds like a big change to local memory which could require serious driver changes.
M3 GPU uses a new instruction encoding, among other things. Also, it has a new memory partitioning scheme (aka. Dynamic Caching), which probably requires a bunch of changes to both the driver interface and the shader compiler. I hope the Asahi team will get to publishing the details of M3 soon, I have been curious about this for a while.
I don't know how different, but it apparently has dramatically improved hardware shaders compared to earlier M chips, so I'm guessing that a lot of this might be different there.
For people like me who have been using Asahi for a while but are not Fedora natives: TIL `sudo dnf system-upgrade download --releasever=40; sudo dnf system-upgrade reboot` is necessary first, as the normal upgrades left me on 39.
I think at the moment, this is probably the only way to feasibly game on a Mac. Crossover and other Wine-based apps, as well as Parallels, are... not really viable. If you bought the top-of-the-line MacBook Pro 16-inch, 2021 with M1 Max and tried to play anything reasonably modern on it, you'd find the performance is basically not playable.
What's the status of the game porting kit? The last time I used it to get a game running, it proved to be a bit of a miserable slog (to be fair, I was trying to run UnityModManager for a CRPG).
I'm a little sad that this has seemingly taken precedence over all other hardware support. M3 support, dp-alt mode, making the microphone work are all things that I was hoping were going to land in the past year.
I understand the sentiment. But the people who could work on the Asahi Linux graphics stack are generally not the same as the people who could e.g. bring up Asahi Linux on M3 chips.
I would not consider the lack of activity in some Asahi Linux areas to be a conflict of priorities. It is in my opinion mostly a result of these lacking areas naturally attracting less developers capable of moving them forward.
The M3 GPU is a lot different and has a bunch of new features like ray tracing, so the super talented team working on the Asahi Linux graphics stack might have a lot of work ahead of them to support the M3's GPU fully as well.
God I wish I was smart enough to help out with Asahi Linux...
It's an Apple chip with no documentation and no existing driver code to reference. You have to set realistic expectations here, and acknowledge that not every contributor is going to have the domain-specific knowledge required to make everything work. It's nothing short of a divine miracle that it has working Vulkan drivers you can download within a half-decade of its release.
If you want more, you'll have to take it up with Tim Cook or God (both have a nasty habit of ignoring us little guys). Also an option: not using a laptop that treats Linux as a threat to its business model.
Alyssa Rosenzweig already talked a bit about that on her Mastodon. She said that after having worked to implement GPU drivers, it was annoying that she never had the time to quite finish them. On each device release, she had to support the new device instead of polishing what she had.
I'm aware of no better way to see your desired features land in open source than to build them yourself. That is the power of open source, nobody can stop you!
i don't hate to be the one to tell you, but skills and context can be learnt. personally, i have found no better way to learn skills than to work on something i care about.
I do hope that LLMs learn more from the Asahi Linux team's code and their amazing blog posts, in order to provide better guidance for new systems programmers.
I guarantee you'll get much further than you would have previously done in the same amount of time, just by virtue of it being able to point you in the right direction. You don't need perfection when learning, you need a wayfinder, and it can do that just fine.
It won’t point you in the right direction, though. At least in my experience, it will only give very superficial answers. For example, just trying to write Rust: it will try to explain the error message, but most of the time it says nothing new, and to find out how to fix it you will have to read the docs and understand things the old way anyway.
AFAIK the M3 is going to take a lot longer as the asahi team leverages apple silicon in their CI which means mac mini servers and the M3 generation never got their mac mini. Of all the generations to finally take the plunge into apple silicon, I had to choose the weird one... (typing this on an M3 mbair and not on linux sigh)
I mean this is the nature of the beast with arm and apple. It’s a closed system. There are some devs that are going to be willing to go through the effort just for the challenge of it, but most are just going to use x86/linux because you don’t have to actively fight against the vendor.
not ipads, those are locked down. Macs though, they have an open bootloader and so far M1 and M2 chips have pretty good linux support, fully reverse-engineered:
https://asahilinux.org/fedora/#device-support
For the kind of person who wants to run NixOS on Apple Silicon or do Linux gaming on Apple Silicon in the first place, that's probably interesting and not too hard
but if you're allergic to that, you might be able to figure something else out with Box64, which is already packaged in Nixpkgs
x86_64 gaming on NixOS is of course well supported and has been for a long time. There's a 'native' package that I've always used and the Steam Flatpak is also available and works as well as it does anywhere
we are talking about asahi linux. i think it is pretty clear that nixos isn’t supported like a first class citizen because you have to do a fair bit of work to make all of the more recent userspace fixes work on NixOS. i run NixOS Asahi so I know.
it was easier when Arch was a first class citizen but the advice nowadays you get upon encountering a problem on Arch is to switch to Fedora
In heaven, Microsoft is in charge of gaming, Amazon does the customer service, Apple is responsible for privacy, Facebook does the UI, and everyone works at Google.
In hell, Apple is in charge of gaming, Google does the customer service, Facebook is responsible for privacy, Microsoft does the UI, and everyone works at Amazon.
Can't you famously just return stuff to Amazon within a month for basically any reason? I purchased a monitor from Dell and after a few weeks it became clear that there was a loose connection internally. It was extremely simple to prove. But getting it replaced was hell. I went through the whole process of creating a ticket, talking to 3 different people, taking photos of the thing from every angle and then after they failed to get back to me for a week, I emailed for an update and was told "Sorry sir, there was no activity on the ticket for over a week so the ticket was closed". It didn't matter that the delay was on their side. And no they wouldn't reopen the ticket, and no I couldn't refer to the old ticket, and no the old photos wouldn't work. Start over. Talk to several different support people again, take all those photos again.
My SIL bought a scanner from Amazon a few months back and never unboxed it because she was moving house. When she did, it was faulty. They took it back without much of a fight even though the month was up. She just said "I unboxed it yesterday, it's broken".
Nobody ever went broke banking on the laziness of the American public. People could also go to the pizza shop or liquor store or grocery store or 7-11 or whatever instead of paying a shitload more for delivery, but they don't.
I bought a Kindle Paperwhite with ads. Got tired of the ads. Tried to pay Amazon to remove them, but for some reason it didn't work (I'm not from the USA).
Contacted customer support and explained the problem; the person on the other side said "wait a minute, sir" and removed the ads from my Kindle without asking me to pay for it.
Amazon's customer service (for the web store at least) is fantastic.
Even if the core shopping/delivery service fails you, if you complain, they'll take the "customer is always right" position and make you whole. They'll refund or re-ship with no questions asked, without requiring you sending back anything or even so much as providing proof.
I'm sure some people must take advantage of that level of customer service, but it's a really pleasant experience.
Be cautious about calling their customer support if you have "bought" DRM stuff: you can be banned for any reason at any time.
I complained about a failed delivery (broken box, one item missing). They refunded me but then immediately put me on a watch list, threatening to ban me if I ever complain again. I will never buy from Amazon again.
Based on my own experience it is still fantastic. Not long ago I had an issue with an order and a real human called me back right away outside normal office hours. No waiting in a phone queue. And he was actually able to help me. If that is not excellent I don't know what is. I don't know of any other large company with a remotely similar customer experience.
This is good but the punchline needs to be punchier.
I do like the implication that we're working in the warehouse and not at AWS, but maybe it's too subtle.
You might also be able to do something with the surprise switch from Linux to Linus. In heaven code is reviewed on GitHub [...], in hell [...] and your code is reviewed by Linus.
I've heard that Amazon is a miserable place to work even if you're a software engineer.
The switcheroo idea sounds good.
My joke got more attention than I expected, given that it was just a quick first draft. I encourage everyone to improve on it, and share your version wherever you want, without attribution. Consider it a part of the public domain, just like the joke Carlin told.
I noticed the URL was updated for this post. Previously it linked to asahilinux.org which showed an anti-HN manifesto from the HN referral. Curious as I haven’t seen this before. Seems it has been covered by previous commenters: https://news.ycombinator.com/item?id=36227103
See[1] the Referrer-Policy header, <meta name="referrer">, <a referrerpolicy> and <a rel="noreferrer">.
But generally, webmasters have found it useful to know who caused their server to fall over^W^W^W^W^W^W is linking to their pages. This was even used as a predecessor to pingbacks once upon a time, but turned out to be too spammable (yes, even more so than pingbacks).
After the HN operators started adding rel=noreferrer to links to the Asahi Linux website, Marcan responded[2] by excluding anyone who has the HN submit form in their browser history, which feels like a legitimate attack on the browser’s security model—I don’t know how it’d be possible to do that. (Cross-origin isolation is supposed to prevent cross-site tracking of this exact kind, and concerns about such privacy violations are why SRI has not been turned into a caching mechanism along the lines of Want-Content-Digest, and so on and so forth.) ETA: This is no longer in place, it seems.
Visited links have always looked different from unvisited ones, and the moment you could customize how links looked via CSS, browsers also had to implement styling for visited links specifically.
Modern browsers put a lot of care into making the changes to those styles observable to the user, but not to Javascript.
This is an extremely hard problem, and browsers have had a lot of security issues related to this behavior. Nowadays, you can only apply a very limited subset of CSS properties to those styles, to avoid side-channel timing attacks and such.
This means you can display a banner to anybody who has a certain URL in their browser history, but you can't observe whether that banner actually shows up with JS or transmit that information to your server.
> This means you can display a banner to anybody who has a certain URL in their browser history, but you can't observe whether that banner actually shows up with JS or transmit that information to your server.
How do they stop you from using Canvas to see the output and send it back?
Canvas can't "see the output", it only sees what is drawn in it (which is not a set of HTML tags, it's JS commands).
The screen recording/screen sharing API can be used for this but security is the reason you have to give explicit permission to the site before it can do this.
IIRC, Firefox had a bug where this exact scenario was possible, I think you needed to embed the link in html embedded inside an SVG, which was displayed in the canvas, and then access the bitmap. You could e.g. make the link black if visited and white otherwise, and then the number of white versus black pixels in the bitmap would tell you whether the link was visited or not.
There was also that asteroids game / captcha where links were white/black squares and your goal was to click all the black ones. Of course, clicking a square revealed that you knew the square was black, which meant the URL under it was in your history.
If you go back far enough there weren't even protections against this sort of thing at all! E.g. you could just say a visited link style was 1px taller then measure that. The protections had to be added in after the fact (often with special case logic for what's allowed to be styled or read on :visited) once security became a major concern!
Referer does have legitimate uses. For example, back in the day people would use it to detect if someone embedded an image from their site on another site. SomethingAwful famously used to respond to any such requests with goatse, and forums I was on had very strict "don't link to SA images" rules as a result.
I think that using referer to try to deliver manifestos to users of another site is kinda childish, but so it goes. Every tool can be put to good or bad uses.
The Referrer-Policy header lets a server tell the browser how much referrer information to pass on when following links, all the way down to nothing at all if desired. Chrome does respect that, and they also followed other browsers in changing the default to "strict-origin-when-cross-origin" a few years ago which truncates the referrer path when leaving to a different domain, so they only see the domain the visitor came from rather than the specific page like they used to. Can't really fault Google in this case.
There's a handy addon for Firefox called Privacy Settings that can take care of that. It adds an option to stop sending referers, and a quick way of re-enabling them in case it breaks a website. Because of course that happens too.
Alyssa Rosenzweig
Asahi Lina
chaos_princess
Davide Cavalca
Dougall Johnson
Ella Stanforth
Faith Ekstrand
Janne Grunau
Karol Herbst
marcan
Mary Guillemard
Neal Gompa
Sergio López
TellowKrinkle
Teoh Han Hui
Rob Clark
Ryan Houdek
Yes, a lot, it's basically confirmed. Last time someone linked proof, it got flagged immediately. Kiwifarms is a dumpster fire, so I'm not going to search or link anything.
and then acts offended or claims doxxing (and starts using it to stir shit up for leverage) when people draw the obvious conclusion, that's behavior in bad faith and should be called out as such and dismissed.
Yeah yeah great, now please m3 support, or maybe before that support for internal mic and external displays/dp-alt. Pretty please?
(Not complaining happy about any progress)
> Asahi means “rising sun” in Japanese, and it is also the name of an apple cultivar. 旭りんご (asahi ringo) is what we know as the McIntosh Apple, the apple variety that gave the Mac its name.
I think there is something wrong with them, because they are failing to see that C++ syntax has an abysmal complexity which propagates into the toolchain implementation.
This alone is reason enough to make this language a definitive no-no.
They should be ashamed of themselves, and if they are not, well, they are toxic, and they are doing it at worldwide scale.
It is shocking how much effort is required to have a good gaming experience on Apple computers (excluding iOS). They always struck me as agnostic toward games, yet in recent years it appears to border on open hostility.
Apple has always been anti-gaming. It's been in the company's DNA since the first Mac was derided as a "toy" for having a graphical user interface, and they overcompensated by trying to make it a business machine with no games.
About once a decade someone inside Apple who is really passionate about games pushes some project through: you had GameSprockets in the '90s, you had someone convincing Valve to port Half-Life, you have the Game Porting Toolkit now. But it's just not in the company's culture to give game developers the long-term support they need.
Ok, but why would a hardcore Linux person want to play games that embody everything they hate about Windows in their mode of production, data gathering practices, politics, etc?
People use Linux for a wide variety of reasons and those reasons are very often not ideological. If the only reason to use Linux was ideological, Linux wouldn't be as popular as it is.
Also, there are plenty of Windows-only games that aren't subject to those practices. Free games, itch.io games, GOG games, etc. There's a big world out there!
> there are plenty of Windows-only games that aren't subject to those practices. Free games, itch.io games, GOG games, etc. There's a big world out there!
Those games are generally not AAA by definition, and often either already have a Linux build released, run acceptably under traditional emulation, or both.