Hacker News
Paving the Road to Vulkan on Asahi Linux (asahilinux.org)
722 points by jiripospisil on March 20, 2023 | 273 comments



> Now we can run Xonotic at over 800 FPS, which is faster than macOS on the same hardware (M2 MacBook Air) at around 600*! This proves that open source reverse engineered GPU drivers really have the power to beat Apple’s drivers in real-world scenarios!

> Not only that, our driver passes 100% of the dEQP-GLES2 and dEQP-EGL conformance tests, which is better OpenGL conformance than macOS for that version. But we’re not stopping there of course, with full GLES 3.0 and 3.1 support well underway thanks to Alyssa’s tireless efforts!

That's very impressive work. Congrats to Asahi and Alyssa.


Now that we can see Vulkan is around 4/3 more power efficient than Metal on the same hardware, I wonder if Apple will budge on their not-invented-here syndrome and allow for a real vulkan.kext on upcoming macOS versions. That would solidify it as not only the best graphical workstation OS, but also the best gaming OS, in my opinion. Part of me thinks they're avoiding this in worry of stepping on Microsoft's toes.


They just won't do it.

Not that they can't; it would be nice. When they capitulated and came out with the Intel Macs around 2006, it became sort of a golden era. You could run macOS and Windows on their machines, and the smart computer folks started buying and actually using Macs.

But now apple is years into a disappointing inward looking phase.

What a decent future would be like...

  - Vulkan/X
    - take back xquartz into the fold
    - actively pursue other linux graphics apis
  - apple does virtualization
    - maybe like rosetta but for all their software
    - run macos9 in a window
  - apple does containers
    - what if there was something like docker for macos natively?
      FROM macos:10.3
  - apple actively supports open source projects
  - open up ios
    - let me see the filesystem
    - let me run a firewall (true privacy?)
etc...


If you go work for apple and use development hardware, you can have many of these things...

But you lose the ability to talk about it.


If Apple did all that, I would buy new hardware from them, and even leave their OS in place on it, for the first time in almost two decades.


The move to MacOS by development and sysadmin types started long before Intel Macs with the release of OSX, when it was finally possible to get a laptop with a 5+ hour battery life that ran a Unix-like OS.

Edit: The latest macOS has a great virtualization system built in, that many of the container platforms have already switched to.


The PowerPC machines never did reach actual 5 hour battery time, 3 at most, on a good day, with the display off.


I think my "Lombard" or maybe "Pismo" powerpc powerbook did longer than 3 by far. And my iBook would go 8 hours no sweat.

edit: fixed the nickname


Mine managed 6-8 without too much problem throughout the time I was in university. And they had removable batteries so you could just switch them out if you ever needed to (I had two for my Powerbook, and that would do a full day on campus without ever having to go near a power outlet).


G4 ibooks were crazy good for battery life, I often did a whole day of uni on my little white beast.


Not that I disagree with the spirit, X is dead, and someone demo'd Wayland apps on macOS at NixCon Paris.


From the article:

* Please don’t take the exact number too seriously, as there are other differences too (Xonotic runs under Rosetta on macOS, but it was also rendering at a lower resolution there due to being a non-Retina app).


Not the least of which is likely power and clock state management (both on the CPU and GPU)


> Now that we can see Vulkan is around 4/3 more power efficient than Metal on the same hardware

Nonsense.

They haven’t even implemented power management and it’s not even a fair comparison.


About power management:

https://www.reddit.com/r/AsahiLinux/comments/11mjhx5/tips_fo...

Marcan (two weeks ago):

  It's a known issue blocked on bureaucracy/politics with the kernel, so you just have to wait. Sorry.

  ARM64 Linux does not want hardware-specific cpuidle drivers, instead relying on the PSCI firmware standard, but that standard is designed to rely on optional ARM features that Apple Silicon does not implement, so we can't implement it. Basically, PSCI assumes you have a "super-hypervisor" running the platform (TrustZone/EL3 on most Androids, but also somewhat similar to SMM mode on Intel), which is something Apple very deliberately stripped out of their CPUs very quickly because they don't need it and their security model relies on better solutions. Linux on AS truly runs on bare metal at the highest privilege, unlike the vast majority of other ARM64 platforms.

  The current thinking is we need to make a new standard, either a new PSCI transport that does work on AS or something simpler/ad-hoc. We haven't had time to start that long conversation formally yet. I could write a hardware-specific cpuidle driver in a day, but of course they'd reject it upstream.

Doesn't seem like this is going to be fixed anytime soon. :(


It's not using Vulkan. It's using opengl.


Native Vulkan support could also pave the way for a proper Mac version of DXVK. That could make Macs a lot more viable as gaming machines than they are now, even with the Rosetta 2 overhead.


As much as I like the idea of Linux being in a special position w/r/t emulation layers for Windows games, this makes a lot of sense to me as something that would be desirable to Valve, and might, like the Steam Deck, help bootstrap Proton+DXVK as a worthy target platform for game developers/publishers.


Valve supports Steam Deck/Proton that much because they are afraid of being killed by Windows locking itself down; it's a lifeline for them, but not a super profitable business.

A Steam for iOS & Android might be worth it for Valve, but for Mac it's likely not worth the money it costs.

Additionally, Apple is one of the main offenders pushing for a lockdown of PC platforms and one of the main lobbyists trying to prevent regulation requiring the possibility of (well-working) 3rd party app stores. Which means not only would good Mac support be costly, it is also constantly at risk of being "de facto" killed off. (Yes, there are currently regulations requiring 3rd party app stores, but it being theoretically possible and it being practically viable for a business are not the same.)


I'm not sure the original proposition was that Valve focus on making that happen:

> > I wonder if Apple will budge on their not-invented-here syndrome and allow for a real vulkan.kext on upcoming macOS versions

> Native Vulkan support could also pave the way for a proper Mac version of DXVK

I think the proposition was that if Valve lets it happen, other interested developers might be inclined to improve the situation enough that it's not so much extra work for Valve to add Mac support to Proton.

I'm not sure who all would be interested, but I imagine the motivation would be similar to whatever is keeping the MoltenVK developers going.

You're right that Apple provides a hostile and precarious platform compared to Linux, of course.


> I wonder if Apple will budge on their not-invented-here syndrome and allow for a real vulkan

Do you realize that Vulkan as an idea (not even as a spec) appeared 1 year after Metal was shipped?

Do you realize that apart from Android there's no platform that truly supports Vulkan?

> That would solidify it as not only the best graphical workstation OS, but also the best gaming OS

Of course it wouldn't. Vulkan is on its way to become the next OpenGL


Vulkan is about 4/3 as power efficient when rendering at a lower resolution than Metal

Is that surprising? Fewer pixels usually means faster rendering.


Metal was rendering at a lower resolution, but the game was also running through Rosetta (x86 emulation) on the Mac, which is fast but has some overhead.

_Hence why she claims that it is in a similar performance range._

(instead of saying it's way faster, which results from taking sentences out of context and not fully reading the article)


Metal was rendering at lower resolution than Vulkan here.


OpenGL rather.


It’s not 4/3 as power efficient at all, that comment was based on a misreading of the stats given.


Xonotic runs under Rosetta on macOS.


>Part of me thinks they're avoiding this in worry of stepping on Microsoft's toes.

What


I wonder if the driver could be ported to Mac OS?


As far as I know the backend to have the driver running on macos is already merged in mesa, so it should already work as-is?


So ARM Macs running macOS can, now or soon, have true Vulkan support with just custom userspace code targeting the kernel interface directly?


No. As the article says, the macOS kernel interface is an undocumented, moving target.


I don't think the vulkan driver exists in a usable state yet. So far there's only an OpenGL driver.


Maybe, but it seems bizarre that open source developers go so far to contribute to a closed source proprietary ecosystem when the manufacturer not only makes it difficult but they at times actually intentionally impede their work. That is a lot of time and effort of someone doing something for free that the manufacturer should be paying them to do and assist with.


I’m pretty sure the asahi developers themselves would totally disagree with you there. Apple themselves have confirmed with them they have no plans to lock their boot loader out from folks like them. This project is no different than Linux in the old days: it’s just a piece of good hardware that kernel devs have reversed to run an alternate OS on and they’ve become quite good at it.

I don’t see people making the same statements about work on the nouveau driver or on the Broadcom opensource wifi drivers. But somehow because the hardware was built by apple folks seem to think it’s more proprietary than anything else linux has run on.


> Apple themselves have confirmed with them they have no plans to lock their boot loader out from folks like them.

Didn't they make an undocumented change to the boot loader that serves literally no purpose other than to give Asahi a more stable target than what they were using before?


Yeah something like that. While I’m sure they don’t like folks reversing their hardware I doubt they care to stop this if they know there’s nothing that can be done to prevent it. I think it’s more likely they become consumers of asahi Linux’ work than it is for them to actively take measures to break them.


> Apple themselves have confirmed with them they have no plans to lock their boot loader out from folks like them

No, a person who used to work on the bootloader at Apple said on Twitter they did it because they wanted to enable different OSes. That's not an "Apple confirmed", that's "employee X said".


That's tacit Apple approval. Apple's PR department would probably have an employee fired for stating something like that without prior approval.


Regardless of all the points the siblings make, first and foremost this is fun. It's fun to reverse engineer stuff. It's fun to get things working that were previously not working.

A lot of people's careers start this way. A lot of the hacking and cracking scene was born this way. There is a certain kind of pleasure and satisfaction involved when you get a device that was never designed to do a certain thing to behave in that way.


I think that's not an unreasonable point, but, well:

1. I try not to tell people what to do with their free time. While you may think it's "bizarre", this use of their time has value to them, not only in the hopeful end result (fully-functional Linux on ARM Macs), but also in the satisfaction of the technical challenge, bragging rights, and general reputation. I'm sure there are some people who might look at what you do with your spare time and think you're "wasting" it sometimes. But that's in the eye of the beholder, and at any rate, that's your prerogative, as this is theirs.

2. I used to run Linux on Mac laptops (gave up around 2016 or so, tried again in 2018, gave up again shortly after), and I get the appeal: the hardware is really nice. And by all accounts, the ARM Macs are even nicer than the Intel Macs. Sure, they're not perfect (lack of upgradeability/repairability, etc.), but running Linux on them can be great, if the hardware support is there. "I like this hardware and I want to run Linux on/with it, so I'll figure it out myself" seems like a perfectly reasonable thing to do. Many of the drivers in the Linux kernel for various bits of hardware only exist because someone adopted this attitude.

I also just think your premise is a bit flawed:

> open source developers go so far to contribute to a closed source proprietary ecosystem

This is a little bit of a weird statement, because these developers aren't doing that. The "closed source proprietary ecosystem" is macOS and its app store. The hardware itself is more or less just as open (or closed) as most non-Apple hardware. I mean, I can't rewrite the BIOS in my Framework Laptop, nor can I make heads or tails of any of the binary firmware blobs Linux loads into the WiFi chipset, graphics chipset, etc. Apple's hardware is undocumented, certainly, but that's pretty common when it comes to Linux hardware support.

> but they at times actually intentionally impede their work

Do they, though? From what I've read of the Asahi project's progress, they didn't run into cases where Apple intentionally tried to make things harder on them. Sure, some things were harder, but I don't think we can ascribe a malicious motive to Apple. The most likely explanation is that they just decided to design things in a particular way because they felt it would be best for their own purposes, and didn't really care to think about anything else.

They could have decided to actively cryptographically lock down the boot process to prohibit other OSes from running, but they didn't do that.

> That is a lot of time and effort of someone doing something for free that the manufacturer should be paying them to do and assist with.

Why "should" they? All hardware manufacturers decide what software to write, and what platforms to support. If they don't think the cost of writing and supporting drivers for Linux is worth what they'll get in return, they'll make the logical choice to just... not do that. We've seen plenty of vendors over the 30-odd-year lifetime of Linux do that math and decide Linux support wasn't worth it to them. It's a shame, but I don't think it's fair to come down on them hard for that. Certainly some vendors (nvidia comes to mind) have been actively hostile toward the Linux community at times, but I don't think we can say the same of Apple.


In fact, the Asahi team has argued before that Apple has actually gone to great lengths to provide documentation and tools to enable the development of third party OS kernels.


MacOS has been put on the back burner for Apple. It's all iPhone and iOS. It's clear that these scrappy hackers are going to fully utilize the hardware, clearly Apple doesn't care.

Off-topic: FreeBSD had an early version of Grand Central Dispatch ported to it; I wonder if Linux could benefit from that, maybe with some io_uring goodness.


That ... really doesn't seem like the case. The macOS world has seen so many big changes recently, with the complete visual overhaul of everything with Big Sur and then Ventura (whether you like that or not), the rewriting of a whole lot of applications and system stuff from AppKit to SwiftUI (again, whether you like that or not), porting the entire operating system to ARM, inventing an excellent x86_64 translation layer, overhauling everything related to booting and system upgrades (even if it's just taking what's already built for iOS and heavily modifying it to work for Macs), new window management features, etc etc. That's a lot of engineering resources invested.

The sorry state of Apple's OpenGL drivers is a part of their strategy. They're choosing to not invest in OpenGL, and have even officially deprecated it. They want you to use Metal instead.


Which is so bizarre to me. Metal isn’t cross platform and 3d is very much a cross platform technology. So then why footgun your OS for 3d?


> Metal isn’t cross platform and 3d is very much a cross platform technology.

It isn't.

Vulkan as an idea appeared a year after Metal was released.

And even today:

- Windows first-party support is DirectX

- Xbox is DirectX

- Playstation is whatever Playstation uses

- Switch has nominal Vulkan support, but you're better off using their non-Vulkan SDKs

- MacOS and iOS are Metal

- Android supports Vulkan. Some versions of it, depending on phone, version, and manufacturer.

- WebGL is browser-only... and on its way to be replaced by WebGPU

Yeah. Great cross-platform story


I'm not sure what you're on about. Vulkan's story is about translating DX12 calls to Vulkan. That's how it works on Linux, and Linux gaming is so amazing because of it. I see Vulkan as the replacement for OpenGL and am excited for its future.

macOS is a joke for gaming. iOS, though, of course is all Metal; if you wanna target that platform then that's all you've got. But for translating or porting DX12 (on Windows) to other platforms, Vulkan is the answer.


It would make perfect sense if macOS was already the go-to platform for 3D apps and games. Third-party developers would then have to port to Metal in order to keep their customer base, and their OpenGL backend would be abandoned or at least a second-class citizen. That could be a competitive advantage for Apple.

But macOS is not the go-to platform for this sort of thing, especially when it comes to games. This does seem like a strategic mistake. Maybe they just don't care, for whatever reason. I mean, it doesn't seem like Metal has been catastrophic for Apple, has it?

Then again, OpenGL is probably still good enough on macOS for most 3D apps, deprecated or not. And if Apple does decide to actually remove it, I believe there are projects that implement the OpenGL API on top of Metal (certainly with performance impact). And third-party developers certainly have the option to wait to write a Metal backend only after Apple announces OpenGL is being dropped completely.


Yeah, there's ANGLE which implements OpenGL ES on top of Metal, and then there's MoltenVK which implements Vulkan on top of Metal, so you can get a program which uses the cross-platform APIs to run on macOS.

Maybe that's the whole strategy? To provide an API which exposes what's easy to do in a well-performing way on their hardware, and let the community worry about getting the standards to work? It certainly seems to be working, and it probably means they need to spend less resources on driver development than if they tried to stay current on OpenGL and Vulkan.
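For a concrete sense of what that layering looks like from an application's point of view, here is a minimal sketch, assuming a recent Vulkan SDK with the MoltenVK ICD installed (the app name is made up): the only macOS-specific wrinkle is opting in to "portability" implementations when creating the instance, after which ordinary cross-platform Vulkan code runs on top of Metal.

  #include <stdio.h>
  #include <vulkan/vulkan.h>

  int main(void)
  {
      /* MoltenVK is exposed as a "portability" implementation, so the
         instance has to opt in to enumerating portability drivers
         (supported by Vulkan loaders >= 1.3.216). */
      const char *exts[] = { VK_KHR_PORTABILITY_ENUMERATION_EXTENSION_NAME };

      VkApplicationInfo app = {
          .sType = VK_STRUCTURE_TYPE_APPLICATION_INFO,
          .pApplicationName = "metal-backed-vulkan",
          .apiVersion = VK_API_VERSION_1_1,
      };

      VkInstanceCreateInfo info = {
          .sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO,
          .flags = VK_INSTANCE_CREATE_ENUMERATE_PORTABILITY_BIT_KHR,
          .pApplicationInfo = &app,
          .enabledExtensionCount = 1,
          .ppEnabledExtensionNames = exts,
      };

      VkInstance instance;
      if (vkCreateInstance(&info, NULL, &instance) != VK_SUCCESS) {
          fprintf(stderr, "vkCreateInstance failed\n");
          return 1;
      }

      /* From here on, ordinary cross-platform Vulkan code runs on Metal. */
      vkDestroyInstance(instance, NULL);
      return 0;
  }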


The same mistake made by game consoles?

All major AAA engines support Metal, even if the main target for them is iOS.


There are some historic reasons for how we ended up here with Apple, but additionally:

OpenGL is old, crufty, and by modern standards sucks, no matter how great it once was. Most new 3D software doesn't use OpenGL, and a lot of new software that does use OpenGL limits itself to OpenGL ES (which is roughly a subset of OpenGL).

Apple has always had issues with "Not Invented Here" syndrome and an obsession with controlling every little bit. Apple has always pushed for software to be written explicitly for Apple, or if not, at least with an Apple focus. New GUIs "should" be written with SwiftUI, etc.

For Apple, having a different API means they don't have to spend time on standards beyond the things they want to push outside their ecosystem; they can just develop Metal however they want, ignoring input from everyone else. This can remove friction, but it also removes valuable feedback and the benefit of other people's innovation.

Lastly, Vulkan, DirectX 12 and Metal aren't too different, hence why things like MoltenVK (Vulkan API/emulation on top of Metal) or VKD3D (DirectX 12 API/emulation on top of Vulkan) exist. (Though, no surprise, all three have their own edge cases, which makes such cross libraries non-trivial; still, they are quite viable and, most importantly, performant, as they don't have to do much emulation and can mostly just map functionality.)


Metal is widely used for game development due to iOS. This means there are plenty of game developers familiar with it, and lots of tools such as porting tools and game engines like Unity that support it very well. The main problem has been that Metal did not support a lot of advanced features used by AAA desktop games. This changed last year with a huge update to the high end rendering features in desktop Metal, so Apple does seem serious about this. They’ve always been weak on game tech on the desktop though, so we’ll see.


Just like all game consoles have done?


Are you basing this on something or just personal opinion?

They literally just ported the whole platform to ARM which has had some great benefits for the whole OS + UX of using laptops.


Even if you don't care a lot about Apple, this is still a great read.

If you're a layman it can be hard to find information on how graphics works that is technical enough (uses terms like "user space" and "kernel"), but simple and high-level enough for somebody who doesn't know much. There is stuff like that throughout the piece.

Here's the first example:

> In every modern OS, GPU drivers are split into two parts: a userspace part, and a kernel part. The kernel part is in charge of managing GPU resources and how they are shared between apps, and the userspace part is in charge of converting commands from a graphics API (such as OpenGL or Vulkan) into the hardware commands that the GPU needs to execute.

> Between those two parts, there is something called the Userspace API or “UAPI”. This is the interface that they use to communicate between them, and it is specific to each class of GPUs! Since the exact split between userspace and the kernel can vary depending on how each GPU is designed, and since different GPU designs require different bits of data and parameters to be passed between userspace and the kernel, each new GPU driver requires its own UAPI to go along with it.
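To make the UAPI idea a bit more concrete, here is a purely hypothetical sketch of what a driver-specific submission ioctl might look like from userspace. The device path is the usual DRM render node, but the struct layout and ioctl number are invented for illustration and do not correspond to the real Asahi UAPI (or any other driver's).

  #include <fcntl.h>
  #include <stdint.h>
  #include <stdio.h>
  #include <sys/ioctl.h>
  #include <unistd.h>

  /* Hypothetical UAPI struct: userspace (e.g. Mesa) fills in GPU addresses
     it has already prepared; the kernel driver validates and queues the
     work. All field names and the ioctl number below are invented. */
  struct fake_gpu_submit {
      uint64_t cmdbuf_addr;  /* GPU address of the command buffer */
      uint32_t cmdbuf_size;  /* size of the command buffer in bytes */
      uint32_t queue_id;     /* which hardware queue to submit to */
  };

  #define FAKE_GPU_IOCTL_SUBMIT _IOW('G', 0x01, struct fake_gpu_submit)

  int main(void)
  {
      int fd = open("/dev/dri/renderD128", O_RDWR);  /* typical DRM render node */
      if (fd < 0) { perror("open"); return 1; }

      struct fake_gpu_submit req = {
          .cmdbuf_addr = 0x100000,  /* placeholder GPU virtual address */
          .cmdbuf_size = 4096,
          .queue_id    = 0,
      };

      /* This call is the userspace/kernel boundary the article describes:
         everything above ran in userspace; everything the kernel does
         with `req` happens on the other side of the UAPI. */
      if (ioctl(fd, FAKE_GPU_IOCTL_SUBMIT, &req) < 0)
          perror("ioctl");

      close(fd);
      return 0;
  }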


On a tangent from that quote, I'm curious how much extra perf we could squeeze from GPUs if the applications driving them were running in kernel mode (picture an oldschool boot-from-floppy game, but in the modern day as a unikernel), and therefore the "GPU driver" was just a straight kernel API that didn't need any context switching or serialized userspace/kernelspace protocol, but could rely on directly building kernel-trustable data structures and handing them off to be rendered.

Presumably there was an era of console games that did things this way, back before game consoles had OSes. But since that era was roughly 20 years ago now (the GameCube + PS2 era), it'd be somewhat hard to judge from that what the perf margin for modern devices would be, since the modern rendering pipeline is so different than back then.


> On a tangent from that quote, I'm curious how much extra perf we could squeeze from GPUs if the applications driving them were running in kernel mode (picture an oldschool boot-from-floppy game, but in the modern day as a unikernel)

Presumably you're quite seasoned, so I would assume you'd know, but: Windows itself put a lot of its graphics rendering in kernel space; they saw considerable performance gains from doing so but suffered two decades of severe bugs in rendering.


It would probably be more practical to map the GPU into userspace, than to put the application in kernel space.


I don't know about it being practical to map the GPU into userspace. Most systems only have the one GPU, and it being handed off to be managed by userspace (presumably via IOMMU allocation, like happens when you "pass through" a GPU to a particular virtual machine) means the kernel now can't use it. So the game can draw to the screen, but the OS can't. Which sucks for startup/shutdown, and any time things go wrong.

And yeah, if you imagine having to write a game as an OS kernel driver, then yeah, that's probably impractical. You don't want to have to program a game using kernel APIs.

But imagine instead, taking a regular OS with protected memory, and:

• Stripping out the kernel logic during context switches / interrupts, that de-elevates userland processes down to "ring 3" (or whatever the equivalent is on other ISAs.)

• Ensuring the kernel is mapped into every process's address space at a known position.

• Developing a libc where all the functions that make syscalls have been replaced with raw calls to the mapped OS kernel functions that those syscalls would normally end up calling.

So you're still developing "userland" applications (i.e. there are still separate processes that each have their own virtual address space), but there's no syscall overhead. So a series of synchronous kernel syscalls to e.g. allocate O(N) tiny video-memory buffers would be just as fast as (or faster than) what mechanisms like io_uring enable on Linux.
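To illustrate just the call-path difference being described (not any real OS design), here is a toy sketch in C: a hypothetical "kernel" allocation routine is mapped into the process and called through an ordinary function pointer, next to a real syscall for comparison. Everything named fake_* is invented for the example.

  #include <stdint.h>
  #include <stdio.h>
  #include <sys/syscall.h>
  #include <unistd.h>

  /* Stand-in for a kernel service that hands out small GPU buffers. In the
     scheme sketched above, this code would live in the kernel but be mapped
     into (and directly callable from) every process. Entirely hypothetical. */
  static uint64_t fake_kernel_alloc_buffer(uint32_t size)
  {
      static uint64_t next = 0x100000;
      uint64_t addr = next;
      next += size;
      return addr;
  }

  int main(void)
  {
      /* Direct-call path: no trap, no user/kernel transition, just a call. */
      uint64_t (*kalloc)(uint32_t) = fake_kernel_alloc_buffer;
      for (int i = 0; i < 4; i++)
          printf("buffer %d at %#llx\n", i, (unsigned long long)kalloc(4096));

      /* Traditional path: a real syscall, which pays the transition cost
         every time (getpid is used purely as a cheap example). */
      printf("pid via real syscall: %ld\n", syscall(SYS_getpid));
      return 0;
  }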


fwiw the nintendo switch has a userspace GPU driver (it's a microkernel architecture), but client apps talk to it over IPC


> I'm curious how much extra perf we could squeeze from GPUs if the applications driving them were running in kernel mode [...] could rely on directly building kernel-trustable data structures and handing them off to be rendered.

Probably not much. Applications already directly build data structures in userspace and hand them off to be rendered by the hardware; the kernel intervention is minimal, and AFAIK mostly concerns itself with memory allocation and queue management.


Exactly. In fact Mantle/Vulkan/Metal/DX12 is based on the realization that with full GPU MMUs user space can only crash its own context, so it's safe to give access to console like APIs on desktop systems, and only have the kernel driver mediate user space as much as the kernel mediates access to the CPU.


That's the premise behind this amusing presentation: https://www.destroyallsoftware.com/talks/the-birth-and-death...


You could start anew from an OS like TempleOS with a flat memory space and direct access to hardware. But it will be hard to compare until you have real world scenarios with modern apps on both platforms.


Asahi Lina is truly an inspiration for open source reverse engineering. For those not aware, they also live stream their coding sessions quite often: https://www.youtube.com/@AsahiLina

I'm excited for the day that I can easily install SteamOS (the modern one that runs on the Steam Deck) on an M2 Mac mini for an insanely powerful "Steam console" for my living room TV.


I did not expect the coding live streams of GPU engineering to be presented by a virtual anime persona with a squeaky voice.


Me either.

I don't really get the V-Tuber thing (other than wanting to stay anonymous) but having code explained to me by an anime character is hilarious.


I don't mind the anime character that much. But the voice is simply unbearable.


As interesting as it would be to watch some of these streams, it's completely unbearable to watch and listen with this persona involved. If they were trying for something that would ensure a very niche audience, this was a good choice.

I honestly find it very hard to take someone seriously who chooses this kind of persona, even though it's hard to argue with their technical ability and results.


It's just Marcan being weird and creepy.


Marcan is very busy with a lot of other things, like upstreaming patches to the Linux mainline and improving platform support.

He's been clear that he's not Asahi Lina... and he's also not a GPU hacker as far as I know.


Asahi Lina has the same Spanish-sounding accent as Hector, but pitched up and in falsetto. "raider", the hostname of the developer machine, is the same as the hostname of Hector's machine.

He even did a super-cringe "take over" video, where Asahi Lina "broke into" one of streams: https://www.youtube.com/watch?v=effHrj0qmwk


That looks pre-recorded, and the second person talking with Lina is clearly following a script. I don't think they could do it that well in one sitting, and the animations are probably not automatic.

Probably some pre-recorded, agreed upon advertisement to promote a new channel.


She said she did that by using his stream key which she somehow obtained from his computer: https://www.youtube.com/watch?v=effHrj0qmwk&lc=UgzNfomvdiD0j...


Well then it must be legit. Who among us hasn't accidentally leaked a cryptographic secret key to a screeching vtuber who has the exact same accent as we do.


Ok ok you've got a point. I'm probably too naive to take someone like Marcan at his word.


Even if you had the stream key¹, I wouldn’t expect you to be able to take over a stream with that kind of clean cutting from marcan to loading throbber to Asahi Lina. Without any knowledge of what YouTube Live actually does about conflicting input streams, I would expect the takeover to (a) be instant, (b) fail, or (c) produce a garbled mess. What the thirteen second throbber delay would be adequate for is starting up whatever software you need to play the part of Asahi Lina.

I have no specific knowledge, but it seems very clear to me that at the very least marcan is a collaborator in the persona of Asahi Lina; and absent further contrary evidence, them being the same makes sense.

—⁂—

¹ And if stream key exfiltration actually happened, I find it hard to imagine anything but acrimony arising.


This line at the end of the article also indicates that Asahi Lina either is marcan, or is personally directly and proportionally sponsored by marcan.

> If you want to support my work, you can donate to marcan’s Asahi Linux support fund on GitHub Sponsors or Patreon, which helps me out too!


Yeah so that's called a joke.


> and he's also not a GPU hacker as far as I know.

I'm not really willing to indulge the greater discussion, but marcan has done some serious GPU hacking before, reverse engineering the microcode (not shaders) of the PS4's GPU to fix bugs that the PS4 hacked around in the drivers.


> He's been clear that he's not Asahi Lina

You cannot be this gullible. "Asahi Lina" is a pseudonym for Marcan.


Guess I am.


LOL. To be fair, his behaviour is extremely weird so I don't blame you for being confused by his insane bullshit, as I was at first.


proof they're the same person (screenshot from one of their livestreams): https://0x0.st/H-Uq.png

/home/marcan and /home/lina on the same box.


Having profiles on the same box may lend credibility to claims that they are the same person, but it's very much not proof of that.


What’s your deal? What a weird denial.


I thought it must be him, but saw where he denied it publicly on Twitter, so I took him at his word.


what's your deal? what weird attempt at doxxing, this shit shouldn't be allowed here.


Is identifying a silly VTuber as a well-known person considered doxxing?


Yes.


Personally, I was more surprised by the Wii Sports music and all the pink.


AFAIK, that is music made by Marcan (Lina) him/herself. Amongst other talents, Marcan also appears to be a musician. There are a few interviews with Marcan where you can see pianos in the background, for example here: https://youtu.be/dF2YQ92WKpM?t=989

He/she is quite the creative person I must say.


Better believe in cyberpunk futures; you're in one.


I wonder how long it's going to take for games to start generally supporting ARM. Getting Linux running well on M1/M2/etc.. seems like only half the battle for making a good gaming machine out of these.


Desktop games*. ARM is already a major target for gaming. Games are 61 percent of App Store revenue and 71 percent of Play Store revenue. And mobile games are the majority of the gaming market (51%).

https://www.businessofapps.com/data/mobile-games-revenue/


Getting Waydroid to run well on Linux still has big issues, unfortunately, if you want to use it to play Android games. I managed to get it working for a few months on a previous Ubuntu version; now I can't get it to work at all.


How many sprites do you need to add to a one-armed bandit before it becomes a game?


I wouldn't count on many developers going back to update old games with ARM support. It's more likely that the community will work to build some sort of Box86 + Proton stack to get games working, which should get a lot of the classics working[0]. From there, I think the struggle will be getting Box86 to run fast enough for modern games. Apple's ARM CPUs have great IPC, but that can still get annihilated when it's forced to simulate SIMD/AVX instructions. I assume Apple has some sort of vector acceleration framework in Apple Silicon, but it will take time and effort to reverse-engineer and implement.

Things are certainly looking better than they did a couple years ago, but getting ARM to run x86 code faster-than-native is an uphill battle. Maybe even an impossible one, but I've been surprised before (like with DXVK).

[0] Crysis on a Rockchip ARM SOC, for example: https://youtu.be/k6C5mZvanFU?t=1069


It's a little frustrating how it's the norm in the game industry for companies to toss a binary over the wall and maybe patch it for a short period after release (not a given, ports in particular are vulnerable to being forever stuck at 1.0), with significant technical updates being out of the question until it's been long enough for them to try to sell you a remaster.

Not having any experience in that industry, I wonder what the driving forces of this are. I suspect it's some combination of incredibly brittle codebases that cease to build if glanced at the wrong way and aversion to spending anything on games post-release.


>aversion to spending anything on games post-release.

I am pretty sure that is the answer. Unless the game is Cyberpunk levels of unplayable, there is no money in post release support unless it is bundled with DLC or GOTY releases.

Back in the day it was a pretty commonly cited figure that like 90% of a game's revenue came in the first 3-4 weeks of release. DLC and "seasons" are an attempt to stretch it out and make more off a single release, but I haven't heard how well that works.


> DLC and "seasons" are an attempt to stretch it out and make more off a single release, but I haven't heard how well that works.

That also dates back to back in the day; we just called them expansion packs.


> I suspect it's some combination of incredibly brittle codebases that cease to build if glanced at the wrong way and aversion to spending anything on games post-release.

The primary reason is that there's no money in it. Like movies, your "one shot" game (without some sort of continuous billing e.g. mmo, subscription, continuous stream of DLCs) makes most of its revenue in the first few weeks, and once the kinks are ironed out what it makes afterwards doesn't really depend on maintenance.

Additional maintenance doesn't pay for itself, the producer doesn't pay the devs for that, and thus the devs take on the next contract to pay the bills. Not to mention additional maintenance is a risk.


Most of the time if the game was good and it’s been abandoned, someone will make a remaster or a modern take on it which now works on modern systems again


Not supporting new platforms is because

- Total dependency on an engine's build system

- Lack of official support for uncommon platforms

- Extremely low expected ROI even if it were possible to deliver on other platforms

Gamedevs aren't in the business of building platforms, they're in the business (mostly) of consuming them and going where the players are.

Gamedevs not updating is because

- The engines themselves are indeed outrageously brittle at times, with LTS releases sometimes containing significant bugs that persist against newer releases of minor and major versions

- New releases can actually cause dramatic regressions, not just in terms of bugs, but in terms of features, stability, binary size, and more

- AAAs are wasting time chasing the next big thing, non-AAAs are struggling with few people and need to constantly be building the next thing because they're building products, not services

- Gamedevs are largely media/entertainment companies, very few act like technology companies


I much prefer that model over services that suddenly disappear.


Not to mention the predatory subscription models that are rampant in mobile for software that is equally as broken, just costing more.


The "binary abandonment" model can have effectively the same result, though.

An example that comes to mind immediately is how much of a mess it is to get games that were built with Games for Windows Live, like the PC port of Fable 3, running on modern Windows. It's possible, but there's a ridiculous number of hoops to jump through, none of which would be necessary if Microsoft shipped a quick and dirty update that pulled out the Games for Windows Live dependency.


GFWL games are really an exception -- most games on Windows or Proton work fine years down the road.


> I assume Apple has some sort of vector acceleration framework in Apple Silicon, but it will take time and effort to reverse-engineer and implement.

I'm pretty sure it's just vanilla ARM NEON, so I don't think it will take any reverse engineering. The Apple Silicon GPU is custom, but the CPU is just minor extensions to (and compatible with) AArch64. Rumour has it that this is because AArch64 was designed by Apple and donated to ARM (with whom Apple has a close relationship, being that they were a founding member).


Interesting, that's what I was curious about. NEON is a bit slow last I checked, but at least Apple is sticking to spec here. It does make me wonder how much performance is left on the table for ARM architectures that want to emulate x86, though.

...it also raises the question of how emulated titles fare against translated ones. It would be fascinating to see how something like Dark Souls Remastered performs through Yuzu vs DXVK on Apple Silicon.


Apple supports NEON (and uses it for most SIMD code), but it also has other proprietary ISA extensions, e.g. for matrix multiply.


note that the proprietary extensions are extremely unlikely to gain linux support though, unless upstream arm adopts something similar

though amx won’t help out with emulation much


The problem with Box86 is that it requires 32 bit ARM, which Apple Silicon does not support.


I'd guess that will probably only happen when either windows gets widespread ARM adoption, or there's a new Xbox or PlayStation console that uses an ARM processor. Which... might be a while.


The Nintendo Switch console already uses an ARM SoC by Nvidia. But I'm not sure whether this has meaningfully increased the probability of porting games to MacOS. The Switch uses Vulkan, but Apple uses Metal, a proprietary graphics API. Whether ports make sense probably depends on how strongly the Mac market share increases compared to Windows.


The Switch can use Vulkan but it's unusual in offering a wide range of APIs, from OpenGL and Vulkan (the implementations likely derived from Nvidia's existing PC driver) or a custom low-level API tailored to the hardware called NVN. From what I gather from the emulation scene, the majority of Switch titles with non-trivial performance requirements use NVN. Even idTech, which famously uses Vulkan on PC, uses NVN instead for its Switch ports.


Interesting, I didn't know that. I thought Vulkan was already pretty low-level compared to OpenGL.

Then I guess Metal on Mac can't be such a big issue either.


There's the Nintendo Switch which does get ports.


Linux gaming has been developing in the opposite direction for a while, moving away from even x86 Linux native ports and toward running x86 Windows games under emulation.


They are running x86 Windows games under Wine. Remember that Wine stands for Wine is not an emulator.


The recursive acronym tradition of course (GNU's Not Unix, Eine Is Not Emacs etc) traditionally implied the implementation being superset or better than the thing it's replacing and referencing in the acronym.

Wine FAQ concludes

> "Wine is not just an emulator" is more accurate. Thinking of Wine as just an emulator is really forgetting about the other things it is. Wine's "emulator" is really just a binary loader that allows Windows applications to interface with the Wine API replacement.


Try to remember that hardware emulation is not the only kind of emulation.

Wine (written as WinE when I first encountered it, IIRC) emulates the Windows runtime environment.


16k pages is going to make everything hard, I'm not holding my breath.


What role does page size play here, and why is 16K a problem?


The page size dictates the minimum size and alignment requirements for `mmap`, and also for regions of memory with different levels of protection (e.g. read-only vs read+write vs read+execute, etc). If a program expects to be able to `mmap` in 4kb chunks and can't, it will probably not work properly.

On macOS, IIRC the userspace and kernel-space page size can be different and different userspace programs can run with different page sizes; however, on Linux the page size is currently fixed across the system and set at compile time. The M1's IOMMU only supports 16k-aligned pages, so memory regions that need to be shared with other hardware (e.g. the GPU) need to be 16k-aligned. As such (and because Linux doesn't currently have great support for mixed page sizes), the Asahi Linux project has decided to run with 16k pages globally. However, that breaks a number of applications that are expecting 4k pages.

More info: https://github.com/AsahiLinux/docs/wiki/Broken-Software
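As a small illustration of the constraint, portable code asks the OS for the page size at runtime and rounds mapping lengths up to it, which works the same whether the answer is 4 KiB or 16 KiB. A minimal sketch:

  #include <stdio.h>
  #include <sys/mman.h>
  #include <unistd.h>

  int main(void)
  {
      /* On Asahi Linux this reports 16384; on most x86 Linux systems, 4096. */
      long page = sysconf(_SC_PAGESIZE);
      printf("page size: %ld bytes\n", page);

      /* mmap lengths and mprotect boundaries must be whole pages, so round an
         arbitrary request up to the runtime page size instead of assuming 4096. */
      size_t want = 10000;
      size_t len = (want + page - 1) / page * page;

      void *p = mmap(NULL, len, PROT_READ | PROT_WRITE,
                     MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
      if (p == MAP_FAILED) { perror("mmap"); return 1; }
      printf("mapped %zu bytes at %p\n", len, p);
      munmap(p, len);
      return 0;
  }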


Damn. There's some really important software on that list: libvirt/QEMU/KVM, LVM, WINE


I imagine that there will be a lot of work to improve this over the coming years, not just because of Asahi, but also because of the cloud ARM systems that are being developed.

I think that Fedora may be leading the pack here, see https://danielpocock.com/power9-aarch64-64k-page-sizes/


Pages have been 4k on a lot of systems for 30+ years.

That means a lot of software has come to assume that.

Certain memory buffers need to be page size aligned, or a multiple of pages long. Code can only be loaded to a page aligned memory address. Memory mapping and read/write/execute permissions can only be set on a per-page basis.

If all that stuff is hardcoded now, there will be lots of fixes necessary to make things work properly with a different page size.

And those fixes probably will need the software to be recompiled. And some software is only distributed in binary form, and getting someone to recompile it may be nearly impossible.
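A tiny example of the kind of hard-coded assumption that breaks: aligning an address down to a "4096-byte page" before calling mprotect() can produce an address that isn't actually page-aligned on a 16k kernel, so the call fails, whereas deriving the mask from the runtime page size works on both. A hypothetical sketch:

  #include <stdint.h>
  #include <stdio.h>
  #include <sys/mman.h>
  #include <unistd.h>

  /* Buggy: assumes pages are always 4096 bytes. On a 16k-page kernel the
     result may not be page-aligned, so passing it to mprotect() fails. */
  static void *align_down_hardcoded(void *p)
  {
      return (void *)((uintptr_t)p & ~(uintptr_t)4095);
  }

  /* Portable: derive the mask from the runtime page size. */
  static void *align_down_portable(void *p)
  {
      uintptr_t mask = (uintptr_t)sysconf(_SC_PAGESIZE) - 1;
      return (void *)((uintptr_t)p & ~mask);
  }

  int main(void)
  {
      long page = sysconf(_SC_PAGESIZE);
      void *buf = mmap(NULL, 4 * page, PROT_READ | PROT_WRITE,
                       MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
      if (buf == MAP_FAILED) { perror("mmap"); return 1; }
      void *inside = (char *)buf + page + 5000;  /* arbitrary interior address */

      printf("hardcoded alignment: %p\n", align_down_hardcoded(inside));
      printf("portable alignment:  %p\n", align_down_portable(inside));

      /* Only the portable result is guaranteed to be a valid page boundary. */
      if (mprotect(align_down_portable(inside), page, PROT_READ) != 0)
          perror("mprotect");
      munmap(buf, 4 * page);
      return 0;
  }

Recompiling with the runtime query fixes this kind of thing, which is exactly why binary-only software is the hard case.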


Sibling comments said it all, though "The Quest for Netflix on Asahi Linux", posted on HN [1], is a very good, detailed explanation of this and a nice read.

[1] https://news.ycombinator.com/item?id=35081510


I think the Asahi project will release a 4k kernel version at some point for those who really need/want it. As I understand it there are no technical barriers; they're just delaying it to push more projects into supporting the 16k mode (which has better perf).


I believe 4k pages work with Asahi Linux today. However, while the CPU can do both 4k and 16k pages, the GPU is 16k pages only. So you give up accelerated 3D to run 4k pages.


Looks like 4k support has just been added to the GPU driver https://twitter.com/LinaAsahi


Nice, I was right ... up to yesterday. Now it's fixed. Things are moving quickly.


I have Steam on a Mac, and roughly 1/3 of my library supports M1 Macs. I have some old games in my library, so that's pretty decent numbers for a relatively new platform that is 64 bit only, relatively niche in general, and extremely niche for gaming.


Buy a Ryzen mini PC with a Radeon 680M and get that now with HoloISO? M2 really isn't that fast. And as a bonus you won't have to run every game under a translation layer.


"Since the Mesa driver no longer serializes GPU and CPU work, performance has improved a ton. Now we can run Xonotic at over 800 FPS, which is faster than macOS on the same hardware (M2 MacBook Air) at around 600*! This proves that open source reverse engineered GPU drivers really have the power to beat Apple’s drivers in real-world scenarios!"

Wow that's impressive!


It is impressive, but worth noting that the macOS version was running x86 code under Rosetta. Hmm, I guess that's also impressive...


I follow Lina, Alyssa and Hector from the Asahi team on Mastodon and all three were great choices to follow. They all post interesting stuff regularly, and Lina in particular is great. She regularly livestreams her coding sessions, announcing what she'll work on ahead of time so you can set yourself a little reminder. I'm really in awe of these people.


Sincere, not kiss-ass question here: are they low-key becoming the best communicators in the Linux world? Or are there equally well-documented projects that just aren't getting the same heat for whatever reason?


My personal favorite is the This Week In[0] series of posts, if you want a simple way to keep track of notable changes. Technical blogposts are a pretty common practice among reverse-engineers, too; the Dolphin emulator has some great breakdowns[1], along with the people who reverse-engineered the Nintendo Switch's boot process[2] (and the rest of LiveOverflow's stuff).

The Asahi writeups are great, but certainly not all there is. Tons of reverse-engineering stuff and Linux documentation gets submitted to this website, it just doesn't generally do as well in the ranking system.

[0] https://thisweek.gnome.org/

[0] https://pointieststick.com/category/this-week-in-kde/

[1] https://dolphin-emu.org/blog/

[2] https://youtu.be/Ec4NgWRE8ik


Sweet! Thanks for opening my mind!


I think the Asahi Linux project rekindles some of that excitement from the earlier days of Linux on PC-compatibles in the late '90s and early '00s.

Some people might not remember this, but hardware support for Linux was a real crapshoot back then, and it's only "mostly smooth" on PC today because we have a quarter century of work building out drivers and modules for the platform.

It's fun watching the breakneck pace at which they are going through the same processes with a brand new and proprietary consumer-oriented computing platform.


Yeah, I feel like they’re tackling some very hard problems with tons of enthusiasm and class.


I don't follow the space super closely so I may be mistaken, but the impression I get is that Asahi posts are more likely to be posted/shared in less niche tech-related spaces, whereas most other Linux news tends to stay firmly within the Linux/GNU sphere. So if nothing else, Asahi's communications are more generally visible.


LWN is super great


So true. Thank you.


I have been a devoted ThinkPad + Arch user for the last 10 years. I just want a nice ARM machine now, and it seems the best option at the moment is a MacBook Air M2 + Asahi. I do not know how to feel about it, maybe a bit of sadness, but I wish Asahi great luck.

Also, every blog post they publish is a great read!


Why do you feel sad? Apple Silicon Macs are fairly open hardware, I see it as a win that there are ARM64 machines now that can run Linux and are competitive with x86_64.


> Apple Silicon Macs are fairly open hardware

If that were remotely true TFA wouldn't be about reverse engineered GPU support on Apple Silicon. Nor would there really be a need for Asahi Linux to be developed in its own silo while it gets Apple Silicon support hammered out.


This isn't really unique to Apple Silicon. The difference here is that most proprietary systems Linux has to work with on PC were reverse engineered many years ago, while AS is being reverse engineered today.

The Asahi Linux developers themselves have praised the openness of Apple Silicon, not because they have access to documentation or source code that we don't, but because it seems Apple has gone out of their way to make sure their platform can securely accommodate third party operating systems even though they have no incentive to. It's surprising in contrast to Microsoft, who has been slowly trying to make booting Linux on PCs that ship with Windows harder and harder.

I definitely think calling Apple Silicon an "open platform" is a bit of a stretch, but it's not the iron clad walled garden people think it is either.


> The difference here is that most proprietary systems Linux has to work with on PC were reverse engineered many years ago

Such as what, though?

Intel and AMD both wrote support for their systems themselves. Nvidia has long offered a proprietary driver for Linux users, and even Intel MacBooks were a shoo-in once the firmware was sorted out. It's been a long time since someone has approached a full-scale reverse engineering project like Asahi, and I think characterizing it as "non-unique" undersells the amount of bespoke work here.

Apple made the right move by continuing to allow third-party OSes, but that's not equivalent to building out support. The work required to bring up a black-box SOC is hugely distinct from using first-party drivers to boot into Linux through UEFI.


> Such as what, though?

Drivers for WiFi, audio, Bluetooth, a heap of I2C devices like keyboards on laptops and temp/fan control, graphics cards, and much much more.

Not a single company “built out support” for all these things. And none of it is covered by some common interface — each must be reverse engineered (or implemented following reference manuals, if they are available). Intel and AMD did not provide support for these, because they can’t — the processor architecture is oblivious of these peripherals.

I think you’re underestimating how much volunteer work has been done to get Linux to be usable on any machine. From your wording, I suspect you may think there are some grand unifying abstractions that, when implemented once, provide compatibility with most machines, and that Intel and AMD did just that. But that would be mistaken.


> I suspect you may think there are some grand unifying abstractions that, when implemented once, provide compatibility with most machines

UEFI provides some of this, but UEFI didn't become ubiquitous on consumer hardware until the 2010s.


I'm comparing it more to the state of Linux on PC in the '90s and early '00s, when most vendors didn't care about Linux unless you were buying a server. Getting Linux running on a laptop back then was often a mess of hacky reverse engineered drivers, sometimes with incomplete or missing functionality.

Of course what Asahi Linux has undertaken still feels like a bigger and more impressive task, I'm just saying that this kind of work is not entirely unprecedented. The current Linux ecosystem on Apple Silicon is more comparable to that of the PC Linux ecosystem from 25 years ago than from today.


ThinkPad also has proprietary undocumented subsystems that are reverse engineered to work on Linux. Such double standards versus Apple.


> ThinkPad also has proprietary undocumented subsystems that are reverse engineered to work on Linux. Such double standards versus Apple.

The differentiator for decades now is that Intel-based laptops, including ThinkPads, have been well-supported by mainline kernels, including GPU support. Support that Intel has directly funded the development and maintenance of.

Where are the @apple.com commits supporting Apple Silicon in mainline Linux?

Even AMD is better as of amdgpu.


Well supported by mainline kernels, thanks to reverse engineering undocumented proprietary Lenovo hardware. ThinkPads contain other components besides the GPU. Apple is not in the business of selling GPUs to other hardware vendors and the fraction of customers demanding Linux support is miniscule. Intel supports Linux as a business decision based on profitability, not open source idealism.


What components? Lenovo sells several ThinkPad models with Linux out of the box. The fingerprint reader is a source of trouble but it's not a Lenovo part.


Battery, BIOS, touchpad, power management, touch screen, fingerprint scanner, Nvidia GPU, etc.

Believe it or not, they sell hardware with Linux OOTB thanks to others reverse engineering their proprietary subsystems.


Over the years I've seen numerous thinkpad tweaks for wifi, sleeping with the lid closed, restarting wifi on lid open, numerous tweaks for the function keys (audio up/down/mute, brightness up/down), clock speed/thermal management, reading battery levels, etc.

Seems like the #1 reason thinkpads are well supported is that they are relatively popular among the people who modify the OS and kernel for compatibility.

I don't recall that Lenovo is a particularly big contributor to the linux kernel.


> Well supported by mainline kernels, thanks to reverse engineering undocumented proprietary Lenovo hardware.

Such as? What "proprietary Lenovo hardware" is obfuscating my boot process, I'm really curious now.

> Intel supports Linux as a business decision

Yes. Intel supports Linux because the concept of selling Unix doesn't work. Take it from Apple, who gave up on selling their OS after realizing that people were really only in it for the hardware. If Intel is the begrudging neighbor to Open Source software, then Apple is holding them in a Mexican standoff with their userbase. Somehow, Intel's "business decision" manages to be the more civilized relationship between the two.


Sure it has some undocumented subsystems, but compared to the Macs, it has a much stronger claim to openness. Apple has released virtually no information on the internals of these machines, and they aren't standard PCs like the Lenovos. Calling them 'open' is absurd, it would be hard to imagine a publicly released general-purpose computer that is less open.


> but compared to the Macs, it has a much stronger claim to openness.

False. Lenovo hardware contains a lot of proprietary undocumented parts that required reverse engineering to get working on Linux/BSD. The parts that didn't require reverse engineering (Intel GPU) were not made by Lenovo.

Apple hardware is "open" as far as they don't try to prevent other operating systems to be installed. Apparently they made it reasonably straightforward for the Asahi team.


> False. Lenovo hardware contains a lot of proprietary undocumented parts that required reverse engineering to get working on Linux/BSD. The parts that didn't require reverse engineering (Intel GPU) were not made by Lenovo.

How is it false? Whether the parts were made by Lenovo or not is irrelevant. It may not be 100% open (and this is probably not due to parts that Lenovo themselves created), but it is substantially open, which is far more than can be said about the Macs, which are virtually undocumented system-architecture wise, and who knows what Apple will do in the future to hamstring efforts to use them outside the walled garden.

> Apple hardware is "open" as far as they don't try to prevent other operating systems to be installed. Apparently they made it reasonably straightforward for the Asahi team.

That is not "open", it's just "not openly hostile to reverse engineering...yet".


> but it is substantially open

No it is not. I'm amazed that people just don't get this simple fact. The drivers were reverse engineered. ThinkPad is not an open platform. It contains some Intel and AMD stuff that is open, but you can't give credit to Lenovo for that.


You keep saying that, but you have yet to provide any evidence. What essential drivers were reverse engineered, exactly. As far as I can tell, everything required to boot to a working graphical desktop is well documented.

I'm not giving credit to Lenovo, I'm saying that the platform is mostly open because it is based on mostly open components. In contrast to Apple's devices, where the platform is closed because it is based on undocumented components. You could say the same about pretty much any standard PC, Lenovo is just one of many vendors, I have no idea why they got singled out here.

But they do sell boxes explicitly qualified and supported to run Linux, and they do contribute to the Linux kernel development process.

The fact remains that claiming any of Apple's hardware platforms are remotely open is laughable.


Google it

"ThinkPad linux driver reverse engineered" or "lenovo linux driver reverse engineered"

Power management, I2C devices, touchpad, touchscreen, audio, WiFi, bluetooth, etc. Many things besides the GPU. Maybe some of it is open now (Intel parts) but that wasn't always the case.

As another smart person (smoldesu) pointed out here, Lenovo has recently started contributing some updates to the kernel, but the vast majority has been reverse engineered over decades.

The attitude of Linux devs was always to accept that hardware is proprietary/undocumented and get to work on reverse engineering. Then users take it for granted that stuff just works and have no appreciation of the effort that it took to get it working.


> Apple hardware is "open" as far as they don't try to prevent other operating systems from being installed.

1. This is not true for "Apple hardware" as a rule.

2. Removing support for third-party OSes would be a shocking product regression for the Macbook.

3. If Apple's definition of "Open" excludes any transparent documentation or explanation, then they have provided precisely nothing.

You contradict yourself by praising Apple for keeping standard features while deriding Lenovo for doing the same thing. All of this ignores the Linux certification Lenovo offers on their products, their Linux support contracts and even the freely-provided firmware updates through fwupd (something Apple will never provide). Regardless of whether you characterize "open"-ness as non-hostility or constructive support, Lenovo is still the more open company by a country mile. And Lenovo doesn't even do that much, to boot.


>You contradict yourself by praising Apple for keeping standard features while deriding Lenovo for doing the same thing.

Putting words in my mouth. I never praised Apple and I never derided Lenovo. I am simply stating the facts. I am trying to explain that both companies have the same approach. Neither of them are open source idealists. Lenovo is not more open than Apple. Apple is not more open than Lenovo. I am pointing out the double standards and hypocrisy in this thread. I have owned many Lenovo and Apple products and I'm not a fanboy of any company.

> 1. This is not true for "Apple hardware" as a rule.

It is true for their laptops and desktops. iPhone/iPad are not relevant to this discussion.


> Lenovo is not more open than Apple.

This is straight-up untrue, though. In this specific situation, they are markedly more open than Apple.

Here is their commit for ACPI support: https://git.kernel.org/pub/scm/linux/kernel/git/rafael/linux...

Here is their commit for always-on USB power: https://git.kernel.org/pub/scm/linux/kernel/git/pdx86/platfo...

Here is the official hwmon patch for an otherwise unsupported laptop: https://git.kernel.org/pub/scm/linux/kernel/git/groeck/linux...

Lenovo is doing what Apple doesn't, and publishing their contributions as GPL code. In this particular arena, they are provably more open in the sense that they make official Linux contributions and Apple does not.

I too have owned hardware from either company, and have plenty to complain about for both. One thing I cannot deride is the quality of first-party Linux support for my Lenovo hardware. It's not perfect and they're an ill-fit successor to IBM, but they make marked FOSS contributions that other companies would refuse. Because these changes are made freely available with an Open license, I think it's fully fair to say that Lenovo is shipping more Open systems than Apple is. Like I said in my other post, they don't even have to do much to cement themselves in that position either, just offer a few of their own patches.

> It is true for the current hardware. [sic]

> It is true for their laptops and desktops. iPhone/iPad are not relevant to this discussion.

Ah, there's the caveat. We can agree to disagree, frankly I'm more interested to see where the legislation takes this.


I am neither an Apple lover nor an Apple hater. I think Apple produces extremely well polished products thanks to their vertical integration, and they have innovated in hardware, pushing the frontier, especially with their ARM chips. However, they also have a fairly closed ecosystem in which they try to overprice things when they can, abusing their dominant position. I would feel sad to contribute to its success, but I also selfishly really need the best tool available.


Apple's computers are actually quite competitive when comparing price-to-performance. Beyond that, they have great build quality, battery life, components, and support which all adds up to being a great package for their cost. Seriously, this whole "Apple is too expensive" talking point died long ago. When the MacBook Air first came out, it was $1,800 ($2,500 adjusted for inflation) with a slow Intel C2D proc. Now you can get an M1 MacBook Air for $1k or an M2 for $1,200.


I'd argue price/performance is where they are the weakest. Especially when you want more than the bare minimum ram or ssd. An Apple replacement for my current notebook would cost over $5k and I spent a fraction of that.

Battery life is really the core differentiator at the moment.


Did you take into account the different RAM architecture? Sure you can get more RAM, but that's not nearly the same as what the M series offers (and it shows in benchmarks) - you might be better off with a 16GB MacBook than a 32GB laptop from another vendor. Oh, and it's not like this form factor, higher quality displays, etc. would be cheap from other manufacturers.


It's on-die, but it's not really any faster. The latency and bandwidth are pretty OK by today's standards. I suspect it's on-die because M1/M2 grew out of mobile CPUs. You might be referring to fast SSDs, but that's mainly true only for Pro versions. People who need >32 GB RAM usually know why they need it, you cannot really be running simulations out of your SSD swap. I'm not saying it's a proper use-case for a MacBook, I'm just saying that this guy might have some special requirements that do not align well with Apple laptops price-wise.


Bandwidth on the Macs is pretty good: M2 = 100GB/sec peak, M2 Pro = 200GB/sec peak, M2 Max = 400GB/sec peak.

A $3k Lenovo ThinkPad P16 uses DDR5-4800, or 76.8GB/sec peak. That also ignores ARM's relaxed memory model, which means you get (on average) a greater fraction of peak bandwidth when running something memory intensive.

So Apple does 1.3x, 2.6x, or 5.2x better. On a desktop you can get another 2x with the M1 Ultra. That seems quite a bit better than "pretty OK"; it's a big part of why the Apple chips get great GPU performance compared to Intel/AMD laptops with an iGPU, and run at a small fraction of the power of the dGPUs used in laptops.
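(Back-of-the-envelope check of those ratios, if you want to sanity-check them; the laptop-side channel count and bus width below are my assumptions about a typical configuration, not spec-sheet facts:)

  # Peak bandwidth ~= transfer rate (MT/s) x bus width x channels
  def peak_gb_s(mt_s, bus_bits, channels=1):
      return mt_s * 1e6 * (bus_bits / 8) * channels / 1e9

  # Assumed typical laptop config: dual-channel DDR5-4800, 64-bit per channel
  laptop = peak_gb_s(4800, 64, channels=2)   # ~76.8 GB/s

  # Apple peak figures as quoted above, in GB/s
  for apple_peak in (100, 200, 400):
      print(round(apple_peak / laptop, 1))   # -> 1.3, 2.6, 5.2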


The RAM on the M1 is on separate dies in the same package. This lets Apple make fewer die SKUs, as well as use differently optimized die processes.

https://www.macrumors.com/2021/04/06/m1-mac-ram-and-ssd-upgr...


> An Apple replacement for my current notebook would cost over $5k and I spent a fraction of that.

What if you price out a replacement with the full RAM and SSD capacity you want from the OEM rather than as an aftermarket upgrade? I think the problem usually is not so much "Apple over-charges for upgrades" as it is "all OEMs over-charge for upgrades", with a side of "Apple uses non-upgradable storage".


> What if you price out a replacement with the full RAM and SSD capacity you want from the OEM rather than as an aftermarket upgrade?

Moot point. Of course if you tie your hands behind your back your options will be limited. The point, for the parent, is that they aren't limited by insane markup pricing.


It's important to correctly identify the underlying problem and whatever tradeoffs are involved. It's unproductive to bitch specifically about Apple's expensive storage and memory upgrades when it's actually an industry norm. It might be more fruitful to discuss why OEMs in general are able to get away with such steep upgrade pricing, and it's definitely more interesting and appropriate for HN to debate the pros and cons of Apple's soldered memory and storage.


It’s also important to realize that no one was “bitching” in this thread. It was claimed that the price wasn’t all that bad, to which someone raised a counterpoint.

The reality is that, if you need a lot of RAM and SSD space, it’s going to cost you a lot more than buying a laptop and replacing the RAM and SSD yourself.

If someone said that the price of SSD and RAM in, say, a System76 laptop was outrageous and that’s why they won’t buy one, that would be a bit silly since they can upgrade those themselves.

What you can’t do is perform a RAM or SSD upgrade on a MacBook. So it’s a reasonable issue to have with their pricing.

To throw one more datapoint in: for my own development, I have to closely manage (closing and reopening stuff constantly, paying the cognitive overhead of context switching as I go) just to keep RAM use between 32gb-64gb — usage never managed to go below the former, and the latter is the total my laptop can support. I’m usually sitting around 90%-95% utilization. So 64gb is an absolute minimum for what I can reasonably get away with (and I’d be much more productive if my laptop had the same 128gb my desktop has).

Some people just need as much RAM and storage as they can get their hands on, and that quickly makes the MacBook a really expensive option. No bitching (really, no sentiment at all), just facts and reasoning.


Meanwhile, new 4000-series laptops are routinely costing more than MacBook Pros; Apple is winning the price/performance category in every style of laptop, as much as you may not like it.


The base models are reasonably competitive. But if you want to upgrade anything then they become extortionate.


> However, they also have a fairly closed ecosystem in which they try to overprice things when they can, abusing their dominant position.

They have ~16% of the global market. In no way do they have a dominant position.


Not wanting to start an anti-Apple thread, but I meant a dominant position in their closed garden. Whether that's abuse or not, I'll let the litigators decide.


As another person who loves free software, I'm a little sad that after so much progress has been made in other areas, when it finally looks like x86's dominance of the desktop and laptop CPU markets is starting to slip, the most attractive contender for a non-x86 Linux laptop is a proprietary platform, Apple's or otherwise.

And I do have a distaste for putting money in Apple's pockets for various personal reasons, from the huge pile of cash they're sitting on, to the way they repeatedly updated my long-dead iPod's firmware just to break its compatibility with open-source tools for syncing music to it way back in the day, to the upstream-hostile forking of WebKit from KHTML, to their utterly cynical use of 'open-source' as a marketing ploy when they launched OS X. So if I do get an Apple Silicon Mac, it will have to be used even though I wouldn't otherwise want it to be.

Because most other hardware vendors suck, too, there are still things Apple could do, short of becoming some kind of open hardware company, that would make me reconsider that general sense of hostility I've gotten from them over the years, which I acknowledged in this other comment on this post: https://news.ycombinator.com/item?id=35233479

I just wish my interest in the Apple Silicon option could be wholehearted enthusiasm, instead of something complicated by my relationship to a company whose ethos screams that my values are unimportant on a bunch of different levels.


Not the commenter, but I'm a little sad ThinkPads didn't keep up.

They have an ARM64 offering, but it's PlaySkool premium.


As someone with an M1 Pro, keep in mind that there's no support for external monitors aside from the main laptop display, yet.


using an external monitor on an m1 pro at work.


On Asahi Linux at work?


Are you talking about support under Asahi? Because macOS definitely does have support:

> Display Support
>
> Simultaneously supports full native resolution on the built-in display at 1 billion colors and:
>
> - Up to two external displays with up to 6K resolution at 60Hz at over a billion colors (M1 Pro), or
> - Up to three external displays with up to 6K resolution and one external display with up to 4K resolution at 60Hz at over a billion colors (M1 Max)
>
> Thunderbolt 4 digital video output
>
> - Native DisplayPort output over USB‑C
> - VGA, HDMI, DVI, and Thunderbolt 2 output supported using adapters (sold separately)
>
> HDMI digital video output
>
> - Support for one display with up to 4K resolution at 60Hz
> - DVI output using HDMI to DVI Adapter (sold separately)

https://support.apple.com/kb/SP854?viewlocale=en_US&locale=e...


The topic here is Asahi Linux. Obviously external displays work on MacOS.


And the end goal is to upstream all the work so that we can run, for example, Debian on Macs?

Also, is anyone else afraid of the possibility of Apple deciding to screw us up by imposing restrictions to prevent people specifically from doing this for...reasons?


> Also, is anyone else afraid of the possibility of Apple deciding to screw us up by imposing restrictions to prevent people specifically from doing this for...reasons?

According to Marcan, Apple explicitly went out of their way to support secure booting of other OSs as well.

Also, it’s hard to predict, but I think it would only increase revenue if the small but rich Linux-using software community chose MacBooks as “the next ThinkPads”, and it’s not like most people wouldn’t just keep both OSes available and switch between them.


As I understand it, while the core pieces required for Linux to run on Apple Silicon will be upstreamed, there are parts that smooth the experience out and make it more practical that are unlikely to be integrated into other distributions, which necessitates Asahi's continued existence as a distribution.


It should be noted, Marcan has said that the ultimate, eventual goal is for Asahi to no longer have to exist as a distro. That may be a long way off however!


This is a similar model to the KDE Neon distribution, which upstreams basically everything but gives some benefit to developers working on the KDE project.


I love reading things like this that reinforce my belief that I personally know very little about anything. _back to writing CSS for me :)_


The key to understanding this world is that nearly everybody does these things full time for their own interest. Trying to accumulate this level of expertise on nights & weekends is an express train to burnout. On the other hand if your financial needs are satisfied (worked at FAMANG for a decade while being frugal, doing contracts for 1/3 of the year & being frugal, married a doctor, got into cryptocurrency early, donations if you're anomalously popular) and you're reveling in the joy of self-indulging exploration, learning, and tinkering without having to worry about giving an update at daily standup tomorrow, it's amazing what you can accomplish. Knowledge is cumulative. Within a few years many people could be at this level. This is part of why UBI is so popular within software engineering circles, because you really don't need that many resources to just be in it for the love of the game.

On the other hand, having kids is basically anathema to being able to live this life. So you are choosing work (in a broader sense of the term than conventionally used) over family.


I'm the father to a 27 year old, step-father to a 31 and 33 year old. I started my journey into audio software when my daughter was about 2 years old and I was a stay-at-home parent. I'm often considered one of the most senior people in audio software development in the world now (a bit of an illusion caused by the invisibility of people inside proprietary development processes, but I'll take it).

I can take you on a deep dive into every aspect of audio software. My kids and my wife did not obstruct that, and if anything, having responsibilities towards them forced me to be even better at the process that got me to where I am today.


Sorry, I should have specified: my post was specifically concerning FOSS development. Of course people inside large companies developing proprietary software can attain a high level of knowledge about whatever field even if they have families, because they're working on it full time over many years and being paid for it.

Although it's unclear, perhaps you are talking about FOSS development (actually looking up Ardour from your profile, yes you definitely are). That's very impressive, then! So you did manage to juggle having a full-time job, FOSS development, and a family?


I was initially financially independent (due to amzn), playing the role of stay-at-home parent and slowly expanding the FLOSS project's role; in about 2008, the FLOSS project became my only source of revenue.


Thanks for posting this. I see kids in my not-too-distant future, and while I personally don't believe this, I see lots and lots of comments on the internet saying that having kids basically stops everything else dead in its tracks for ~2 decades.

There are so few people with experiences like yours that it's sometimes hard to not let that get into my head.

So it's refreshing to see a different opinion.


A lot of generalities don't really apply to you if you were one of the two other people building Amazon.com in Bezos' garage: https://www.wired.com/1999/03/bezos-3/


Amzn allowed me to (a) be a stay-at-home parent (b) work on a FLOSS project for several (maybe 10) years without needing to generate any revenue.

It did not have any impact on my curiosity or learning process during the journey I've been on for the last 25 years.

The point is that the presence (or absence) of children is not determinative of your ability to deep-dive. Or so I am claiming.


Right! My original quip about children was more about financial constraints than anything. Not trying to shame you or anything, I think using the Amazon windfall to work on FOSS is absolutely the dream and one of the best possible ways you could have contributed to the world.


It's a fair point that being able to tinker on FLOSS while my kid(s) were at school rather than having a full time job and leaving the tinkering for late at night was probably quite beneficial.

Nevertheless, I would still like to claim that kids or not is not determinative of your ability to do the deep dive.

I'd also like to think that, at some distance in the future, raising my daughter will turn out to have been the best possible way I contributed to the world. This software stuff is, in the end, a distraction for the most part (especially when there are/were already so many options in the particular niche that Ardour occupies).


I think swathes of free time (which sounds like you had at Amazon) with days on end of no meetings or commitments (or childcare) are the key.

I just did 5 months between contracts. The kids were at school for 6 hours a day. It took a good few weeks for the ideas to start flowing and my curiosity and enthusiasm to build, such that I actually started writing code and enjoyed the process.

It’s something I never experienced before with regular holiday leave, where all-day childcare would usually be involved.

It’s only recently I’ve been in a financial position to be able to create free time like this. It was really expensive though, and I’m unlikely to be doing it again for a few years.


I don't know what I expected, but it certainly wasn't that. Thanks for letting me know. Though I really don't know what to make of this now (if anything).


Maybe see my reply adjacent to yours in this subthread.


The big question I have is whether this can possibly support mandatory Vulkan features that are not available in Metal. The one I care about most is device scoped barriers, which in turn are needed for single-pass prefix sum techniques.


From what I read on IRC, full Vulkan is definitely the goal. And with some hacks it should be supported by the hardware.


The specific thing I asked for is very technically challenging. One of the things I can offer is some tests that fail in, among other things, MoltenVK, as the mapping to Metal barriers is imprecise. I'm not sure whether the Vulkan CTS tests will catch it; I've seen them miss memory model failures on other hardware. I'm always happy to chat about this if someone from the team wants to reach out.


Alyssa and Lina are very responsive on IRC: #asahi-gpu on the OFTC IRC network. You'd best ask them. Whether Metal or MoltenVK supports it doesn't matter; the hardware is capable of more.


Does anyone pay these people for their awesome work?

Every time I see their progress in a such undocumented space, my jaw drops. Huge respect.


Most of them accept donations:

> If you want to support my work, you can donate to marcan’s Asahi Linux support fund on GitHub Sponsors or Patreon, which helps me out too! And if you’re looking forward to a Vulkan driver, check out Ella’s GitHub Sponsors page!

Lina also accepts donations on her streams. I think Alyssa is funded by her employer, but I'm not sure.


[flagged]


Are you sure about this? The Asahi about page (https://asahilinux.org/about/) lists Hector Martin (marcan) and Lina as separate people. I assume Asahi Lina is a pen name, but I don't see anything to suggest it's Marcan's alter ego?


If you took that literally, and actually believe this is a real separate individual, and believe this is their real voice, then I don't know what to say to you.

https://www.youtube.com/watch?v=effHrj0qmwk

This is how it's done: Voidol2

https://crimsontech.jp/apps/voidol2/?lang=en

Demo: https://www.youtube.com/watch?v=Wfid5kXxjY4


That’s… odd. I just figured from the name that it’s someone who wanted to protect their privacy, but that’s certainly, uh, some persona.

If it is the creator, my charitable interpretation would be that it’s a form of experimenting with identity and gender.

It doesn’t sit well with me, though, when OSS projects list fake contributors. (Even projects as neat as Asahi.) I mean, morning me, afternoon me, and late night me are vastly different people, but I can’t pretend to be a three-man team just because of that!


On the internet, no one knows you're a dog. Who cares?


[dead]


Yeah, who cares? There's one of you folks soiling every good technical discussion about Asahi.


[dead]


Couldn't care less... even to click the link. Everyone has their quirks. What is wrong in your life that someone having weird fun is a factor in anything? Reminds me of the Michael Douglas character in Falling Down.


It’s kind of hilarious how the entire conversation about CPUs has steered towards Apple’s chips. No one talks about or mentions AMD or Intel chips anymore outside of gaming circles…


Are there server circles? I'd guess they talk about intel and AMD more than Apple.


Definitely are, lots of talk of general ARM usage in the server space in those circles, particularly when it comes to AWS' Graviton, or other hardware/cloud offerings.


Because it is kinda boring? Just like the M2 is much less interesting than the M1. Current-gen Intel or AMD chips are like 10% to 20% faster than Apple's (which Apple acknowledges in their marketing, and hence points at performance per watt). Intel uses about the same energy at idle but about double at load. AMD has about the same efficiency (work per watt-hour) under load but is worse at idle. And now nothing is happening until the next release.

> No one talks about..

Yeah, nobody cares unless something new comes out.


Good grief, HN. Downvote me too while you're at it, I wouldn't mind being karma-locked out of this site again for a few more blissful hours.


you live in a bubble


Noob question: given that Zink exists and works quite well, wouldn't it have been simpler to implement a Vulkan driver first and then just use Zink for OpenGL?


I think Zink requires a fairly feature-complete Vulkan implementation. Starting with a basic OpenGL implementation is definitely the quicker route to having a way to run and test some real applications.


Zink docs list a bunch of Vulkan extensions you need to support to be able to run Zink. The Raspberry Pi's v3dv Vulkan driver doesn't even support all the requirements, though it is only missing 1 extension.

Alyssa also works on the drivers for ARM Mali GPUs, which currently don't even support Vulkan 1.0.

https://docs.mesa3d.org/drivers/zink.html#opengl-2-1
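(Tangent: if you want to try the Zink path on hardware that already has a conformant Vulkan driver, Mesa lets you opt in via a loader override; a minimal sketch, assuming a Zink-enabled Mesa build and that glxinfo is installed:)

  # Minimal sketch: ask Mesa to route OpenGL through Zink (this needs a working
  # Vulkan driver underneath). MESA_LOADER_DRIVER_OVERRIDE is a standard Mesa
  # environment variable; glxinfo just reports which driver ends up in use.
  import os, subprocess

  env = dict(os.environ, MESA_LOADER_DRIVER_OVERRIDE="zink")
  out = subprocess.run(["glxinfo", "-B"], env=env, capture_output=True, text=True)
  print(out.stdout)  # look for "zink" in the "OpenGL renderer string" line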


To add to this, the faster route to something that runs was important because it was a goal of the devs to start dogfooding the distribution as soon as possible.


I THINK that I remember marcan addressing this in one of his youtube interviews:

This is actually what their plan is long term. That said, in the short term it's way easier to implement OpenGL: it gives them a simple way to explore the hardware, and it also means that real people will be able to run desktop Linux on Apple Silicon Macs way sooner.


I'm curious what rendering path Xonotic uses on macOS. Is it OpenGL? Apple supports that, but they run it on top of Metal, which leaves it buggy and with performance that's nothing to write home about.


Wow:

> So what does this all mean for users of the Asahi Linux reference distro today? It means… things are way faster!

> Since the Mesa driver no longer serializes GPU and CPU work, performance has improved a ton. Now we can run Xonotic at over 800 FPS, which is faster than macOS on the same hardware (M2 MacBook Air) at around 600*! This proves that open source reverse engineered GPU drivers really have the power to beat Apple’s drivers in real-world scenarios!


Ahhh this is so awesome! I'm now kicking myself for upgrading to an M2 Max thinking all M2 models were supported already, now I have to wait to see this massive performance improvement. Fantastic work by the Asahi team, it's great to see this come together and the blog posts are excellent summaries of information I otherwise wouldn't be able to grok.


This is gonna sound like trolling, but I promise I'm not being facetious or trying to make a point. I'm genuinely curious: what's the point of Asahi Linux? Why buy a Mac to run Linux?

If you're spending money on a Mac, I assume you want to buy into the whole MacOS environment; that's Apple's value proposition in my eyes.

Is it the M1? Is it that fast, and better than similarly priced laptops running x86-64? Or is it the novelty of using ARM-based stuff? Is the market for ARM-based laptops still Apple-only?

Also, is there any relevant limitation on stuff you can't do on MacOS through Homebrew or something and can on a Linux distro? (Not a Mac user so I don't know.)


The ARM Macs are seriously impressive hardware, even for just the build and tactile quality. But macOS is regressing quickly for professionals; there are too many design compromises to make the OS attractive and safe for 'casual users', but those same features are starting to become a hassle for professional users. Linux is the complete opposite of course: it's a hassle for casual users, but in exchange gives complete freedom to do what you want. Personally I'm still ok with macOS, but with each new macOS release the grass is looking greener on the Linux side ;)


Speaking as a Mac M1 user, but not a Linux user (I have played with Asahi, think it's great, but don't need it right now), I think almost no one (or a very small minority of people) buys Mac hardware to run Linux. I think it's the inverse: people buy Apple hardware and then realize Linux is available and want to run it.

Yes, I think the general opinion is that Apple's M1 and M2 platforms are superior to Intel, even at the Mac's (supposedly) higher price point.

Though MacOS is a complete Unix system, it is still proprietary, and there's nothing wrong with wanting to do your work (or play) on a free OS running on an excellent hardware platform. Asahi is giving people the opportunity to do that.

Finally, though I can't speak for the Asahi team, I think there's also an element of "because it's there". Here is a great new hardware platform offering an incredibly difficult challenge to a group of people who live for this kind of thing. Why would they not want to do it?


I buy “windows” hardware to run Linux.

I would buy an iDevice if I could take it home, boot it up to make sure it works then install Fedora with everything working.

Even my current laptop failed that, since I wanted to play with GPU programming and I had to hunt down drivers for the AMD APU — which I never got working 100% correctly, but that was probably my buggy code; GPU programming is hard.


It's the hardware that's the key attraction. I have a linux laptop. It's slow. Ugly. Awkward to use (keyboard, trackpad, etc). Etc. And I have an M1 macbook pro for work, which is the opposite. It's just a really nice laptop to use. Basically, unrivaled by anything PC based currently. A few vendors come close. But not all the way. And if you like using Linux, having a really nice laptop to run it on is a good thing.

I tend to run the same kinds of tools on both laptops (open source ones).

The Apple software experience matters less to me these days. I spend most of my time switching between the same applications that I would use on Linux and I mostly ignore all the iApps that come with macos. Beyond finder and preview, there aren't any Apple applications that I regularly use or need. Mostly I don't care about M1 vs. Intel. I'm not a native developer and all the stuff I care about is available for both cpu architectures. I just need the OS to get out of the way and allow me to do my thing. I used the linux laptop extensively for a while when I was without a Mac last year. Works great as a daily driver.


> Is it the M1? Is it that fast, and better than similarly priced laptops running x86-64?

That's definitely part of it. You probably need to include battery life for it to really make sense. There's nothing else that will give you that performance and close to 20 hours of battery life in a slim laptop form factor.

There's also people who are mostly happy using macOS but may want to boot into Linux for specific tasks.


Currently the M-series laptops have the best performance-per-watt on the market and great hardware. Now I just want a good Linux laptop.


Maybe you want to dual-boot? You deploy on Linux and want a more production similar environment for investigation but prefer OSX for day-to-day work. Or maybe you work on OSS and want to validate cross-platform issues. Or maybe you were work-issued a mactop and prefer Linux. Or maybe you just like the challenge of porting Linux. The reasons are plentiful.


> Also, is there any relevant limitation on stuff you can't do on MacOS through Homebrew or something and can on a Linux distro? (Not a Mac user so I don't know.)

- $Dayjob bestows a Mac, MacOS is fine but seems to have stagnated due to focus on iOS

- i3/Sway type window managers are more comfy

- Homebrew is hit or miss

- Apple seems to make the best bang-for-buck ARM laptops at the moment

- Asahi is a fine beer :)


Best hardware + most flexible OS.


There's two aspects of the apple advantage for me.

The first is that Apple hardware is reliably nice. It's just done right, with minimal corners cut: full aluminum body (not plastic), great screen (bright, high resolution, great color accuracy and range), great keyboard, great touchpad, amazing CPU (fast and low power), very nice GPU (better than any other integrated GPU and much lower power than any faster discrete GPU), with a great memory system (100, 200, or 400GB/sec).

Sure, the best laptops come close on some metrics, but they generally have lousy iGPUs or lousy battery life. Laptop memory systems typically max out at 75GB/sec, which drives the need for discrete GPUs, which drives the need for larger batteries. Sadly the FAA limits max battery size, so you end up with terrible battery life and performance if you aren't plugged in. Or they are great, except for a poor screen. Or have a nice screen and a lousy keyboard or trackpad.

I also find it interesting that if you try to buy a 3 or 5 year old laptop, by far the most expensive are Apple laptops. Almost as if they are built better.

My second issue is that I find OSX to be MUCH less intuitive. The lack of a real package manager for the OS is painful. Having to add my own, like brew, is ugly. Then the UI inconsistencies drive me nuts. Even things like cut/paste (control-c/v) are annoying. Doubly so when using iterm2, where you don't need the control-c. Or if running xquartz, I'd need a state diagram to map all the copy/paste rules.

Linux seems much more sane, granted I'm more familiar with it. If I'm on a website and want to drag an image in, I just drag the image from any image viewer or file browser and it works. I can drop files into Signal by dragging. Or documents into Thunderbird as attachments. OSX seems much more finicky about dragging objects between applications, and I end up in the Finder, which I find particularly counterintuitive. Oh, sure, I could try to remember that bang, splat, apple-A goes straight to apps. Seems that browsing from ~ should be WAY easier, or even better, just have better drag/drop support.

Linux just makes more sense to me. 99% of my installs are apt install <appname>, not playing the "do I search the web for a DMG, or try to find it in brew, crap that was a different user, or maybe I need a 3rd app store?" game. Oh, that works as a brew cask, but not a brew app, etc. etc. etc.

I also find Apple's handling of multiple monitors quite annoying. I don't have anything fancy, just N desktops of 2 monitors each, with a quick keyboard combo to switch (control-alt-right or control-alt-left). I ask Apple folks about their setup and they seem to always mention some weird paid app that worked with the N-1 version of the Apple OS, but the developer got bored and stopped updating it.

If you just need Signal, Teams, Chrome, Firefox, and a terminal, and like a nice coherent package manager and drag-and-drop (not select, control-c, select, control-v) of strings, images, and documents, I'd recommend Linux. I find OSX frustrating to use.


> We could add explicit sync to them, but that would break backwards compatibility…

At some point it would be good to break it and get rid of implicit sync in the Wayland use case. Keeping it forever is not worth it.


Well, the question then is: do you have to update the Wayland clients?


Probably. Maybe it would be like a "Wayland-new", with a transition similar to the one from X? So they do need to coexist at the same time somehow, but it could also be a separate path.


Sometimes I really get to thinking (more than I usually do) about how much it sucks that so much hardware is completely undocumented and must be extensively reverse-engineered to even function outside of the realm of manufacturer ordained software.


If they make it possible for me to revive old Apple hardware I’m going to be very excited about this project.


Asahi Linux is geared specifically for ARM-based Macs. There won't be "old" hardware for another several years.


Asahi means morning sun. I was curious as there is a beer with the same name.


Does this mean that SteamVR can run on Asahi Linux and macbook now?


SteamVR requires a Vulkan driver whereas this post is just about paving the road to build such a driver. The only working driver at the moment is the OpenGL one.


Fandaniel is really knocking everything on its path!


disappointed in the moderators of this thread


I think time would be better spent getting full Vulkan on MacOS with Proton support. The number of people wanting to run Linux on Macs is waaay lower than with Windows PCs.


Then go ahead and do it, but why should the Asahi Linux people take this into consideration? The entire point of the project is to get Linux running on ARM Macs. That's what they personally want and there's no reason for them to make market share considerations.


With all due respect, no offense, but I find comments like this a little bit disrespectful.

These are a couple of volunteers, working for free, and you're saying that it'd be better for them to volunteer their time for the benefit of a huge trillion dollar corporation and work on something that the aforementioned corporation explicitly does not want (but could very easily do itself).


Hard disagree. Apple Silicon is by far best in class, but the OS it ships with feels like a toy to me. Can't wait for the hardware to be liberated.


Why does MacOS feel like a toy? It's the primary OS for a huge number of developers (at least in the Bay Area) who are doing plenty of production work on it every day.

Fully support bringing a well supported Linux distro to the hardware but MacOS is nothing like ChromeOS or some sort of thin client operating system.


Not GP but I also believe that MacOS feels like a toy.

All the icons are jumping around everywhere, everything is colourful, it's 2023 and you can't maximize a window or make it take up half the screen, etc.

I also hate the touch bar, but I'd probably get used to that. I used a pre-touchbar Air for a few months, and going back to Windows felt amazing.

I am eagerly waiting for Asahi to be usable as a daily driver, my next laptop will probably run Apple silicon.


> You can't maximize a window or make it take up half the screen

Yeah, the first thing I do when I get a new Mac is to install some tools to make the experience less annoying.

AltTab: https://alt-tab-macos.netlify.app/

Rectangle: https://rectangleapp.com/

Dato: https://apps.apple.com/us/app/dato/id1470584107

Caffeine: https://www.caffeine-app.net/

Paragon NTFS Driver (free for Seagate disks): https://www.seagate.com/support/software/paragon/

--

In my own experience, I found GNOME 40+ to be more polished UI-wise than MacOS. I was really surprised when I installed Fedora Silverblue for the first time. And I'm also a bit upset that MacOS seems to have stalled, and when Apple does add changes, it's to make it look more and more like iOS.


I hate the idea of requiring 6 random binaries from 6 random companies just to get the basics working like tabbing windows, resizing windows, and reading drives.

Doubly so when they are closed source and free. How are the developers making the money? Are they community supported somehow, maybe patreon? Are they tracking you? Inlining ads to your desktop? Who is paying for their apple devel tools or apple store ID?

As the saying goes, if the app is free, you are the product that company is selling

Imagine supporting 100 mac users, each with 6 random binaries installed.


Mouse over the green window button, hold down option and you can maximize, and tile to the left/right of the screen.


Not being able to tile more than 2 windows at once is actually pitiful for a modern OS. I really like using macOS with a tiling window manager installed, but out of the box it is unusable.


It is also possible to configure the double-click behaviour on the title bar to zoom windows without fullscreening them...


* poor window management

* no built-in package manager

* low hackability/customizability

* constant code signing prompts with 3rd party software, my OS is fighting me trying to use software

* constant prompts to please finally sign into icloud

* deprecated OpenGL

* general handholding in the OS getting in the way


> It's the primary OS for a huge number of developers (at least in the Bay Area) who are doing plenty of production work on it every day.

How much of that work is being done in Electron?


Can't wait indeed, they will be excellent BSD machines when all is said and done.


Mail Apple and ask for Vulkan support.



