> But then also came the entitled users. This time, it wasn’t about stealing games, it was about features. “When is Thunderbolt coming?” “Asahi is useless to me until I can use monitors over USB-C” “The battery life sucks compared to macOS” (nobody ever complained when compared to x86 laptops…) “I can’t even check my CPU temperature” (yes, I seriously got that one).
This sounds so rough. I can't imagine pouring your heart into this labor of love and then having to keep facing something like this. Back in the early days of Quora, when it used to be good, there was a "Be Nice, Be Respectful" policy (they might still have it); I wonder if something like that would be helpful for open source community engagement.
Regardless, major props to Marcan for doing the great work that he did, our community is lucky to have people like him!
First of all, I wholeheartedly applaud Marcan for carrying the project this far. He and the team, both as individuals and as a group, did great things. What I can say is that a rest is well deserved at this point, because he really poured his soul into this and wore himself down.
On the other hand, I need to say something, though not in bad faith. He needs to stop fighting winds he can't control. Users gonna be users, and people gonna be people. Not everyone will ever be happy. Even when you integrate everything from the application level down to the silicon, not everyone is happy with what Apple has accomplished technically. Even though Linux makes the world go round, we have seen friction now and then (tipping my hat to another thing he just went through), so he needs to improve his soft skills.
Make no mistake, I'm not making this comment from high above. I was extremely bad at it, and I was bullied online and offline for a decade, and it didn't help to be on the right side of the argument, either. So, I understand how it feels and how he's heartbroken and fuming right now, and rightly so. However, humans are not an exact science, and learning to work together with people with strong technical chops is a literal superpower.
I wish Hector a speedy recovery, a good rest and a bright future. I want to finish with the opening page of Joel Spolsky's "Joel on Software":
For the last few years, I've been saying the following regularly (to friends, family and coworkers): communication is the hardest thing humans will ever do. Period.
Going to the moon, launching rockets, building that amazing app... the hardest thing of all is communicating with other people to get it done.
As a founder (for 40+ years and counting) I manage a lot of different types of people, and communication failures are the largest common thread.
Humans have a very, very tough time assuming the point of view of another. That is the root of terrible communication, but assumptions are right up there as a big second.
On the Marcan thing... I just want to say, control what you can and forget the rest (yes, this is straight from stoicism). Users boldly asking for features and not being grateful? Just ignore them. Getting your ego wrapped up in these requests (because that's what happened, even if he doesn't want to admit it) is folly.
I contributed to Marcan for more than a year. I was sad to see the way it ended. I wish him well.
> Humans have a very, very tough time assuming the point of view of another. That is the root of terrible communication, but assumptions are right up there as a big second.
That's very true. I recommend that people read "The Four Agreements", because that thin book has real potential to improve people's lives through active and passive communication.
Also worth being aware of Robert Kegan's adult development model [0] or something similar; that gives people a framework to go from "humans seem" to some actual percentages and capabilities.
Spoiler, but approximately 66% of the adult population make do without being able to maintain their own perspective independently of what their social circle tells them it is. I imagine that would make it extremely challenging to determine what someone else's perspective is. Especially if that perspective is being formed based on empiricism rather than social signalling.
And if we're making book recommendations, Non-Violent Communication is a gem of an idea.
That's pretty fascinating, thanks for sharing it! It's a pretty compelling explanation as to why some people seem to be completely unable to logically explain their reasoning for certain beliefs and just fall back to "well it should be so because everybody says so."
Ta. I learned about it from my favourite HN comment (https://news.ycombinator.com/item?id=40856578) and have spent the last 6 months wondering why people don't bring it up more. It may just be a model, but it has much explanatory power for why there seem to be so many "stupid" people around. I don't really have the words to describe them; people who are technically reasonable but not convinced by arguments or evidence.
Marcus Aurelius wrote extensive personal reflections in his "Meditations". Seneca wrote detailed letters to friends and family discussing philosophy, life, and death. Epictetus discussed death extensively in his Discourses, but sure, they were philosophical teachings rather than personal goodbyes.
They focus on acceptance and equanimity rather than formal farewells.
That said, "control what you can and forget the rest" is indeed stoicism, albeit simplified.
If they've written "many many words over thousands of years" on the merits of their philosophy, they are also perfectly capable of writing multi-paragraph goodbye letters. That's the bearing it has on the parent's claim. And many did.
Why you felt the need to add your comment is a more apt question.
> If they've written "many many words over thousands of years" on the merits of their philosophy, they are also perfectly capable of writing multi-paragraph goodbye letters. That's the bearing it has on the parent's claim. And many did.
Eh, not really - "multi-paragraph goodbye letters" here refers to the overly dramatic fad that internet denizens sometimes engage in when they leave communities, and they tend to have a lot of whining.
Those types of goodbye letters are not the types of goodbye letters stoics would write.
> Why you felt the need to add your comment is a more apt question.
If you were able to pick up so swiftly what the person I replied to was implying, you too should be able to have picked up that I replied because I disagreed with that implication.
I doubt this, but would be curious to see a source.
> You could then just say that you disagree and state your case, without rudely asking why they posted it.
I didn't find it rude at all, and your reply was far less productive than my IMO neutral question. You took offense on behalf of someone else and inserted yourself when it was unnecessary and entirely reliant on your interpretation and perception. Now we're discussing your perceived slight instead of anything of substance.
> He needs to stop fighting winds he can't control. Users gonna be users, and people gonna be people. Not everyone will ever be happy.
Right - but it kinda sounds like he's facing headwinds from a lot of different directions.
Headwinds from Apple, who are indifferent to the project, stingy with documentation, and not inclined to reduce their own rate of change.
Headwinds from users, because of the stripped down experience.
Headwinds from the kernel team, who are in the unenviable situation of having to accept and maintain code they can't test for hardware they don't own; and who apparently have some sort of schism over rust support?
Be a heck of a lot easier if at least one of them was on your side.
> Headwinds from Apple, who are indifferent to the project, stingy with documentation, and not inclined to reduce their own rate of change.
That is part of the challenge he chose to take on.
> Headwinds from users, because of the stripped down experience.
Users can be ignored. How much you let users get to you is your own choice.
> Headwinds from the kernel team, who are in the unenviable situation of having to accept and maintain code they can't test for hardware they don't own
You don't have to upstream. Again, it's not the kernel team that chose to add support for "hostile" hardware, so don't try to make this their problem.
> and who apparently have some sort of schism over rust support?
Resistance when trying to push an entirely different language into an established project is entirely expected. The maintainers in question did not ask for people to add Rust to the kernel. They have no obligation to be welcoming to it.
> Be a heck of a lot easier if at least one of them was on your side.
Except for the users, all the conflicts are a direct result of the chosen work. And the users are something you have to choose to listen to as well.
"Their boss" - I'm not sure that boss is best word here.
"did ask for it" - did he? Because from my perspective it looks more like he gave the bone for corporations so they will shut up for rust in kernel. After some time it will end up "Sorry but rust did not have enough support - maintainers left and there were issues with language - well back to C"
I don’t think that’s an accurate way to describe what happened, no. He seems to be enthusiastic about it and to genuinely want it to succeed.
> "A lot of people actually think we're somewhat too risk averse," said Torvalds. "So when it comes to Rust, it's been discussed for multiple years by now. It's getting to the point where real soon now, we will actually have it merged in the kernel. Maybe next release."…
> "Before the Rust people get all excited," the Linux kernel creator and chief said. "Right? You know who you are. To me, it's a trial run, right? We want to have [Rust's] memory safety. So there are real technical reasons why Rust is a good idea in the kernel…”
> “And hopefully, it works out, and people have been working on it a lot, so I really hope it works out…”
Last September he was still insisting he thinks the project will not fail, and he was not exactly subtle in his criticism of maintainers who refuse to engage with it in good faith.
> "Clearly, there are people who just don't like the notion of Rust, and having Rust encroach on their area.
> "People have even been talking about the Rust integration being a failure … We've been doing this for a couple of years now so it's way too early to even say that, but I also think that even if it were to become a failure – and I don't think it will – that's how you learn," he said.
> "So I see the whole Rust thing as positive, even if the arguments are not necessarily always [so]."…
> With impressive diplomacy, considering his outbursts of years past, Torvalds went on, "There's a lot of people who are used to the C model, and they don't necessarily like the differences... and that's ok.
But yeah, I still don't think it's all that inaccurate: he may not have wanted it to fail, and may still not think it's a technical failure... But socially? It still seems possible he's starting to think that while the Rust language per se is a technical success, all the drama surrounding its integration into Linux means it's turning out to be a social failure.
(Or maybe I'm just projecting because that is what it looks like to me.)
Another uphill battle that I haven't seen anyone mention is just how good mobile AMD chips got a year or so after the M1 release. I wouldn't buy a Mac to run Linux on it when I can buy a Lenovo with equally soldered parts that'll work well with the OS I wanna run already.
A lot of it is simply AMD getting onto newer TSMC nodes. Most of Apple's efficiency head start is a better process (they got exclusive access to 5nm at first).
That's my understanding as well, as soon as the node exclusivity dropped they were ballpark equal.
Many ARM SoCs are designed to run on battery only, so the wireless packages and low-power states are better; my AMD chip couldn't go below 400 MHz.
But yeah the "Apple M hardware is miles and leagues away" hypetrain was just a hypetrain. Impressive and genuinely great but not revolutionary, at best incremental.
I hope to be able to run ARM on an unlocked laptop soon. I run a Chromebook as extra laptop with a MediaTek 520 chip and it's got 2 days battery life, AMD isn't quite there yet.
> But yeah the "Apple M hardware is miles and leagues away" hypetrain was just a hypetrain. Impressive and genuinely great but not revolutionary, at best incremental.
It's more nuanced than that. Apple effectively pulled a "Sony A7-III" move: they released something one generation ahead of everybody else, and disrupted everyone.
Sony called the A7-III entry-level mirrorless, but it had many more features even when compared to the higher-end SLRs of the era, and it effectively pulled every other camera on the market one level down.
I don't think even they thought they'd keep that gap forever. I personally didn't think it either, but when it was released, it was leaps and bounds ahead, and forced other manufacturers to do the same to stay relevant.
They pulled everyone upwards, and now they continue their move. If nothing else, they showed that computers can be miniaturized much further. The Intel N100 and Raspberry Pi/Orange Pi 5 provide so much performance for daily tasks that things once unimaginable at that size are considered normal now.
I like the Sony story, but I don't think Apple did "pull everyone along" like that; they had an exclusivity deal with TSMC to be first on a good manufacturing node improvement. They took their high-quality, high-performance iPhone SoC and gave it more juice and a bit better thermals.
It's just another "Apple integrating well" story.
Their SoC is huge compared to competitors' because Apple doesn't have to make a profit selling a SoC; they profit selling a device + services, so they can splurge on the SoC. Splurging on the SoC plus being one node ahead is just "being good". The team implementing Rosetta are the real wizards doing "revolutionary cool shit", if anything.
> they had an exclusivity deal with TSMC to be first on a good manufacturing node improvement.
...plus, they have a whole CPU/GPU design company as a department inside Apple.
Not dissimilar to Sony:
Sony Imaging (the camera division) designed a new sensor with the new capabilities of Sony Semiconductor (the fab), and used their exclusivity to launch a new camera built on top of that new sensor. Plus, we shall not forget that Sony is an audiovisual integration powerhouse. They're one of the very few companies that can design their own DSPs, the accompanying algorithms, and the software on top of them, and integrate it all into a single product they manufacture themselves. They're on par with Apple's integration chops, if not better (Sony can also horizontally integrate from Venice II to Bravia, or mics to Hi-Fi systems, incl. everything in between).
The gap also didn't survive in Sony's case (and that's good). Nikon and Fuji use Sony's sensor fabs to tap their capabilities and co-design sensors with the fab side.
Canon had to launch the R series and upscale their sensor-manufacturing chops, just because Sony "integrated well", when viewed from your perspective.
Sony is also not selling you the sensor. It's selling you the integrated package, from sensor to color accuracy to connectivity to reliability and service. The A7-III has integrated WiFi and an FTP client to transfer photos. The A9 adds an Ethernet jack for faster transfers. Again, integration within and between ecosystems.
>But yeah the "Apple M hardware is miles and leagues away" hypetrain was just a hypetrain. Impressive and genuinely great but not revolutionary, at best incremental.
Compared to the incremental changes we'd seen in the AMD/Intel space over the 10 years before it arrived, it was revolutionary.
Was Intel switching from the Pentium 4 to the Core architecture considered revolutionary at the time? Was AMD's bulldozer architecture? I don't recall.
We must have different definitions of the word "revolutionary". They put a high-end mobile chip in a laptop and it came out good; what's revolutionary about that? The UMA architecture has advantages, but it's hardly revolutionary.
The jump in performance, efficiency, and battery life was not incremental or "evolutionary". Such jumps we call revolutionary.
What they did doesn't matter. Even if they merely took an Intel laptop chip and stuck chewing gum on it, the result was revolutionary.
So much so that it lit a fire under Intel's ass and mobilized the whole industry to compete. For years after it came out, the goal was to copy it and beat it.
What did you expect to call "revolutionary"? Some novel architecture that uses ternary logic? Quantum chips?
And some of these Lenovos are relatively upgradable too. I'm using a ThinkPad I bought refurbished (with a 2 year warranty) and upgraded myself to 40 GB of RAM and 1TB of SSD (there's another slot too if I need it). It cost me $350 including the part upgrades.
Prices seem to have risen a bit since I bought mine. Here's a similar model with a Ryzen 5 7530U for $355: https://www.ebay.com/itm/156626070024 It is certified refurbished and has a two year warranty. It has a SODIMM slot and supports dual SSDs, although not the full size M.2.
It's not just that "people are hard" - it was clear that this would end up this way the moment marcan started ranting on social media about having to send kernel patches via e-mail. Collaborating on software development is a social activity and stuff like convincing maintainers to trust you and your approach is just as important a part of it (if not more important) as writing code. Not realizing that is a sure road to burnout (and yes, I'm just as guilty of that myself).
> Not realizing that is a sure road to burnout (and yes, I'm just as guilty of that myself).
Humans are shaped by experience. This is both a boon and a curse. I have also been on the hot end of the stick and burned myself out, sometimes rightly, sometimes wrongly. Understanding that I didn't want to go through that anymore was the point where I started to change.
> Collaborating on software development is a social activity and stuff like convincing maintainers to trust you and your approach is just as important a part of it (if not more important) as writing code.
Writing the code is at most 5% of software development IME. This is what I always say to people I work with. I absolutely love writing code, but there are so many other, more important activities around it that I can't just ignore them and only churn out code.
> Writing the code is at most 5% of software development IME.
This really depends on what you work on. And how good the managers are on your team. I talked to a manager at Google once about how he saw his job. He said he saw his entire job as getting all of that stuff out of the way of his team. His job was to handle the BS so his team could spend their time getting work done.
This has been my experience in small projects and in very well run projects. And in immature projects - where bugs are cheap and there’s no code review. In places like that, I’m programming more like 60% of the time. I love that.
But Linux will never be like that ever again. Each line of committed code matters too much, to too many people. It has to be hard to commit bad code to Linux. And that means you've gotta do a lot of talking to justify your code.
I did some work at the IETF a few years ago. It’s just the same there - specs that seem right on day 1 take years to become standards. Look at http2. But then, when that work is done, we have a standard.
As the old saying goes, if you want to go fast, go alone. If you want to go far, go together. Personally I like going fast. But I respect the hell out of people who work on projects like Linux and chrome. They let us go far.
Even in the Google example, it's still in the low percentages when you view it as a system. All the manager did was efficiently allocate resources. It didn't reduce the non-programming work - it simply moved it elsewhere.
Someone who is in a management position, has good political skills and good connections will be way more efficient at doing some of this non-programming work.
This is something that even C-levels forget. Something that takes a CTO 2 minutes to do can take several months for a regular developer to achieve, and I have plenty of experience with, and plenty of examples of, that.
Yeah. I think the whole drama around rust on Linux is a great example of this. If Linus came forward and clearly supported (or clearly rejected) rust on Linux, it would have saved a lot of people months of stress and heartache.
It really depends on what kind of code and for what usage.
People might also enjoy their hobby dev experience more if they were really coding for themselves, without any expectation except pushing the code to a repo. As a hobby dev, you don't have to make packages, you don't have to have an issue tracker, you don't have to accept external contributions, and you don't have to support your users if you aren't willing to carry that on your shoulders. You don't even need a public git repo; you could just put a release tarball on your personal website when a release is ready.
This works perfectly fine as long as you're happy with being approximately the only user of your code. With some of my projects I do just that, but it gets very different once you add users to the mix.
5%? Sure, there is a lot of activity around software. But out of a 40-hour week I most certainly code more than at most 2 hours. If this is your workplace, I think it's dysfunctional.
You are implying that if you can communicate but have nothing backing it up, that's worth 95%? If anything, code can still be taken as-is and understood by someone else. So to me it's always most important to be able to produce something before being able to communicate.
In a sense, yes. I'm contributing to a small but crucial part of a big project, as the coordinator of a four-person team. The dynamics of the project form the team as a "band of equals"; in other words, everybody has approximately the same capabilities, roles are divided organically, and I ended up as the "project manager" for that group somehow.
Even before we started coding, there was an RFC written by us. We talked about it, discussed it, and ironed it out with the chief architects of the project. When everything made sense, we started implementing it. Total coding hours are irrelevant, but they're small compared to all the planning, and it's almost finished now.
The code needs to tap into and fit a specific place in the pipeline. Finding and communicating that place was crucial; the code itself is not. You can write the most sophisticated code in the most elegant way, but if you don't design and implement it to fit the correct place, that code is toast, and the effort is wasted.
So yes, code might be the most enjoyable (and sometimes voluminous) part, but it's 5% of the job, by weight, at most.
Once your proof of concept gains traction more time is spent in meetings with other teams responsible for the systems you'll be interacting with - making sure you do it "right" rather than scrappy. Once your initial release starts getting popular you spend more time looking at metrics and talking to operations folks to make scaling easier and not waste resources. Once your product starts having customers who depend on it, you spend a lot of time working with product to figure out features that make sense, advise on time estimates, mentor new team members, and advise teams who use your product/services as a dependency.
These are all engineering tasks, and the longer you spend on a team/in a company, the more likely it is you provide more value by doing this than by slinging code. You become a repository of institutional knowledge to dispense.
Think about it the other way around: how much code is written and never used? How much code is written that would have been better never used? How much code is used only for someone to then notice that it doesn't solve the business problem it was intended to solve? How much code is run without anyone ever noticing that it doesn't solve any business problem?
All the while, you are correct: being able to produce something that solves a problem is much more valuable than being able to talk about it. But unlocking that value (beyond solving your own problem) absolutely requires communication.
It's more like writing the code is just the first step on a long road. You won't go anywhere at all if you don't take it, but if that's the only thing you do, all you've done is take the first step.
I have written plenty of code that's stuck on this first step in my life, including some that went to the very same LKML we're talking about here right now. Some of those things have already been independently written again by other people who actually managed to go further than that.
Perhaps "useless" was the wrong word the GP used. "valued" may be better.
It's fairly common for very useful/valuable code to be discarded because the engineer (or his management) failed to articulate that value to senior leaders as well as someone else who had inferior code.
> it was clear that this would end up this way the moment marcan started ranting on social media about having to send kernel patches via e-mail. Collaborating on software development is a social activity and stuff like convincing maintainers to trust you and your approach is just as important a part of it (if not more important) as writing code.
Yeah, but FFS, using email for patches when there are so many better ways of doing development with git? The Linux Foundation could self-host a fucking GitLab instance, and even in the event of GitLab going down the route of enshittification or closed source, they could reasonably take over the maintenance of a fork.
I get that the Linux folks want to stay on email to gatekeep themselves from, let's be clear, the utter morons who spam any GitHub PR/issue they can find. But at the same time, it makes finding new people to replace those who will literally die out in the next decade or two so much harder.
It's an interesting phenomenon where people keep coming out of the woodwork to criticize the most successful software development project in history for doing it wrong.
They're not microkernel! They're not TDD! They're not C++! They're not CVS! Not SVN! Not SCRUM! Not GitLab!
Yet the project marches on, with a nebulous vision of doing a really useful kernel for everyone. Had they latched onto any of the armchair-expert criticism of how they're doing it wrong all these years, we wouldn't be here.
> Yet the project marches on, with a nebulous vision of doing a really useful kernel for everyone.
The question is - how long will it march on? The lack of new developers for Linux has been a consistent topic for years now. Linus himself isn't getting younger, and the same goes for Greg KH, Ted Ts'o and other influential leads.
When the status quo scares off too many potential newcomers, eventually the project will either wither, or people too inexperienced will drive it into a wall.
The people in charge decided on their preferred ways of communication. You may believe that there are better ways out there, and I may even agree with you, but ultimately it's completely irrelevant. People responsible decided that this is what works for them and, to be honest, they don't even owe you an explanation. You're being asked to collaborate in this specific way and if you're unable to do it, it's on you. If you want to change it, work your way to become a person who decides on this stuff in the project, or convince the people already responsible. Notice how neither of those are technical tasks and that they don't depend on technical superiority of your proposed methods either.
> Yeah, but FFS, using email for patches when there are so many better ways of doing development with git?
You are missing one point, namely that email is probably the only communication medium that's truly decentralized. I mean, on most email providers you can export your mailboxes and go to someone else. You can have a variety of email clients and ways to back up your mailboxes. No git clone, no specific mailbox or server is in any way special, I think Linus emphasized recently that they made efforts to ensure kernel.org itself is not special in any way.
Yes, I find Github's or Gitlab's UI, even with all enshittification by Microsoft and whatnot, better for doing code reviews than sight-reading patches in emails. And yet I cannot unsee a potential danger that choosing a service — any service! — to host kernel development would make it The Service, and make any migration way harder to do than what you have with email. Knowing life, I'd say pretty confidently that an outcome would be that there would be both mailing lists and The Service, both mandatory, with both sides grumbling about undue burdens.
Have you ever been in a project which had to migrate from, say, Atlassian's stack to Github, or from Github to Gitlab, or vice versa? Heck, from SourceForge + CVS/SVN to Github or similar? Those were usually grand endeavors for projects of medium size and up. Migrate all users, all issues, all PRs, all labels, test it all, and you still have to write code while it all is happening. Lots of back-and-forth about preserving some information which resists migration and deciding whether to just let it burn or spend time massaging it into a way the new system will accept it. Burnout pretty much guaranteed, even if everyone is cooperating and there is necessity.
But you could probably build tools on top of email to make your work more pleasant. The whippersnappers who like newer ways might like to run them.
I personally don't think GitHub's PR model is superior to e-mail-based patch management, for two reasons. First, e-mail needs no additional middleware at the git level to process (I can get my mail and start working directly on my machine). Second, e-mail is one of Git's native patch management mechanisms.
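For anyone who hasn't used that flow, it looks roughly like this (a minimal sketch; the list address, directory, and file names are made-up placeholders):

    # Contributor: turn the last two commits into mail-ready patches
    git format-patch -2 --cover-letter -o outgoing/
    git send-email --to=patches@example.org outgoing/*.patch

    # Maintainer: apply a patch series straight from a saved mbox
    git am series.mbox

No forge sits in the middle: the mail itself is both the transport and the archive.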
This is not about spam, server management, or GitLab/Gitea/whatever issues. This is about catering to the most diverse work methods, and removing bottlenecks and failure points from the pipeline. If GitLab is down, everybody is blocked. Your mail provider is failing? It'll be up in 5 minutes tops, or your disk is probably full; go handle it yourself.
So Occam's razor outlaws all the complex explanations for mail-based patch management. The answer is concise in my head:
> A mailing list is a great archive; it's infinitely simpler and way more robust than a single server, and it keeps things neatly decentralized, as designed.
This is a wind we can't control. I, for one, am not looking at kernel devs and saying "What a bunch of laggard luddites. They still use e-mail for patch management". On the contrary, I applaud them for keeping this running for this many years, this smoothly. Also, is it something different from what I'm used to? Great! I'll learn something new. It's always good to learn something new.
Because, at the end of the day, all complex systems evolve from much simpler ones, over time. The opposite is impossible.
> Your mail provider is failing? It'll be up in 5 minutes tops, or your disk is probably full; go handle it yourself.
Well, until you deal with email deliverability issues, which are staggeringly widespread and random. Email was great for sending quick patches between friends, like you'd exchange a USB key for a group project. For a project the size of Linux? It doesn't scale at all. There is a reason why Google, Meta, Red Hat, and [insert any tech company here] don't collaborate by sending patches via email.
The problem with mail-based patch management is that it doesn't scale well, management-wise. When you have hundreds of patches and multiple reviewers who could review them, GitHub/GitLab environments make it easier to prioritize the patches, assign who will do the review, filter the patches based on tags, and keep track of what hasn't been reviewed yet.
Mail-based patch management is fine for smaller projects, but the Linux kernel is too big by now. It sure is amazing how they seem to make it work despite their scale, but it's also kinda obvious by now that some patches can go unnoticed, unprioritized, unassigned...
And open source is all about getting as many developers as possible to contribute to the development. If I contribute something and wait months to get it reviewed, it will deter me from contributing anything more, and I don't care what the reason behind it is. The same goes for when I contribute something and get an argument between two or more reviewers about whether it's the right direction, with no authoritative answer from a supervisor of the project, and this situation goes on for months...
Email is just the protocol. What you're really saying is that http-based protocols make more powerful tools possible.
It's not really enough to state your case. You have to do the work.
On the surface, the kernel developers are productive enough. Feel free to do shadow work for a maintainer and keep your patch stack in GitLab. If it can be shown to be more effective, lots of maintainers are going to be interested. It's not like they all work the same way!
They just have a least common denominator which is store-and-forward patch transport in standard git email format.
Everyone still has at least the base branch they're working on and their working branch on their machine; that's the beauty of working with Git. Even if someone decides to pull a ragequit and perma-wipe the server, when all the developers push their branches, the work is restored. And issues can be backed up.
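Roughly how that recovery works in practice (a sketch; the server address is hypothetical):

    # Any developer's clone can reseed a freshly provisioned server
    git remote add rebuilt git@new-server.example.org:kernel.git
    git push rebuilt --all     # every local branch
    git push rebuilt --tags    # plus the tags

Because every clone carries the full history, no single server is sacred.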
> Also, is it something different from what I'm used to? Great! I'll learn something new.
The thing is, it's harder and more cumbersome at a time when better solutions exist. Routinely, kernel developers complain about being overworked and about onboarding of new developers being lacking... one part of the cause certainly is that the Linux kernel is a massive piece of technology, and another that the social conventions of the Linux kernel are very difficult, but the tooling is also very important - Ballmer had a point with "developers developers developers".
People work with highly modern tools in their day jobs, and then they see the state of Linux kernel tooling, and they say "WTF I'm not putting up with that if I'm not getting paid for it".
Or to use a better comparison... everyone is driving on the highway at the same speed, but one car decides to slow down, so everyone else overtakes it. The perpetual difficulties many open source projects have in accommodating changing times and trends - partially because a lot of small FOSS is written by people for their individual usage! - are IMHO one of the reasons why there is so much chaos in the FOSS world and many private users rather go for the commercial option.
You are missing the entire point. When you interact with a group of people who already have a culture and a set of practices/traditions, you have to play by their rules, build up credibility with that community... and then maybe, down the road, you can nudge them a little to make changes. But you have to have credibility first, have established that you understand what they do and understand why their preferences are the way they are.
If you approach it from the viewpoint that you have the solution and they are Luddites, you will influence no one and have no effect.
Marcan's career as a developer includes lots of development on hostile systems where he's jailbreaking various consoles to allow homebrew.
Asahi Linux is similar, given how hostile and undocumented Apple Silicon is, but it comes with a great amount of expectation of feature completeness, and additional bureaucracy for code changes that really destroys the free-wheeling hacker spirit.
I understand. While I'm not as prolific as him, I grew up with systems that retrocomputing fans now meticulously restore and use, so I had to do tons of free-wheeling peeking and poking.
What I found is that having this "afterburner mode" alongside "advanced communication" capabilities gives you the real edge in real life. So, this is why I wish he would build up his soft skills.
These skills occupy different slots. You don't have to sacrifice one for the other.
Probably a few reasons. For Darwin, there are a few small projects, but I think they are all functionally dead. The benefit with Linux, or even the BSDs, is that, sure, you've gotta port to the hardware, but you get a good set of userland stuff running for "free" after that. Lots of programs just need to be compiled to target arm64 and they will, at the very minimum, function a little bit. Then you can have package maintainers help improve that. I don't think any of the open source Darwin-based projects got far enough to build anything in userland. So you'd probably just get the Darwin code from Apple, figure out how to build it, and then build everything else on top of it.
The BSDs: you can fork a BSD. Maybe he could try to mainline into a BSD, but he would probably face a similar battle there. Right, once again, the benefit of mainlining into Linux - where there is some (maybe limited) support for including Rust - is that you can narrow your scope. You don't need to worry as much about some things because they will just sorta work; I am thinking of the upper layers of the kernel. You have a CPU scheduler and some subsystems that may not be optimized for the hardware, but at least it is something, and you can focus on other things before coming around to the CPU scheduler. You can fork a BSD, but most would probably consider it a hard fork. I also don't think any of the BSDs have developers who are that interested in bringing in Rust. Some people have mentioned it, but as far as I know, nothing is in the works to mainline any kind of Rust support in the BSD kernels. So he would probably meet similar resistance if he tried to work with FreeBSD. OpenBSD isn't really open to Rust at all.
Why insist on developing in Rust? I mean, I see how it's much cooler and actually better than something like C, but people are hugely underestimating how difficult it is to change the established language of a three-decade-old project.
If Rust is the reason you get out of bed in the morning, why not focus on Redox and make it the new Linux? Redox today is much more than Linux was in 1991, so it's not like you would be starting from scratch.
You're probably not as good as Linus at, well, anything related to this field, really. The only way to find out whether you actually are is to do the work. Note that he, too, spent a lot of time whining at people who were perceived as the powerful in the field. But in addition to whining, he went and did the work and proved those people wrong.
Mind you, I'm a PHP developer by day, so this Rust-vs-C debate and memory management stuff is not something I've had experience with personally, but the "Rust is magical" section towards the bottom seems like a good summary of why the developer chose to use Rust.
Oh no, I totally agree. I am just saying from the perspective of the Asahi Linux project and wanting to use as much Rust as they can, that is what they are facing and the associated trade offs.
I personally fall a little more on the side of the Linux kernel C devs. Interop between languages does bring in a lot of complications. And the burden is on the Rust devs to prove it out over the long haul. And yes, that is an uphill battle, and it isn't the first time; tons of organizations go through these pains. Being someone who works in a .NET shop, transitioning slowly from .NET Framework to .NET Core is an uphill battle. And that's technically not even a language change!
But I do agree, Redox would probably be less friction and a better route if you want to get into OS dev on an already existing project and be able to go "balls to the wall" with Rust. But you also run into the fact that Redox just has a lot less of everything, simply because it's a small project.
> Asahi Linux is similar, given how hostile and undocumented Apple Silicon is, […]
«Undocumented» – yes, but «hostile» is an emotionally charged term that elicits a strong negative reaction; more significantly, though, it constitutes a flagrant misrepresentation of the veritable truth as stipulated within the resignation letter itself:
When Apple released the M1, I realized that making it run Linux was my dream project. The technical challenges were the same as my console homebrew projects of the past (in fact, much bigger), but this time, the platform was already open - there was no need for a jailbreak, and no drama and entitled users who want to pirate software to worry about.
Which is consistent with marcan's multiple previous blog posts and comments on here. Porting Linux (as well as NetBSD, OpenBSD) onto Apple Silicon has been no different from porting Linux/*BSD onto SPARC, MIPS, HP-PA and other platforms.
Also, if you had a chance to reverse-engineer a closed-source system, you would know that «hostile» has a very specific meaning in such a context, as it refers to a system that has been designed to resist reverse-engineering attempts. No such resistance has been observed on the Apple Silicon computing contraptions.
> No such resistance has been observed on the Apple Silicon computing contraptions.
I think they even left a "direct boot from image" (or something similar) mode as a small door to allow Asahi Linux development, if not to accelerate it a little without affecting their own roadmap. Even Hector tweeted about it himself!
I also think calling it hostile is a little far. I recall Hector making comments like, "yeah, even though it's not greatly documented, it does quite a few things the way I would expect", and I believe he even applauded Apple on a few things. I want to say it was specifically around booting.
Yeah, I want to give them accolades for the great work they did.
I just wanted to also add that users will be users. Once it's out, there will be endless posts about "why X" and "why not Y". No matter what you do, lots of people are going to be displeased. It's just the way things go. I hope he will want to pick it up again after some time.
This is every successful product, small, medium, large. I've never ever worked on a big corporate or small personal project and not experienced this.
The secret is to have a healthy system for taking in those requests, queueing them by priority, and saying, "you are number 117 in the queue; you can make it go faster by contributing or by explaining why it's higher priority".
You can't let feature requests get to you; the moment you do, your users become your opponents. None of those requests are entitled, but the author has clearly already reached a point where they are antagonistic towards requests.
I always tell this story about working with sales at a job where I worked in tech support. Sales would call me up and ask why I hadn't talked to their client about their very important ticket.
I would tell them:
"I have 5 P1 tickets, 8 P2 tickets, and dozens of P3 tickets. Your ticket is a P3 ticket."
They would ask that I change it to a P1. I would. Then they would call me an hour later asking me about the ticket and I would tell them:
"I now have 6 P1 tickets, 8 P2 tickets, and dozens of P3 tickets. Your ticket is one of the P1 tickets."
That's when they understand that they have to start fighting their peers and talking with the big boss to get their P1 ticket moved in front of the other P1 tickets.
Yup, but it gets them out of my hair, and they understand the support guy isn't in a position to wave a magic wand for them. If the sales guy wins his fight with the folks in charge and I get time/resources to work on his thing, fine with me.
At Symbian, defects were classified from P1 to P4, with the inevitable shit-fights about adding magic runes to the title so everyone knows that your P1 is more important than theirs.
The day came when, after prolonged hand wringing and with stern observations about great power and great responsibility, the priority could be set to P0. But like any bunch of junkies we came off this new high all too quickly and the P-1 classification arrived, the showstopper of showstoppers.
In hindsight what I most regret is that we stuck with an integer field; we were denied the expressive power of fractionally critical issues.
So? If they succeed (big if), then that ticket is your new priority. Maybe even for good reason, maybe not. But usually you don’t care that much which one you work on first, do you?
Good for them for at least understanding at that point. The typical response is to say "I get that, I really do—can you move this one to the front of the line for me?" and then maybe a vague threat like "I can talk to your manager if it would help".
In my experience, when it's other people deciding the priority of your tasks (usually your boss), the distribution is 150 P1 tickets, 3 P2 tickets and 1 P3.
This is when the underrated skill of saying NO pays off massive dividends. One long-term client once told me the thing he appreciated the most, compared to most other consultants, was that I wasn't afraid of pushing back on his requests and saying no (within reason). Probably the most valuable feedback I have ever received.
When I worked support, they didn't even have a priority system (it was C2B, so there weren't necessarily enterprise customers; that did come later, with LiveChat and all its joys). Instead, we had a 24-hour expected turnaround, and harder tickets would naturally filter to the top. Tickets that had reached near that point had a higher weight, which went towards your metrics/"leaderboard status". To dissuade gaming of the system, ongoing replies were assigned to an agent (you couldn't give a half-assed reply and then hope for someone else to clean it up) and were exempted from the bonuses (they counted as one standard ["fresh"] ticket each).
Obviously, there was some oversight from managers, but overall it worked pretty well.
Yes, this is pretty normal; with paid products I even find it's less aggressive than with free ones. But I have a hard, frozen shell around my vital organs, so I just politely and in a friendly way point to the place in the queue and to where one can donate to speed things up. For $10k I will build your CPU temp proc; if that's not an option, then it's at pos #17463 in my task list.
Yes. I was developing some open source stuff before venturing into for-profit, closed-source software, and I was surprised that the paying customers were on average much nicer than those who got their stuff for free!
When you pay for something, you’ve already demonstrated that you value whatever it is (a product, a service, etc). Free stuff tends to attract people who don’t value the thing.
Or the more Darwinistic view: anything you pay for access to, you can be gated off from.
It's quite difficult to ban someone from a public park, especially when they can just put on a new hat.
It's really easy to ban someone from a private park. Even if they do put on a new hat, when they get belligerent again you just revoke the renewal of their access pass.
There's also a level of professionalism depending on the product. When I'm responsible for an MSP team I'm very polite to them and always try to get them good, detailed, high-quality information when I'm telling them about problems with their work product, because I want them to do good work quickly and that's the best way to do that.
Yeah, I'm not sure it's open source vs. other software. It's the public vs. professional insiders.
My company's bug tracker is mostly internally-filed bugs, but it accepts bugs from the public. The difference in tone and attitude is night and day. The public-filed bugs can be wild, varying across rude, entitled, arrogant, demanding, incoherent, insulting, irrelevant, impatient... They are also the worst when it comes to actually including enough information to investigate. Frequently filed without logs, without reproduction steps, sometimes without even saying what the filer thinks is wrong. We get bugs with titles like "It doesn't work" and with a text description that reads like a fever dream from someone very unwell.
We do have strong personalities among employees, but bug reports tend to be professionally and competently written, contain enough information to debug, and always, always leave out insults and personal attacks. The general public (at least many of the ones technical enough to file bug reports) does not seem to have the emotional regulation required to communicate professionally and respectfully.
> Frequently filed without logs, without reproduction steps, sometimes without even saying what the filer thinks is wrong.
In projects where this is a problem, I've made an issue template that clearly requests all the stuff I think I'll need. There's a big note at the top of the template that says it's not optional and that if it isn't filled out fully, I'll close the issue without comment.
And then I do that, every time. Sometimes they fill it out and reopen, sometimes they don't. Either way, I don't end up wasting time trying to help people who don't respect my time.
I tell my friends all the time: You want your product to be accessible? Sell cheap, but not too cheap.
Fair deals attract people with some money, but the almost-free only attracts people who are forever broke, who live their lives feeling entitled to everything being handed to them.
> were on average much nicer than those who got their stuff for free!
this is always true with, at least a great many, people. it's related to choosey-beggar syndrome. it's a bug/glitch/feature in human psychology.
if you ever have the chance to be a property manager, never ever let someone move in a week early or pay a week later for free. never let your rent get drastically below market. when people aren't paying for something, it's incredibly common behavior to stop respecting it. it's like a switch flips and suddenly they are doing you the favor.
that's why in times past, offering or taking "charity" was considered impolite. but making a small excuse might be ok. say someone needs to stay an extra week after their lease was over, but was strapped for cash. instead of saying "sure you can stay one more week", say "well, you'd really be doing me a small favor staying in the place to watch for the extra week since it's empty anyway. how about i discount the rent by 50% for that week and amend the lease to take care of it."
I agree that this is needed. It doesn't stop the person requesting the feature from asking for a meeting to explain why, whining the whole time that they need it, and saying they shouldn't have to pay anything to get it addressed right now.
Having been the person taking these meetings for a software vendor, I can say it can get really toxic quickly. I never had more than one meeting a quarter with really toxic people, and they were at least paying for the product and maintenance, so hearing them out was part of the job. It's unfortunate to get to the point where you view customer requests as antagonistic, but I can see how it happens. Some people really feel entitled, and some have a job to do and limited resources or control to do it with.
Yep. I've been working on Ardour for 25 years now, and it took me 7-10 years to develop the right kind of skin for dealing with "user feedback". For me, the right kind of skin was basically to shed such stuff like water off a duck's back. Whether someone is saying "I've been using Logic for 10 years and this is so much easier and intuitive" or "You should be ashamed for asking anyone to pay for this steaming pile of shit" (both real quotes), I had to be able to shrug and carry on with whatever my development priorities were anyway.
That said, I sympathize very much with Marcan on this project: getting the basic infrastructure for Linux operational on new hardware inflames passions much more than a niche project like a DAW.
Thank you for Ardour, btw - a great piece of software. Although I still use Ableton from time to time, Ardour is taking over more and more parts of my workflow :)
I've read your comments here (and elsewhere) for a long time, and I'm sure you'd have some great ideas or at least opinions about this, which is pretty relevant to what you just wrote: https://news.ycombinator.com/item?id=43037537
I think my experience actually making a living from a FLOSS project changes things enough that it is not that relevant to people doing it "for the love of it" or as a side project.
It's much easier to shrug off strong comments when the people who do support you are making it possible for you (and one other) to lead a pretty comfortable middle class life.
God, how I hate these arguments. You see this especially with Gimp. "But my beloved product from a multibillion-dollar company can do X sooo much better and easier. Also it has 16-bit bla bla bla." You don't say?!?!?!?
Any tips on how to get a thicker skin, or does it grow on you over the years?
Also, thanks for Ardour. I am a hobby cellist and sometimes record myself using Ardour, and use it to cut down samples for an app I am working on. I tried doing that with my iPhone, which worked like crap. Yup!
Open source is about liberating computing, not about liberating users.
If you're supporting end users you need to be collecting money from them.
The mechanics of this system are entirely upside down. The corporations have bought into open source to regain control of computing and passionate developers are mired in the swamp of dumb user requests.
I think entitlement like that is stupid and bad for open source (and everything else). However, in the next paragraph the author criticises the opposite position: that Asahi Linux was not ready for everyday use. The entitled requests came from users who thought of Asahi Linux as exactly covering the everyday use case - a Linux distro they should be able to use to carry on with their tasks. This I find contradictory. While some entitled users always exist, you can either admit that Asahi is not a daily driver for people who want to use most of the basic features of a laptop, or admit that the requests make sense. You cannot both claim that Asahi is fine to use and complain that users ask to be able to connect an external monitor to an M1 MacBook Air. I am not sure what is wrong with the claim that Asahi Linux is an experimental (and no less amazing) project that lacks certain functionality (widely considered basic when it comes to this), or that its functionality is restricted to use cases that may include running it as a headless server but exclude some common others. I am not sure how much this would matter, but setting user expectations to a level that matched the state of development may have helped to limit such requests.
I say that also because I have gotten quite a few responses from people telling me I should use Asahi, when, looking at what it supports, it definitely would not make sense for me; you cannot just present it as a macOS alternative right now.
Thing is, it will never get to be a daily driver if people don't use it and shake out the bugs.
25 years ago (huh, long time), when Windows ME pissed me off for good, Linux wasn't exactly known for being a daily driver, but I gave it a try and, unsurprisingly, it did become reliable over the years. Other than Gnome's propensity to make stupid changes to default settings, I can't remember the last time I had to even think about messing with the underlying system, and other than a simple Google search on the Linux compatibility of hardware before I buy, I just don't think about it. Actually, I take that back: when I first got my current laptop I was messing around to get the AMD Mesa drivers (or whatever) working because I wanted to mess around with this fancy GPGPU thing.
Personally, if I were to buy a macbook it would be for the OS and not dodgy linux support because I've walked that road before. If the Christmas sales were just a tiny bit better though...
I am talking about the lack of pretty standard features, not bugs. Having more users would not help there. And in general, you don't need a huge influx of users, and you definitely don't get as much help from users who are not going to put at least some effort into the feedback they give. You want users who are conscious enough about what they are using to give useful feedback and/or support with donations. I am pretty sure that some people are still attracted to running an experimental version of Linux.
Imo the modern Linux experience is much better than the situation you describe, at least as long as you use a certain type of hardware. In the past it was definitely harder. But wrt Asahi, I want the "luxury" of using an external monitor with my 13" MacBook Air, and sadly, while on past x86 machines where I put Linux I could put in some effort and get the AMD Mesa drivers to work, I cannot do that here. I respect the effort put into the Asahi project, but calling it suitable as a daily driver is misleading, unless you specify exactly what sort of daily driving you mean. Stuff like using an external monitor is pretty basic in my book of daily usage.
I work for a company that is open source and has a large community. It blows my mind (and often aggravates me) how rude some people can be.
For some reason people feel that it is appropriate to throw barbs in their issue reports. To everyone out there: if you find an issue and want to report it (hurray, open source!), please be kind with your words. There are real people on the other side of the issue.
Always remember, you catch more flies with honey than vinegar.
> It blows my mind (and often aggravates me) how rude some people can be.
That seems to be a general characteristic. I strive to be cheerful and helpful whenever I'm asking for something. I feel like (sadly) it sets me apart from the crowd and helps me to get what I'm asking for. And IAC, with so little effort on my part I may brighten someone else's day, and that makes me happy.
Just last week I asked housekeeping at a hotel for an old-style coffee pot since I had brought my own coffee and filters. I started with "Can I pester you a moment?" and the conversation went up from there. Housekeeping was extremely friendly and helpful. Later I guessed my opening might have disarmed some of the typically hostile interchanges she's borne the brunt of.
I always feel like I'm imposing, and I have to remind myself that there are people who are eager to hear what I have to say. I try to set up my issue reports with appropriate background, and I always volunteer to, for example, submit a PR for a documentation change if the resolution requires it. And I have had some of the most wonderful interactions with complete strangers who had an idea, built a tool for themselves, and found other people had the same need.
There's a broader topic of ... just be nice to people. It doesn't cost anything. It does reassure me that this universe has been struggling with this for decades upon decades--witness the Malvin and Jim scene in WarGames. "Remember when you told me to tell you when you were acting rudely and insensitively?"
I think I kind of get it. By the time someone actually gets to the point of filing an issue report, they are at the end of their rope. They have tried everything they can think of. They have googled and found no one else having the same problem, or fixes that don't work, or people saying "why would anyone need that feature". They feel like they're being gaslit, their time is being wasted, and that the developers are intentionally antagonizing them. And then the form to submit the issue has way too many fields and comes across as very adversarial.
That's certainly how I felt when trying to get my drawing tablet to work properly under Linux Mint, although in my case I skipped filing an issue and just gave up and went back to Windows.
It sounds like he got too invested in what people wrote.
> “Asahi is useless to me until I can use monitors over USB-C” “The battery life sucks compared to macOS”
These are not even requests. These are objective statements he can either take note of for prioritisation or ignore. I can also say Asahi is useless to me until USB-C monitor support lands, but that's just my situation - there's no bad faith or request here. Previously the same was true of WiFi support.
I wish there was some good model for maintainers of bigger projects to deal with this on a personal level. The bigger the project, the more people there will be with unmet requirements and that's just life. It literally can't be solved.
I had a similar thought. The tone of the messages was a little rough and they definitely could have used some better tact knowing that the project developers would see it, but ultimately those are just factual statements delivered with brutal bluntness.
Right, exactly. They're not tactful, but they also weren't in bad faith. Marcan should have taken 5 minutes to realize that he's the boss, he's the one doing the work, nobody is entitled to anything in free software, and that if people want a feature sooner they can either fund the project or kick rocks. Anyone who's been in open source for over 2, definitely 3, years understands this.
> I miss having free time where I can relax and not worry about the features we haven’t shipped yet. I miss making music. I miss attending jam sessions. I miss going out for dinner with my friends and family and not having to worry about how much we haven’t upstreamed. I miss being able to sit down and play a game or watch a movie without feeling guilty.
This is the big problem really. He should have just turned down his work hours to a regular 40 a week, asked for more donations to pay more people and asked for more volunteer help. And honestly, probably therapy.
This is the same person that resigned as a kernel maintainer (focus on Apple/Arm unsurprisingly) about a week ago.
I don't know this person so this is completely baseless speculation but I assume they are "going through it" in some way and experiencing significant burnout, which based on my own experience in the past has a way of (negatively) amplifying all sorts of interactions that are related to the source of your burnout.
Basically, making linux work on Apple hardware is a pretty hard task, including a shitload of reverse engineering.
When a user decides to try it and finds a lot of features missing, they are completely unaware of the work required to get it into that state, and just assume the features should be readily available.
> This sounds so rough. I can't imagine pouring your heart out into this labor of love and continue to have to face something like this.
Or: he shouldn't steal people's time with false advertising :shrug:
Also, if he wants to create an operating system, then these aren't even requests, but bug reports. So the users ate his false advertising, spent time trying out his system, then spent some more time filing bug reports, and then he calls them "entitled users".
I can't then imagine what his problem is. I don't get offended by people who can't even read. I don't normally call them people, let alone entitled :\ Set up a bot that links them to the device support page, and problem solved? I don't get it.
> It's comments like these that causes people to wear out.
No it isn't. You - fundamentally - don't get to control what people say to you. You need to filter how you take it. And that's incredibly hard. Especially in open source. You need both to be able to ignore (some version of "idiots who can't be bothered to read") and to be open-minded enough to take weird requests, because they could be the starting point of a new major contributor. The second is optional, as long as you are happy just doing your own thing, but then the former probably won't become a problem for you.
I know it's pretty pointless to argue because we see the world in different ways. But realize that the (quoted) requirements are ones you are putting on the open source developer.
I'd argue I'm not putting any requirement on the developer, I'd argue I'm making a statement of fact. Namely
> A developer without these skills will burn out.
And I think that's something that should be said more directly. If you want to do open source (as in become the provider of load bearing infrastructure): Then you really need to realise what you are getting yourself into. Would I like that to be different? Sure. Would I bet on that changing? Absolutely not.
And yes, that absolutely means you can either do open source as a hobby - in which case nobody should ever be willing to rely on the thing you are building (because you can just say "I've got better things to do than fix the security bug you found") - or you can attempt to get other people to use and rely on it, but then you have to find a way not to burn out.
Open source attracts some of the very worst users. Often people pretending to help by "suggesting improvements", but just as often entitled people who want you to work for free. I don't think policies will change that. It's just something you have to accept when you provide something useful to lots of people for free. Even if you use moderated environments for user feedback (adding the burden of constantly banning people), people will find your email address and complain to you directly. See also: the jwz/xscreensaver/Debian drama. Seeing how people treat open source developers makes me hesitant to upload any code I write to a public repository.
I'd expect the worst part for an Asahi project contributor to be the active sabotage some angry Linux kernel devs are trying to pull because they don't like Rust. Users being unreasonable is one thing, but your fellow maintainers are supposed to be allies at least.
I hope Marcan can find a new project to take on that doesn't involve all of this mess.
> Open source attracts some of the very worst users
I don't think it's even just that, it seems to be something about the price.
I work on a piece of closed-source free software, and we consistently get support requests from unbelievably entitled assholes. The worst of them are the ones that have some technical knowledge; they will not only demand things be fixed or implemented, they make completely erroneous statements about how easy it would be to fix/implement, with the conviction that they are 100% correct and a level of arrogance that makes it impossible to fathom how they could have written their email with a straight face.
The support requests we receive for a paid offering from the same company come, 99% of the time, from much more pleasant people (of course there is the occasional "I PAID FOR THIS YOU MUST FIX IT!!!1!", but they're a definite minority).
I think I've said this before, but 'free' seems to attract the worst of humanity.
When I want to give something away, I list it for some nominal fee like $10, then just tell them to keep it. Because when I used to list things for free, I got the dregs of society bothering me. Asking for delivery, asking me to hold it for 3 months til they can find a truck, cussing at me for saying no to both of these, cussing me because I sold it to someone else already, telling me long sob stories to guilt me. I've never had any of that happen when asking for money (except one guy wanted me to deliver it for $20, which was a fair-ish offer).
I wonder if that same 'pay but you'll get it back under the table' model could work for software? At least until the word got out, I guess.
> I hope Marcan can find a new project to take on that doesn't involve all of this mess.
The only way to do that is to never collaborate with anyone else. I hope he'll someday be able to process what happened and why, and reach appropriate conclusions. Software development is a social activity, especially with relatively high-visibility projects like Asahi, and it comes with just as much of the usual burden of social troubles as any other kind of social activity.
> Software development is a social activity, especially with relatively high-visibility projects like Asahi, and it comes with just as much of the usual burden of social troubles as any other kind of social activity.
Yes.
> The only way to do that is to never collaborate with anyone else.
Not necessarily. You can also treat project politics and social skills like any other technical skills that you need on your team like network engineering or database optimization.
If you can find trusted collaborators with those social and political skills, you can make a lot of things happen without necessarily being very good at it yourself.
Team building has a lot of parallels with building a full stack technology. Or building a sports team.
It's true, but what I was responding to was "a project to take on that doesn't involve all of this mess".
The real answer is to either learn these skills or, as you suggest, delegate them. Hoping to find something that doesn't involve "all this mess" at all will be fruitless.
That's what I get with my software projects. People tell me that it sucks and I suck at code and other projects have it better, and don't forget to waste months of your time rewriting it in Rust, and don't you dare use unsafe all over your code (see: the actix drama)... sigh. But when asked to show their alternative they go silent. So as long as you keep being assertive this is fine. For everyone who comes along and behaves like a drama queen you have to prove again and again that talk is cheap and code is how you get the job done. Or you simply ignore them.
In the early aughts, I spent a lot of time writing and maintaining Open Source software. I burned out on that because of rude users. I had one guy track me down offline and phone me at all hours to demand that I drop everything and fix a bug for him. When I pointed out that my day job came first because I have to pay bills, he went on an online screed accusing me of holding him hostage unless he paid for fixes and listing my cell number so people could "encourage me to be a better developer."
In those days, I was part of a core development team for a project with a fairly large community. A few bad users and a few bad development team members is all it takes to poison something like that.
Now I barely even contribute to Open Source projects even when I fix them for my own uses.
Depends on the project. I have found Pinephone users quite nice overall as a kernel developer.
Anyway, if your project involves convincing hundreds of maintainers to increase their cognitive/work load in order to include your fancy new foreign workflow breaking language into their project, you have to expect pushback.
> Open source attracts some of the very worst users.
This has not been my experience. Perhaps consider that the problem is not the users.
> the active sabotage some angry Linux kernel devs are trying to pull because they don't like Rust
On the other hand, users that demand you rewrite the project in their favorite language or otherwise accommodate their preferences over your own are pretty annoying.
> On the other hand, users that demand you rewrite the project in their favorite language or otherwise accommodate their preferences over your own are pretty annoying.
This whole post feels like typical burnout. Imagine porting something as complex as Linux to a platform whose creators actively do not want Linux ported to it. Of course you will burn out eventually. Not to dismiss his experiences, but I wonder if there is some deflection going on here - burnout was happening anyway, but blaming others is a good smoke-screen.
>"I can’t even check my CPU temperature” (yes, I seriously got that one"
Actually if this distro is my primary / only one I would like to be able to check CPU, GPU, etc. temperature. It is important to know if cooling is adequate or requires cleaning / repair.
In any case Marcan would be way better off having thick skin. Users will always be assholes (well same is generally true about vendors).
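For context, on a typical Linux system those temperatures are exposed through the kernel's hwmon sysfs interface, so anything from lm-sensors to a few lines of code can read them; presumably the complaint was that Asahi hadn't wired those sensors up yet. A minimal sketch in Rust, assuming the standard hwmon layout:

    // Read every hwmon device's first temperature sensor.
    // tempN_input files report millidegrees Celsius.
    use std::fs;

    fn main() -> std::io::Result<()> {
        for entry in fs::read_dir("/sys/class/hwmon")? {
            let dir = entry?.path();
            let name = fs::read_to_string(dir.join("name")).unwrap_or_default();
            if let Ok(raw) = fs::read_to_string(dir.join("temp1_input")) {
                if let Ok(millideg) = raw.trim().parse::<i64>() {
                    println!("{}: {:.1} °C", name.trim(), millideg as f64 / 1000.0);
                }
            }
        }
        Ok(())
    }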
Why not tag it as pre-alpha, not suitable for daily use? Calling it the smoothest Linux experience on one hand while expecting people not to expect basic features of the hardware to work... how does that work?
"Heavily under development and not ready for prime time use" should have been the first line of the README and the only reply to such feature requests.
So it sounds like they bit off more than they could chew.
I have been maintaining open source projects, and really: users of open source projects suck. They get your work for free, but it's not enough; they have to be assholes on top of that.
I would say it's more the case that the users who suck are the loudest, so they seem more numerous than they are. If you get 10 people saying thank you and one person cussing you out, it might still ruin your day. And of course a lot of people just quietly use the thing and are happy with it, and you never hear from them at all.
People complain about things that they care about. People also don't usually have as much tact as we would like them to.
I think the best way to deal with this is to just confidently say what you are and are not ready to get done. The social dynamic will always be this way, so we may as well take whatever criticism is useful, leave the rest behind, and move on.
"Never ever give away anything for free if you intend to support it" is evergreen advice.
Selling ads? Using it as a gateway to a commercial product? Selling support? Have some genius business plan that allows you to make money in the future? Fine, give it away no strings attached, but expecting that users will be grateful is a mistake developers keep repeating. The free users are just as entitled - even more entitled, as they don't have a price tag for your efforts and don't have a document specifying what your obligations are, so they can assume any scope of entitlement they wish.
Since you gave it away for free, you can't refund an unhappy customer to make them go away. If it looks like a product, you will be stuck with people who think they did their part by using your product and you failed them. Some may make it a full-time job to take revenge for this injustice.
I'm not even sure that these users are at fault; you actually took something in exchange (like fame, street cred, etc.) and you are not delivering your part.
Paying users can be incredibly entitled, sometimes even more than people who don’t pay you a dime. The problem is the moment you accept a cent people expect you to do work for them, regardless of whether the money is actually “worth” how much effort needs to go into a feature. The open source projects I’ve worked on get donations but sometimes people will put up like $10 for their pet feature which takes a week to write. Like, thanks for your contribution, but this actually doesn’t affect my priorities at all.
From what I've seen of his grandstanding on the LKML, suggesting bullying people on social media, I've lost all respect for the guy. He is a person in power, considering all his social media clout, and this is how he uses it. I'm glad he realizes that it's time to sit back and reflect. And I don't mean that in a disrespectful way. He will be of much more use to the community, and more importantly to himself, after confronting his ego.
There are two types of VIP developers: those that stay in the shadows and do their work (think the Bram Moolenaars and Daniel Stenbergs) and those that seem to spend their entire time picking fights on social media and writing very emotionally charged blog posts that routinely reach the HN front page, because gossip and drama sells.
You gotta have super thick skin to be a maintainer of an opensource project or even be popular on the net these days. Folks are going to come for you for whatever reason, if you read too much into it you're going to have a bad time.
Apple users today are just Windows users with even more entitlement.
Wasn’t always like this, I think. Personally I have seen the same with other projects, and dealing with proprietary Apple APIs and their walled garden is hard enough.
A person can be in a tough spot personally and then things seem to spiral out of control around them because that just cannot be 100% isolated from professional stuff or other spheres of life. It seems like this might have happened to Hector based on the post. We've all been there and that part is completely understandable.
> I get that some people might not have liked my Mastodon posts. Yes, I can be abrasive sometimes, and that is a fault I own up to. But this is simply not okay. I cannot work with people who form cliques behind the scenes and lie about their intentions. I cannot work with those who place blame on the messenger, instead of those who are truly toxic in the community.
The abrasiveness though is the reason people react that way. Not everyone is going to respond with "hey that was abrasive, that's not how we do things, here is a better way to phrase it". The majority will simply shut down or start forming cliques in the background. I can't completely blame them either. Here is Hector threatening to launch a shaming social media campaign on kernel devs:
"If shaming on social media does not work, then tell me what does, because I'm out of ideas."
That's not ok, even if he feels he is right and they are wrong. People will create cliques and talk behind your back if you act that way. People will look at the Rust community after this and say "Remember that time when _they_ were threatening kernel devs with social media drama?". It's not right but that's the perception that will last.
> People will look at the Rust community after this and say "Remember that time when _they_ were threatening kernel devs with social media drama?". It's not right but that's the perception that will last.
Happened with actix, happened with serde, and now being threatened by kernel contributors. The perception seems at least somewhat based in reality.
There was plenty of indefensible behavior in the Actix debacle, but the reason it blew up was because the maintainer was genuinely wrong and was being a jerk on top of it. The sequence of events was:
1) Issue found by Shnatsel
2) Issue closed as harmless to users by fafhrd91
3) Issue proven harmful to users by Nemo157 and reopened by JohnTitor
4) Issue fixed and closed by fafhrd91
5) Issue proven unfixed and proposed new patch by Nemo157
6) New patch commented "this patch is boring" by fafhrd91
7) Issue is deleted
8) Fix is reversed by fafhrd91, issue still present
A maintainer that rejects a fix for an issue that was proven harmful to users on the basis that it was "boring" and then deletes the issue is a bad maintainer. Death threats and abuse were definitely not the right answer, but public criticism is not unreasonable in such a case. If it were just a hobby project and advertised as such then that would be one thing, but he plastered info about how it was used production by a bunch of big companies on the website. That is not how someone who calls their code "production-ready" acts.
I'm not sure what you're arguing. Are you saying that because the Actix maintainer was "a bad maintainer" that the community shouldn't be held accountable for harassing him?
Rust, which is a language I really enjoy, generates more social media outrage and religious wars than any other technical project I have been following for the past 20 years.
> more social media outrage and religious wars than any other technical project I have been following for the past 20 years.
It is unfortunately wrapped up in larger-scale outrage culture than just within tech/programming circles. Rust as a community is very gay and very trans:
To be clear I am 111% down for that as one of the Alphabet People myself lol. We just can't pretend like it isn't a factor.
Disclaimer: I realize these numbers are probably skewed high due to self-selection of people who are willing to take diversity surveys. The actual percentages are probably somewhat lower, but Rust undoubtedly has the highest concentration of any programming-language community. Zero question.
Personally I'd go with the "biggest worries graph" for an explanation as to why I avoid Rust like the plague. If half of all respondents say that it's not used enough, the corollary they seem to have derived is "let's force it everywhere so it does get used more". Meanwhile, Forth people are hacking away at building a GUI in 300 bytes on a mailing list that's been open since the '80s.
I know which of the two languages was easier and more pleasant to hire for - which should be impossible, as I kept getting told no one uses Forth.
“The majority of those who consider themselves a member of an underrepresented or marginalized group in technology identify as lesbian, gay, bisexual, or otherwise non-heterosexual. The second most selected option was neurodivergent at 46% followed by trans at 35%.”
Out of 14.5% of the respondents. I wouldn’t call that a very anything community.
You may be misreading these numbers. It’s effectively 7-8% of the respondents who identify as non-heterosexual, which seems roughly in line with the general population (e.g. [0]).
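Spelling out the arithmetic (assuming "majority" means roughly 50-55% of that 14.5% subgroup):

    0.145 * 0.50 ≈ 7.3%    and    0.145 * 0.55 ≈ 8.0%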
That is assuming everyone who is in one of those groups sees themselves as marginalized in the first place. IMO it would've been better to not have that first question and just ask people which group they identify with.
That’s a fair point, though I would be surprised if it made a huge difference. The actual survey question was: “Do you consider yourself a member of a group that is underrepresented or marginalized in technology?” I would expect most members of a minority to consider themselves underrepresented just by virtue of being a minority. And as the choice of groups was explicitly given, I assume that the survey allowed to view the groups and then go back to answering the first question.
I'd expect that as well, but we can't really assume that.
Being a minority does not make you underrepresented. Underrepresented means there are fewer than you'd expect, given population-level numbers. In the Rust community it certainly seems true that trans people are overrepresented, though "marginalized" almost certainly still applies. The same goes for LGB, which again does not seem underrepresented in the tech community compared to society writ large, and I think many LGB people probably don't see themselves as "marginalized" in 2025, but I could be wrong.
> I would expect most members of a minority to consider themselves underrepresented just by virtue of being a minority.
I don't see why that would be the case.
And for tech in particular I'd say women (half the population) are underrepresented and LGBT (a definite minority) are not. Marginalization is a bit more complicated but similar.
> Rust undoubtedly has the highest concentration of any programming-language community. Zero question.
How can you know this? What other communities even have such surveys?
I would expect this to be similar in any language. Anecdotally, I see the % of gay/trans/neurodivergent to be much higher in the dev community than the general population, so the numbers don’t look strange to me.
Perhaps it’s more vocal or more visible, but that would require much more analysis to enquire about cause and effect.
> The actual percentages are probably somewhat lower, but Rust undoubtedly has the highest concentration of any programming-language community. Zero question.
This is complete nonsense. We (LGBT) folk are pretty much equally represented in all programming communities. It's just that Rust presents as a very socially activist community, with all the attendant drama and culture war nonsense, including falsely claiming some sort of imprimatur from the LGBT folk to represent them. Cliquey hyper-online gays != the LGBT community.
Fortran, Erlang/OTP, any stack you can think of, will have LGBT devs. Common Lisp has some kickass trans devs. It's not a proliferation of rainbow flag emojis and obnoxious puerile cancel-culture politics that makes one community be 'more' LGBT than another. I won't stand for this kind of erasure of LGBT folk who don't take their assigned place in the culture war barricades.
Rust is a very neat language, but the biggest single barrier to its adoption is the Rust community, and I won't have them hijacking my identity to pretend some moral title to their constant - and deeply unpopular - online brigading, bullying, etc.
Please enlighten me as to why this is even a question on a language survey. I take many language surveys when they show up, and Rust is the only one asking this.
at best this is a case of correlation not causation. but also given that lgbt folks are fairly common in all programming communities, i have a hard time believing these two are related and this is not something more specific to rust's culture
It must be said that from an outsider's point of view, in quite a few respects it very much sounds like a cult.
Get an HN article about C++, and you can be certain the comment section is going to deteriorate at some point into a religious war mentioning Rust. Get an article about Rust, and there is going to be drama in the comments.
As a programmer that could potentially consider Rust, it is off-putting.
My experience is the opposite: I never see those comments chiming in about Rust every time another programming language is mentioned, or pushing to rewrite everything in Rust, but I do see comments complaining about such invisible forces.
This has been my experience too. There were a couple of years (like 2016-2018) where I saw a handful of people pushing the RIIR line, but I've seen many more people for many more years complaining about how those RIIR people are everywhere. Any time there's an article on Phoronix that mentions Rust, the trolls come out in droves to whine about how toxic the Rust community is and how they make everything political, while the only people in those threads making anything political are the anti-Rust trolls themselves.
It’s like how people complain about Apple users. You see more threads complaining about how annoying those Apple fanboys are than you see actual Apple fanboys being annoying.
I mean... Every HN thread with Apple discussion is pretty annoying. It might not be annoying to you as an Apple user, but for many of us it's unbearable...
Get an article about Rust, and there are going to be comments about how dramatic and zealous the Rust community is, from people who never use Rust. Regardless of its content.
So yeah, typical internet holier-than-thou reactions; I wouldn't read much into them.
> Get an article about Rust, and there are going to be comments about how dramatic and zealous the Rust community is, from people who never use Rust.
Of course: you don't have to use Rust to see this very post here on HN, and quite a few other similar ones. Are you saying people just imagine there's a lot of drama around Rust, or what? (That TFA here, and other similar posts, are all lies, or outright made up?) Because to me -- who never uses Rust -- it looks like a fact.
Which is a large contributing factor to why I probably never will, either.
You can also be certain when reading a Zig post you will see "Why would I use Zig over Rust?" or "Isn't Zig unsafe?"
They can't help but proselytize. It's like talking to my recently born-again Christian friend who can't help but steer every conversation to Christianity and recite scripture. It's infuriating.
Though TBH it very much feels like the cult of OOP that rocked the 90's. And look where that paradigm is now ...
> Though TBH it very much feels like the cult of OOP that rocked the 90's. And look where that paradigm is now ...
It's alive and well. Sure, Java-style OOP might not be, but that's mainly because it was never sensible OOP to begin with.
A bit like "Agile is dead" and everybody hating "Agile". Sure, what they hate is what's been pushed as "Agile" for the last decade or more: ceremony-over-flexibility Scrum, rigid sprints, "user story" as a synonym for "ticket", etc, etc.
Let's hope that it's just "Fauauxp" that, like Fauxgile, is about to be dead. ASAP.
There's something about Rust that draws Zealots (or draws out zealotry in people). It's not at Haskell's level, but there are several culty elements for the fanatics: secret knowledge, being 'chosen' or set aside from the ignorant plebians, and an unshakable belief in a form of rapture when the language will inevitably win when everyone realizes the superiority of monads/memory safety.
Hang on there: the serde issue drama would’ve happened in any other ecosystem and doesn’t quite belong in this list, because it was about shoving a pre-compiled binary into the supply chain.
(The actix drama was stupid IMO and is fair to criticize the community over tho)
> "If shaming on social media does not work, then tell me what does, because I'm out of ideas."
This is just an incredibly odd thing to say. It's so obviously out of line that it seems like someone's joking around.
The Rust community (generally-speaking) just can't see why people have a visceral reaction against them, independent of its technical qualities. In all my years, I've not seen anything like it.
Mightn't it just be that it's a newer technology, whose newness has attracted a younger crowd, and this happens to be part of younger culture right now, more broadly?
I started reading the article, having little background on kernel drama, and ended it thinking to myself, “Jesus, what did this poor guy do to deserve all this hate?”
Then I read the thread you linked and thought, “Oh. That.”
To be clear nobody deserves to be harassed or threatened, but Hector’s messages make it clear he is astoundingly good at making himself into a victim of injustice. When his messages mentioned “cancer” I immediately thought that meant another kernel dev told someone to get cancer or die of cancer or something, which would be completely unacceptable. He was using the word metaphorically to describe the way Rust is slowly making its way into the kernel, like a cancer growing.
How anyone (read: Hector) could think this requires CoC action is baffling to me. Insane language policing.
Is it possible to agree that having one’s work compared to cancer could be insulting but also that trying to publicly shame people about it isn’t the right response?
It seems that at the heart of the issue is the vision for the future of the Linux kernel.
One group believes it is Rust (progressives), one group doesn't believe that and wants to continue with C (conservatives).
If they cannot find a way to live at peace with each other, I think the only solution is for the Rust folks to start building the kernel in Rust and not try to "convert" the existing kernel to Rust piece by piece.
What would let them live in peace seems to be missing: a way that C kernel folks would not need to deal with Rust code.
At the core, the story is not that different from introducing new languages to a project.
You are introducing a new tax on everyone to pay for the new goodies you like, and those who are going to be taxed and don't like the new goodies are resisting.
Then entertain his question and tell us what is? Bringing people's attention to the matter to finally somehow resolve the situation is his last resort, after spending years trying to upstream even trivial patches. You can't have your cake and eat it too - you can't say you want Rust in the kernel and then sabotage any upstreaming efforts.
> Then entertain his question and tell us what is? Bringing people's attention to the matter to finally somehow resolve the situation is his last resort, after spending years trying to upstream even trivial patches.
When upstream won't work with you, the answer is to maintain a separate tree. Yes, it's a lot of work to maintain a separate tree. No, you won't get as much use if you're in a separate tree.
I don't think that they owe it to him, but I do think it's shitty to string people along for several years without merging their code, often refusing to even review their code, without giving any technical reasons and while behind the scenes straight-up conspiring to sabotage their efforts.
I mean, that's the kind of abusive dynamic I'd expect from a horrible corporation: stringing along underpaid or unpaid interns for several years and refusing to hire them at the end of it without giving any actual feedback.
> Then entertain his question and tell us what is?
In this particular case, Hector himself with the blog post hints at it, but a lot of damage has been done already: "I am working on personal issues currently, I'd like to step back for a while and will not be contributing. Thank you, all".
> Bringing up people’s attention to the matter to finally somehow resolve the situation
Not everything has a clear and fast resolution. I think Hector's team were hoping the resolution would be "Shut up everyone, we're doing Rust now, this is all merging in and that's that!". But it could have been "Shut up everyone, we're not doing Rust any longer". They would have been even more upset, saying "this is a leadership failure, they're on the wrong side of history" and so on.
> you can’t say you want rust in the kernel and then sabotage any upstreaming efforts
Two wrongs don't make a right though. Call people out and ask them to explain their position; get others on your side. But threatening to drag their names all over Bluesky or X or Reddit or whatever the latest thing is, is not productive; if anything, it's counterproductive.
> Not everything has a clear and fast resolution. I think Hector's team were hoping the resolution would be "Shut up everyone, we're doing Rust now, this is all merging in and that's that!". But it could have been "Shut up everyone, we're not doing Rust any longer". They would have been even more upset, saying "this is a leadership failure, they're on the wrong side of history" and so on.
I'd argue that we're basically at the point where that _is_ what the de facto policy is, except without it being actually stated. There's a subsystem maintainer blocking any Rust code from being merged (even to be imported as a dependency from outside their subsystem) who said they will do "everything in their power" to stop Rust from being merged into any part of the kernel, and when people asked Linus to clarify whether he still thought it was viable to have Rust in the kernel, he said nothing. Hector made the infamous comment about social media, and _then_ Linus stepped in to say that we needed technical debate rather than social media brigading, which gives the not-so-great precedent that invoking social media was actually more effective at getting some sort of response than the technical debate that he actually said he wants. So now, the status quo is that someone with the power to completely block any progress towards actually including any amount of Rust in the kernel will presumably continue to do so, but Linus still is sticking to the line that we can have "technical debate" about it even though the outcome is predetermined to end in failure.
You're right that not everything has a clear and fast resolution, but given that the only ways for this to end, other than just making the "no Rust in the kernel" policy explicit, are either for Linus to overrule the maintainer blocking any Rust code from being merged, or for every single patch containing any Rust code to be blocked, it seems pretty clear to me that the way things are now is just a slower, less clear version of the negative outcome, so having a clear and fast resolution with an undesired outcome would be far better. This seems like the real cause of Hector's frustration; it's hard not to feel like this path to "resolution" was picked over just admitting that it's essentially official policy that Rust isn't allowed, for reasons that are ultimately purely social rather than technical. The correct resolution in my opinion would be for Linus to say something like "regardless of my opinion on whether Rust should be allowed in the kernel, I'm not willing to overrule the decision of the subsystem maintainer in this case, so the current status quo will remain unless someone is able to convince people to merge things on their own". My best guess for why he didn't want to do that is that it would essentially paint a target on any maintainers refusing to merge Rust code, which is understandable but seems like it will just cause more frustration in the long run than simply acknowledging the reality of the current situation.
> which gives the not-so-great precedent that invoking social media was actually more effective at getting some sort of response than the technical debate that he actually said he wants. So now, the status quo is that someone with the power to completely block any progress towards actually including any amount of Rust in the kernel will presumably continue to do so, but Linus still is sticking to the line that we can have "technical debate" about it even though the outcome is predetermined to end in failure.
It's true, but it sort of assumes that Linus is an automaton, like a corporation: if you threaten social media drama, then he'll respond. The problem is that it feels he was forced to respond, and he didn't really like kernel devs being part of the social media drama. So he responded, and in a strange way he emerged as the calm voice of reason. And it left a long unpleasant memory in the community.
> it will just cause more frustration in the long run than simply ending acknowledging the reality of the current situation.
Sadly, I think that's what will happen.
> I'd argue that we're basically at the point where that _is_ what the de facto policy is, except without it being actually stated.
It does seem that way, I agree with you, but I think this made it worse as you highlighted already. So it was an uphill road, but now the hill got steeper and taller. Another way this could have played out is Hector wrote a blog post saying "I am having personal issues, I am frustrated, I am stepping down". Let people figure out more details. But getting into a public spat with Linux devs was not productive for his and his team's goals. He hurt his team (Rust + Asahi) more than he helped in the end.
Marcan links to an email by Ted Ts'o (https://lore.kernel.org/lkml/20250208204416.GL1130956@mit.ed...) that is interesting to read. Although it starts on a polarising note ("thin blue line"), it does a good job of explaining the difficulties that Linux maintainers face and why they make the choices they do.
It makes sense to be extremely adversarial about accepting code because they're on the hook for maintaining it after that. They have maximum leverage at review time, and 0 leverage after. It also makes sense to relax that attitude for someone in the old boys' network because you know they'll help maintain it in the future. So far so good. A really good look into his perspective.
And then he can't help himself. After being so reasonable, he throws shade on Rust. Shade that is, unfortunately, just false.
- "an upstream language
community which refuses to make any kind of backwards compatibility
guarantees" -> Rust has a stability guarantee since 1.0 in 2015. Any backwards incompatibilities are explicitly opt-in through the edition system, or fixing a compiler bug.
- "which is actively hostile to a second Rust compiler
implementation" - except that isn't true? Here's the maintainer on the gccrs project (a second Rust compiler implementation), posting on the official Rust Blog -> "The amount of help we have received from Rust folks is great, and we think gccrs can be an interesting project for a wide range of users." (https://blog.rust-lang.org/2024/11/07/gccrs-an-alternative-c...)
This is par for the course I guess, and what exhausts folks like marcan. I wouldn't want to work with someone like Ted Tso'o, who clearly has a penchant for flame wars and isn't interested in being truthful.
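On the editions point, a minimal sketch of how the opt-in works (a hypothetical crate, not anyone's real code): `async` only became a keyword in the 2018 edition, so a crate that declares `edition = "2015"` in its Cargo.toml keeps compiling on current compilers:

    // Builds with a current rustc as long as Cargo.toml says edition = "2015";
    // under edition 2018 and later, `async` is a keyword and this is a
    // compile error. Each crate opts in to a newer edition explicitly.
    fn main() {
        let async = 42; // plain identifier in the 2015 edition
        println!("{}", async);
    }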
> And then he can't help himself. After being so reasonable, he throws shade on Rust. Shade that is, unfortunately, just false.
Many discussions online (and offline) suffer from a huge group of people who just can't stop themselves from making their knee-jerk reactions public, and then never thinking about it again.
I remember the "Filesystem in Rust" video (https://www.youtube.com/watch?v=WiPp9YEBV0Q&t=1529s) where there are people who misunderstand what the "plan" is, and argue against being forced to use Rust in the Kernel, while the speaker is literally standing in front of them and saying "no one will be forced to use Rust in the Kernel".
You can literally shove facts in someone's face, and they won't admit to being wrong or to misunderstanding; instead they'll continue to argue against some point whose premise isn't even true.
I personally don't know how to deal with this either, and tend to just leave/stop responding when it becomes clear people aren't looking to collaborate/learn together, but instead just wanna prove their point somehow and that's the most important part for them.
If you watch that YouTube link, you'll see the same guy, Ted Ts'o, accusing the speaker of wanting to convert people to the "religion promulgated by Rust". I think he apologised for this flagrant comment, but this email shows he hasn't changed his behaviour in the slightest.
His email seems very reasonable to me (the thin-blue-line comment is a bit weird though). To me the problem is that some Rust people seem to expect that the Linux maintainers (who put in a tremendous amount of work) just have to go out of their way to help them achieve their goals - even if the maintainers are not themselves convinced about it and later have to carry the burden.
How many times will this need to be said: the Rust maintainers have committed to handling all maintenance of Rust code, and handling all breakage of their code by changes on the C side. The only "burden" the C maintainers have to carry is to CC a couple of extra people on commits when APIs change.
> Then I think we need a clear statement from Linus how he will be working. If he is build testing rust or not.
> Without that I don't think the Rust team should be saying "any changes on the C side rests entirely on the Rust side's shoulders".
> It is clearly not the process if Linus is build testing rust and rejecting PRs that fail to build.
For clarity, tree-wide fixes for C in the kernel are automated via Coccinelle. Coccinelle for Rust is constantly unstable and broken which is why manual fixes are required. Does this help to explain the burden that C developers are facing because of Rust and how it is in addition to their existing workloads?
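For readers who haven't seen one: a Coccinelle semantic patch describes a tree-wide transformation declaratively. A minimal sketch (a hypothetical API rename, not a real kernel patch):

    @@
    expression dev;
    @@
    - old_register_driver(dev)
    + new_register_driver(dev)

Coccinelle applies a rule like this across the entire tree in one pass; the point above is that the Rust side has no equally mature equivalent yet, so the same API change means manual edits there.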
> For clarity, tree-wide fixes for C in the kernel are automated via Coccinelle. Coccinelle for Rust is constantly unstable and broken which is why manual fixes are required. Does this help to explain the burden that C developers are facing because of Rust and how it is in addition to their existing workloads?
Yes actually, I really wish someone would bring that sort of thing to the forefront, because that's a great spot to welcome new contributors.
They've said that, but nobody believes them, and can you blame them given we JUST saw another big rust maintainer resign?
I'd be suspicious that these guys aren't in it for the long haul, and that they'll leave the Rust they shoved into the kernel to bit-rot if things don't go their way w.r.t. enough Rust adoption fast enough. "If you don't let us add even more Rust, we will resign from the project and leave you to maintain the Rust that's already there and that we added, and that you said you didn't want to add because you didn't trust us not to resign".
Rust 4 Linux people just proving the points of the maintainers scared of abandonment.
The Rust for Linux people unfortunately give the impression of caring more about Rust than about the kernel, and many seem willing to make that perfectly clear by abandoning the larger project.
The whole thing needs to be scrapped and rethought with better and more committed leadership. This past 6 months to a year has been embarrassing and done nothing but confirm the fears of anti rust people.
> However, exceptionally, for Rust, a subsystem may allow to temporarily break Rust code. The intention is to facilitate friendly adoption of Rust in a subsystem without introducing a burden to existing maintainers who may be working on urgent fixes for the C side. The breakage should nevertheless be fixed as soon as possible, ideally before the breakage reaches Linus.
Yes, "breakage should be fixed as soon as possible". Not "Rust for Linux team will fix the breakage as soon as possible".
The exception is allowing the subsystem to break Rust code temporarily. If you accept a patch in C that breaks Rust code, and the Rust for Linux team doesn't fix it quickly enough, you either need to fix the Rust code yourself, remove it, or re-write it in C. All of this would take time and energy from all the non-R4L kernel devs.
This is why people are reluctant to accept too much mixing of the C and Rust codebases, because even the Rust for Linux team isn't promising to fix breakages in Rust for Linux code.
Just to be clear, this is the situation. A rando submits a C patch that breaks Rust code, a maintainer accepts this patch and then demands that the R4L devs fix the breakage introduced by someone else and reviewed by themselves. The rando who broke the thing isn't around, and the person who reviewed the change takes no responsibility.
Have I gotten that right?
And then you're presenting this situation as "the Rust for Linux team isn't promising to fix breakages in Rust for Linux code". Somewhat disingenuous.
That Rust won't cause extra work for C developers is exactly what people are claiming. This is from the comment I originally replied to:
> The Rust maintainers have committed to handling all maintenance of Rust code, and handling all breakage of their code by changes on the C side. The only "burden" the C maintainers have to carry is to CC a couple of extra people on commits when APIs change.
But this is not actually true, it seems. Even the Rust for Linux policy doesn't say this. But because of the incorrect statement that keeps getting repeated, people are calling kernel devs unreasonable for being reluctant to accept Rust patches.
> A rando submits a C patch that breaks Rust code, a maintainer accepts this patch and then demands that the R4L devs fix the breakage introduced by someone else and reviewed by themselves. The rando who broke the thing isn't around, and the person who reviewed the change takes no responsibility.
Well, firstly, "randos" aren't getting their patches easily accepted anyway.
And secondly, what's the problem with this? You want one of the following options:
1. Everyone who wants to submit a patch to also be proficient in Rust, or
2. The reviewer to also be proficient in Rust.
You don't think that's an unnecessary burden for the existing maintainers?
The burden should be on the people who want to introduce the second language.
This is not how the kernel works. You cannot rely on someone's "commitment" or "promise". Kernel maintainers want to have very good control over the kernel and they want strong separation of concerns. As long as this is not delivered, it will be very hard to accept the Rust changes.
At some level this is just concern trolling. There is nothing the Rust developers could possibly do or say that would alleviate the concern you've just expressed. You are asking for something that is impossible.
What could they possibly "deliver" beyond a strong commitment to fix the code in a timely manner themselves?
It is not concern trolling. It is a harsh disagreement.
Some kernel developers really do feel that any Rust in the kernel will eventually mean that Rust gets accepted as a kernel language, that they will eventually have to support it, and that the only way to prevent this is to stop any Rust development right now.
And yes, there's nothing that the R4L group can offer to get around that belief. There isn't any compromise on this. Either Rust is tried, then spreads, then is accepted, or it's snuffed out right now.
A big mistake by R4L people is seeing anti-Rust arguments as "unfair" and "nontechnical." But it is a highly technical argument about the health of the project (though sometimes wrapped in abusive language). Rust is very scary, and calling out scared people as being unfair is not effective.
There is nothing to deliver that would satisfy this argument. Pretending like the disagreement is about a failure of the R4L folks to do "enough" when in fact there is nothing they could do is toxic behavior.
If you go back digging in the LKML archives, Christoph's initial response to Rust was more of a "let's prove it can be useful first with some drivers"
That has now been done. People (particularly Marcan) spent thousands of hours writing complex and highly functional drivers in Rust and proved out the viability, and now the goalposts are being moved.
R4L people are allowed to get upset about people playing Lucy-with-the-football like this and wasting their f***ing time.
> There is nothing the Rust developers could possibly do or say that would alleviate the concern you've just expressed.
They could do exactly what Ted Ts'o suggested in his email [1] that Marcan cited: They could integrate more into the existing kernel-development community, contribute to Linux in general, not just in relation to their pet projects, and over time earn trust that, when they make promises with long time horizons, they can actually keep them. Because, if they can't keep those promises, whoever lets their code into the kernel ends up having to keep their promises for them.
Many of them have, in fact, done all of those things, and have done them over a time horizon measured in years. Many of the R4L developers are paid by their employers specifically to work on R4L and can therefore be considered reasonably reliable and not drive-by contributors.
Many existing maintainers are not "general contributors"
It is unreasonable (and a recipe for long-term project failure) to expect every new contributor to spend years doing work they don't want to do (and are not paid to do) before trusting them to work on the things they do want (and are paid) to do.
Christoph refused to take onboard a new maintainer. The fight from last August was about subsystem devs refusing to document the precise semantics of their C APIs. These are signs of fief-building that would be equally dangerous to the long-term health of the project if Rust was not involved whatsoever.
I disagree. If you want to provide technical leadership by massively changing the organization and tooling of a huge project that has been around a long time, it should be absolutely mandatory to spend years building trust and doing work that you don't want to do.
That's just how programming on teams and trust and teamwork actually works in the real world. Especially on a deadly serious not-hobby project like the kernel.
Sometimes you are gonna have to do work that doesn't excite you. That's life doing professional programming.
Everything Ted Ts'o recommended is just common-sense teamwork 101 stuff, and it's generally good advice for programmers in their careers. The inability of Rust people to follow it will only hurt them and doom their desire to be accepted by larger, more important projects in the long run. Programming on a team is a social affair, and pretending you don't have to play by the rules because you have such great technical leadership is arrogant.
> It is unreasonable (and a recipe for long-term project failure) to expect every new contributor to spend years doing work they don't want to do (and are not paid to do) before trusting them to work on the things they do want (and are paid) to do.
It is absolutely reasonable if the work they want to do is to refactor the entire project.
It's like saying to people that they cannot add, for example, an NPU subsystem to the kernel because they should first work for 10 years in other subsystems, like filesystems, which they know little about.
Sounds absurd? Just replace the subsystems above with C/Rust and the rest is the same.
Folks that maintain Rust are responsible for the Rust code; if they don't deliver what is needed, their Rust subsystem will fail, not the C codebase, so it's in their own interest to keep things smooth.
My feeling is that some people think C is the elite language and Rust is just something kids like to play with nowadays; they do not want to learn why some folks like the language or what it is even about.
I think it is the same discussion as when Linux people hate systemd: they usually have a single argument, that it's against the Unix spirit, and no others, without understanding why other people may like that init system.
> It's like saying to people that they cannot add, for example, an NPU subsystem to the kernel because they should first work for 10 years in other subsystems, like filesystems, which they know little about. Sounds absurd? Just replace the subsystems above with C/Rust and the rest is the same.
No it's not. What you're missing is that if the Rust folks are unable, for whatever reasons, to keep their promises, it falls on the up-tree maintainers to maintain their code. Which, being Rust code, implies that the existing maintainers will have to know Rust. Which they don't. Which makes it very expensive for them to keep those broken promises.
To look at it another way, the existing maintainers probably have a little formula like this in their heads:
Expected(up-tree burden for accepting subsystem X) = Probability(X's advocates can't keep their long-term promises) * Expected(cost of maintaining X for existing up-tree maintainers).
For any subsystem X that's based on Rust, the second term on the right hand side of that equation will be unusually large because the existing up-tree maintainers aren't Rust programmers. Therefore, for any fixed level of burden that up-tree maintainers are willing to accept to take on a new subsystem, they must keep the first term correspondingly small and therefore will require stronger evidence that the subsystem's advocates can keep their promises if that subsystem is based on Rust.
In short, if you're advocating for a Rust subsystem to be included in Linux, you should expect a higher than usual evidence bar to be applied to your promises to soak up any toil generated by the inclusion of your subsystem. It’s completely sensible.
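To make that trade-off concrete, here is a toy instantiation of the estimate above; all numbers are invented purely for illustration:

```latex
\[
  \mathbb{E}[\text{burden}]
    \;=\;
  P(\text{promises broken})
    \times
  \mathbb{E}[\text{cost} \mid \text{promises broken}]
\]
\[
  \text{C subsystem: } 0.3 \times 100\,\text{h} = 30\,\text{h}
  \qquad
  \text{Rust subsystem: } 0.3 \times 400\,\text{h} = 120\,\text{h}
\]
\[
  \text{Holding the burden at } 30\,\text{h} \text{ despite the } 400\,\text{h cost requires }
  P \le 30/400 = 0.075.
\]
```

In other words, a fourfold cost multiplier demands roughly a fourfold stronger track record before the expected burden looks the same.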
> What you're missing is that if the Rust folks are unable, for whatever reasons, to keep their promises, it falls on the up-tree maintainers to maintain their code.
But that's the thing: the deal was that existing maintainers do not need to maintain that code.
Their role is just to forward issues and breaking changes to the Rust maintainers in case those were omitted from CC.
You are using the same argument that was explained multiple times already in this thread: no one is forcing anybody to learn rust.
The point is that “the deal” assumes that the Rust folks will keep their promises for the long haul. Which kernel maintainers, who have witnessed similar promises fall flat, are not willing to trust at face value.
What if, in years to come, the R4L effort peters out? Who will keep their promises then? And what will it cost those people to keep those broken promises?
The existing kernel maintainers mostly believe that the answers to the questions are “we will get stuck with the burden” and “it will be very expensive since we are not Rust programmers.”
Isn't it the same as with support for old hardware? The Alpha arch, Intel Itanium, floppy drives?
Those are all in a similar situation, where there is no one to maintain them, as none of the maintainers have access to such hardware to even test whether it is working correctly.
From time to time we see that such a thing is discovered not to have worked at all for a long time without anyone noticing, and it is dropped from the kernel.
The same would happen to Rust if no one wanted to maintain it.
Rust for Linux is provided as an experimental thing, and if it doesn't gain traction it will be dropped, the same way curl dropped it.
The reason the maintainers can drop support for hardware nobody uses is that dropping support won't harm end users. The same cannot be expected of Rust in the kernel. The Rust for Linux folks, like most sensible programmers, intend to have impact. They are aiming to create abstractions and drivers that will deliver the benefits of Rust to users widely, eliminating classes of memory errors, data races, and logic bugs. Rust will not be limited to largely disposable parts of Linux. Once it reaches even a small degree of inclusion, it will be hard to remove without substantially affecting end users.
> You are using the same argument that was explained multiple times already in this thread: no one is forcing anybody to learn rust.
I think this sort of statement is what is setting the maintainers against the R4L campaigners.
In casual conversation, campaigners say "No one is being forced to learn Rust". In the official statements (see upthread where I made my previous reply) it's made very clear that the maintainers will be forced to learn Rust.
The official policy trumps any casual statement made while proselytising.
Repeating the casual statement while having a different policy comes across as very dishonest on the part of the campaigners when delivered to the maintainers.
The issue with systemd was that many people felt it was pushed onto them, whereas previously such things would just exist and get adopted gradually if people liked them. This model worked fine; e.g., there were many different window managers, editors, etc., and people just used what they liked. For init systems, distributions suddenly decided that only systemd was supported and left the people who did not want it out in the cold. It is similar with Rust. It is not an offer, but something imposed onto people who have no interest in it (here: kernel maintainers).
If users of other init systems don't want to make the substantial investment in maintaining support for those other init systems, then their complaints weren't worth much.
To start with: not resigning when things don't go their way. That tendency is doing a lot to make the Rust people's claim that they will handle the burden of Rust code unbelievable.
The standard procedure is to maintain a fork/patchset that does what you want and you maintain it for years proving that you will do the work you committed to.
Once it’s been around long enough, it has a much better chance of being merged to main.
That has already been the case with Asahi Linux - for years. It exists as a series of forked packages.
The thing is, you do still have to present a light at the end of the tunnel. If, after years of time investment and proven commitment, you're still being fed a bunch of non-technical BS excuses and roadblocks, people are going to start getting real upset.
However, it may only get merged in by being conceptually re-thought and reimplemented, like the Linux USB or KGI projects back in the day.
The general pushback against changes in Linux is against large, impactful changes. They want your code to be small fixes they can fully understand, or drivers that can be excluded from the build system if they start to crash or aren't updated for a new API change.
You can't take a years-maintained external codebase and necessarily convert it to an incremental stream of small patches and optional features for upstream maintainers, unless you knew to impose that sort of restriction on yourself as a downstream maintainer.
As a non-interested observer; I think it will need to be said until the commitment becomes credible. I don't know how it would become credible, but it's clearly not considered credible by at least those of the kernel maintainers who are worried about the maintenance burden of accepting rust patches.
> the Rust maintainers have committed to handling all maintenance of Rust code, and handling all breakage of their code by changes on the C side.
How have they "committed"? By saying they commit[1], I presume -- but what more? Anyone can say anything. I think what makes the "old guard" kernel maintainers nervous is the lack of a track record.
And yes, I know that's kind of a lifting-yourself-by-your-bootstraps problem. And no, I don't know of any solution to it. But I do know that, like Baron Münchhausen, you can't ride your high horse around the swamp before you've pulled yourself out of it.
___
[1]: And, as others in this thread have shown, that's apparently just out of one side of their collective mouth: The official "Rust kernel policy" says otherwise.
> "the thin-blue-line comment is a bit weird though"
In the US, "thin blue line" is a colloquialism for police officers, who typically wear blue and "toe the line." You should not be downvoted/shadowbanned/abused for your post, IMHO.
> Ted Ts'o accusing the speaker of wanting to convert people to the "religion promulgated by Rust"
Given the online temper tantrum thrown by marcan, Ted Ts'o's comment seems totally reasonable, regardless of one's opinion of Rust in the Linux kernel.
> tharne, on: Resigning as Asahi Linux project lead
> > Ted Ts'o accusing the speaker of wanting to convert people to the "religion promulgated by Rust"
> That seems totally reasonable. Putting aside the technical merits of the Rust language for the moment, the Rust community suffers from many of the same issues currently hobbling the Democratic Party in the United States. Namely, it often acts like a fundamentalist religion where anyone who dares dissent or question something is immediately accused of one or another moral failings. People are sick of this nonsense and are willing to say something about it.
It's really interesting that every time I open a thread like this, countless people come out swinging with the claim that Rust is totally a religion and a cult, while the rest of the thread is full of C evangelism and vague rhetoric about how nothing like this ever works, while actively contributing to making sure it won't this time either.
In 99% of the insufferable Rust vs. C interactions I've come across, it was the C fella being the asshole. So sorry, but no, not very convincing or "totally reasonable" at all.
> In 99% of the insufferable Rust vs. C interactions I've come across, it was the C fella being the asshole. So sorry, but no, not very convincing or "totally reasonable" at all.
This has also been my observation as a C++ developer who finds themselves in a fair few C/C++-aligned spaces. There are exceptions, but in most of those spaces the amount of Rust Derangement Syndrome I've witnessed is honestly kind of tiresome at this point.
Quite frankly, if I had the realization that, despite assurances to the contrary, my contributions to a project had been sabotaged for months or even years up to that point, I would also have had a hard time keeping a smile on my face.
This is ultimately what this drama comes down to: not whether Rust should or shouldn't be in the kernel, but kernel maintainers' broken promises and coyness about their intentions until there was no option left but honesty, with the reveal that whatever time and effort a contributor had put in was a waste from the start.
It seems like the folks who didn't want Rust in the kernel will be getting their way in the end, but I had better never hear another complaint about the kernel not being able to attract new talent.
I can't believe you're the first person I've found in this conversation who raises this issue. This is the exact reason why Marcan flipped his lid. Linus publicly championed a very technically complex initiative and then left all those contributors to the wolves when things didn't progress without a hiccup. Especially damning when you consider that at every step, the fief lords in Linux have seemingly done everything in their power to set the R4L people up for failure, and Linus hasn't so much as squeaked at them. He personally cut the knot and asserted that Rust is Linux's future, but he constantly allows those below him to relitigate the issue with new contributors (who can't fight back, because even though they're contributing by the supposed rules, they don't have enough social buy-in).
> my contributions to a project had been sabotaged
I really wish people would stop throwing around the word "sabotaged". No one "sabotaged" anything. The opposition has been public from the beginning.
If I'm opposed to something, and someone asks my opinion about it in a private conversation, it is not "sabotage" to express my opinion. So far I haven't seen any evidence that those opposed to a mixed-language code base organized behind the scenes to hamper progress in any way. Instead, their opposition has been public and in most cases instant.
Are people not allowed to be opposed to things anymore?
The distinction in this particular case is that Rust people tried to "downvote" an LKML discussion to get code merged. No one much cares about comment colors on HN, except to the extent that we're having this discussion (which you found valuable enough to contribute to!) in an invisible, flagged subthread because Rust people apparently don't want to have it.
It's just tiresome. And it boiled over here because, however enthusiastic the Rust people may be, their youthful exuberance pales in influence compared with the talent and impact of the Linux kernel maintainers. And the resulting tantrum shows it.
>The distinction in this particular case is that Rust people tried to "downvote" an LKML discussion to get code merged.
You're attributing to "the Rust community" an imaginary offense that did not actually happen that way and couldn't be attributed that way even if it did. And then you make claims about how "the Rust community" is toxic. Right.
We have used that video as an exercise in how not to achieve change. Assuming everyone was acting in good faith: the presenter missed the opportunity to build consensus before the talk, Ts'o was unwilling to budge a bit, but most of all the moderator was unable to prevent the situation from exploding. This could have been handled much better by each of them.
In contrast to the parent: yes, the presenter says "you don't have to use Rust, we are not forcing you," but he fails to address the concern that a change they introduce would break things downstream and someone else would have to clean up afterwards.
>In contrast to the parent: yes, the presenter says "you don't have to use Rust, we are not forcing you," but he fails to address the concern that a change they introduce would break things downstream and someone else would have to clean up afterwards.
He did not fail to address that concern. And then Ted shouted him down for two minutes such that he couldn't get two syllables in to respond.
Why would we assume that Ted repeatedly using strawman fallacies, bleating appeals to emotion, and acting like a victim, all the while shouting people down, is evidence of "acting in good faith"?
When you shout over someone like that you're nothing but a bully.
> he fails to address the concern that a change they introduce would break things downstream and someone else would have to clean up afterwards.
Because that "concern" was a strawman. It demonstrated that Ted either did not understand what the presenters were asking for, or simply didn't like others asking him to do something, because he's very important and nobody tells him what to do.
As has been exhaustively explained by others in previous HN threads and elsewhere: the Rust developers were asking to be informed of changes so that Rust developers could update their code to accommodate the change.
Ted loses his shit and starts shouting nonsense about others forcing people to learn Rust, and so on.
> but most of all the moderator unable to prevent the situation from exploding
When someone is being abusive to others, the issue is never "the people on the receiving end are not handling it as best they can."
Further: did it occur to you that Ted's infamous short temper, and his "status" as a senior kernel developer, might be why the moderator was hesitating to respond?
Imagine how Ted would have reacted if he had been told to speak respectfully, lower his voice, and stop talking over others. Imagine how the army of nerds who think Ted's behavior was acceptable or understandable would have reacted.
I don't understand how abusive bullies like Ted are allowed the privilege of being a senior kernel developer. This feels, in the end, like the fault of Linus, for allowing abusive maintainers to maintain their grip.
Linus was the original abusive bully maintainer, that's how. He's improved his personal use of language, but the culture that he initiated continues unabated. Linux's existing success as a project is used as evidence that it doesn't need any changes to the kernel maintainers' culture.
> As has been exhaustively explained by others in previous HN threads and elsewhere: the Rust developers were asking to be informed of changes so that Rust developers could update their code to accommodate the change.
I don't understand why you don't see this as "a really big deal". The C developers make a breaking change. They fix all the C code, then they write an email to the Rust devs explaining the changes.
Then the process of making the change stops, and the C devs have to wait for a Rust dev to read the email, review the C changes, fix and test the resulting rust, and check in the update. (including any review process there is on the rust side.)
Is it hours, days, or weeks? Are there two people who know and can fix the code, or are there hundreds? Do the C devs have visibility into the Rust org to know it's being well run and risks are mitigated?
This is adding a hard dependency on a third party organization.
I would never dream of introducing this kind of dependency in my company or code.
This is kernel development we're talking about. It progresses carefully, not at the breakneck pace of a continuous-integration SaaS platform that is single-minded about pushing features out as quickly as possible.
A better analogy would be like an API inside of a monolithic app that has multiple consumers on different teams. One team consumes the API and wants to be notified of breaking changes. The other team says "Nah, too much work" and wants to be able to break the API without worrying about consequences.
If having multiple consumers of an API or interface is a goal, you make communication a priority.
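A minimal, self-contained Rust sketch of that analogy (all names invented): one interface, two in-tree consumers on different teams, where any signature change surfaces in every consumer at compile time.

```rust
mod subsystem {
    // Changing this signature is the "breaking change" in the analogy:
    // both consumers below stop compiling until someone updates them.
    pub fn query(id: u32) -> u64 {
        u64::from(id) * 2
    }
}

mod team_a {
    // First consumer: wants to be notified of breaking changes.
    pub fn status() -> u64 {
        super::subsystem::query(1)
    }
}

mod team_b {
    // Second consumer: equally coupled to the same interface.
    pub fn status() -> u64 {
        super::subsystem::query(2)
    }
}

fn main() {
    println!("{} {}", team_a::status(), team_b::status());
}
```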
> "no one will be forced to use Rust in the Kernel"
Is this true, though? One reason for this altercation seems to be the basic circumstance that in Linux kernel development, if there is a dependency between two pieces of code A and B, the responsibility to keep B consistent with changes to A lies, in order, with anyone proposing patches to A, the subsystem maintainer for A, and finally the subsystem maintainer for B. If B is Rust code, such as a binding, then that's potentially up to 3 people who don't want to use Rust being forced to use Rust.
They're not "forced to use Rust". They are maybe forced to work with Rust developers of whichever subsystem needs to be updated, but that would always have been the case with the C developers of whichever subsystem needs to be updated too.
I don't think that is a correct interpretation. As I understand it, Linux does not have a notion of someone being obliged to help facilitate a patch, especially if it's not the narrow case of a maintainer and a patch to the subsystem they are in charge of. What do you do if you are a C developer modifying system A, your change has implications for system B which includes Rust code, and none of the Rust developers involved with B care to volunteer time to draft the necessary changes to Rust code for you?
The same situation of course also arises between C-only subsystems, but then the natural solution is that you have to go and understand system B well enough yourself that you can make the necessary changes to it and submit them as part of your patch. In that situation you are "forced to use C", but that's a free square because you are always forced to use C to contribute to Linux code.
>They're not "forced to use Rust". They are maybe forced to work with Rust developers of whichever subsystem needs to be updated
So if the maintainer of subsystem X can be forced to work with the rust developers of their own subsystem, then that rust developer just got promoted to co-maintainer with veto power. Effectively that's what they'd be, right? I can see why maintainers might not like that. Especially if they don't think the rust dev is enough of a subject matter expert on the subsystem.
If a subsystem C developer makes a change and introduces a bug in another driver or subsystem (also written in C) as a result, then you would expect them to be able to help at least insofar as explaining what they changed.
I've been in a spot kinda like this. I've maintained C++ with python interfaces. In my case I wrote both. I know how interlocked the changes were. If I touched code that was exposed to the python, I updated the python interface and the consumers of that python interface.
It was nothing like making changes that cut across into another developer's C++ code (hell, I would even update their python interfaces/consumers too). That was temporary coordination. The python part was much more frequent and required much more detailed understanding of the internal APIs, not just the surface.
Having someone else responsible for the python part would have come at a huge cost to velocity as the vast majority of my changes would be blocked on their portion. It's ridiculous to imply it's equivalent to coordinating changes with another subsystem.
It's absolutely not true, it's one of the lies being told by Rust 4 Linux people. The end goal is absolutely to replace every last line of C code with Rust, and that's what they will openly tell you if you speak to them behind closed doors. That's why there is always an implicit threat directed at the C maintainers about job loss or "being on the right side of history". The Rust 4 Linux people are absolutely attempting a hostile takeover and nobody should believe a word that comes out of their mouths in public mailing lists when they are contradicting it so consistently behind closed doors.
> You can literally shove facts in someone's face, and they won't admit to being wrong or misunderstand, and instead continue to argue against some points whose premise isn't even true.
This is like probably 80% of people and fundamentally why the world is a hellscape instead of a utopia.
The speaker doesn't understand the audience question and doesn't respond to it.
The audience member points out that they shouldn't encode the semantics into the Rust type system because that would mean that refactoring the C code breaks Rust, which is not an acceptable situation. The speaker responds to this by saying essentially "tell me what the semantics are and I'll encode them in the Rust type system." That's maximally missing the point.
The proposal would cause large classes of changes to C to break the build, which would dramatically slow down kernel development, even if a small handful of Rust volunteers agree to eventually come in and fix the build.
> You can literally shove facts in someone's face, and they won't admit to being wrong or misunderstand, and instead continue to argue against some points whose premise isn't even true.
I have to say that I used to be excited about Rust, but the Rust community seems very toxic to me. I see a lot of anger, aggression, vindictiveness, public drama, etc. On HN you not infrequently see downvoting used to indicate disagreement. These clashes with the Linux maintainers look really bad for Rust to me. So bad that I'm pretty convinced Rust as a language is over if they're no longer banging on the technical merits and are instead banging on the table.
I'm sure there are great things about the community. But I would encourage the community to have higher standards of behavior if they want to be taken seriously. The Linux team seem like they're trying to look beyond the childishness because they are optimistic about the technical merits, but they must be so tired of the drama.
> I have to say that I used to be excited about Rust, but the Rust community seems very toxic to me. I see a lot of anger, aggression, vindictiveness, public drama, etc.
I had the same impression.
Why is all this drama 90% of the time around Rust people?
Because many of them are still relatively young. There is nothing wrong with youth, but it can contribute to over-zealousness.
Also, for many of them, Rust is the first systems language they've ever touched. And that fact alone excites them. Because now they can "dream big" too.
But they have bought into the idea that C/C++ are insecure by default and therefore garbage. In their mind, no mortal could ever write so much as a single safe function in those languages. So their points of view are always going to be based on that premise.
What they fail to recognize is that an operating system kernel, by virtue of the tasks it has to perform (mapping and unmapping memory, reading and writing hardware registers, interacting with peripherals, initiating DMA transfers, context switching, and so on), has effects on the underlying hardware and the runtime environment; effects that neither the type system nor the temporal memory safety of Rust can model, because they happen at a level lower than the language itself. Rust's safety guarantees are helpful, but they are not infallible at that level. The kernel literally changes the machine out from under you.
They further fail to appreciate the significant impedance mismatch between C and Rust. When one language has concepts that are in fact constraints that another language simply does not have, there is going to be friction around the edges. Friction means more work. For everyone. From planning to coding to testing to rollout.
So you have well-intentioned, excited, but still self-righteous developers operating from what they perceive to be a position of superiority, who silently look down upon the C developers, and behave in a manner that (to outsiders at least) demonstrates that they really do believe they're better, even if they don't come right out and say it.
Just read the comments in any thread involving Rust. It is inconceivable to them that anybody would be so stupid or naive as to question the utility of the Rust language. To them, the language is unassailable.
Add the petty drama and social-media brigading on top of it, along with the propensity to quit when the going gets tough, and it's pretty easy to see why some people feel the way they do about the whole thing.
A programming language is not a religion. It is not a way of life. It is a tool. It's not like it's a text editor or something.
> But they have bought into the idea that C/C++ are insecure by default and therefore garbage. In their mind, no mortal could ever write so much as a single safe function in those languages.
No one thinks this except some strawman that you've devised. No point in reading anything else in this comment when this is so blatantly absurd and detached from reality.
All you have to do is read comments from members of the Rust community online, in every public forum where Rust is discussed in any way.
Understand, I am not trying to villainize an entire community of software developers; but for you to say something that's blatantly false is to just stick your head in the sand.
You should try and read the words people write. Opinions are not formed in a vacuum.
Edit: to be clear- I have no problems with Rust the language beyond some ergonomic concerns. I am not a Rust hater, nor am I a zealot. I do advocate for C# a lot for application code though. But I do not deride others' language preferences. You should not dismiss my observations because I used hyperbole. Obviously not every Rust dev thinks you can't write a secure C/C++ function; don't pick out the one hyperbolic statement to discredit my entire post. Bad form.
> The audience member points out that they shouldn't encode the semantics into the Rust type system because that would mean that refactoring the C code breaks Rust, which is not an acceptable situation. The speaker responds to this by saying essentially "tell me what the semantics are and I'll encode them in the Rust type system." That's maximally missing the point.
You have to encode your API semantics somewhere.
Either you encode them at the type system and find out when it compiles, or you encode it at runtime, and find out when it crashes (or worse, fails silently).
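As a hedged illustration of that distinction (all names hypothetical), here is the classic Rust typestate pattern: one version of an invariant lives in the type system and fails at compile time, the other lives in a flag and fails only at runtime.

```rust
use std::marker::PhantomData;

// Type-level encoding: only an initialized device exposes start(),
// so misuse is rejected at compile time.
struct Uninit;
struct Ready;

struct Device<State> {
    _state: PhantomData<State>,
}

impl Device<Uninit> {
    fn new() -> Self {
        Device { _state: PhantomData }
    }
    fn init(self) -> Device<Ready> {
        Device { _state: PhantomData }
    }
}

impl Device<Ready> {
    fn start(&self) {
        println!("started");
    }
}

// Runtime encoding: the same invariant as a bool, checked (or
// forgotten) only when the code actually runs.
struct DynDevice {
    initialized: bool,
}

impl DynDevice {
    fn start(&self) {
        assert!(self.initialized, "start() called before init()");
        println!("started");
    }
}

fn main() {
    let dev = Device::new().init();
    dev.start();
    // Device::new().start(); // rejected at compile time: no start() on Device<Uninit>

    let dyn_dev = DynDevice { initialized: true };
    dyn_dev.start(); // compiles either way; only the runtime check guards misuse
}
```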
I disagree; they didn't outright point this out, because it's nonsense. Semantic changes can break anything, even if it's some intermediary API.
There is more breakage in Rust due to the type-system-related semantics, but ideally a C dev would also want their system to break if the semantics aren't right. So is this a criticism of C...?
So, following this argument, they don't want Rust because C falls short? Nonsense.
edit: The speaker did mention that they didn't want to force limited use onto the base APIs, but that for a great deal of their usage they could have determined fixed semantics and built intermediary APIs for them. So this was not about limiting the basic APIs.
Here are the software requirements (inferred from the commenter):
- (1) the C code will be refactored periodically
- (2) when refactored internally it can break C code, but the change author should fix any breakage in C
- (3) Rust must not break when (1) happens
It's the Rust devs' job to meet those requirements if they want to contribute. It looks in the video like they don't understand this, which is pretty basic.
> You can literally shove facts in someone's face, and they won't admit to being wrong or misunderstand, and instead continue to argue against some points whose premise isn't even true.
I think that's part of the gag.
"These people are members of a community who care about where they live... So what I hear is people caring very loudly at me." -- Leslie Knope
It's quite a well-known piece of wisdom. I think someone at one of Nintendo's or Sony's studios has said it too, in the form of: a complaint is worth twice a compliment.
Satisfied customers will tell you they think your stuff is great, but dissatisfied customers will be able to hone in on exactly where the problem is.
You can even extend this to personal life: if someone tells you your shabby car doesn't fit with the nice suits you wear, you can either take it as a personal attack and get irritated, or take it as feedback and wash your car, spruce up the upholstery and replace the missing wheel cap. In effect they helped you take note of something.
One does not "hone in" on anything. To hone a thing is to make it sharper or more acute by removing parts of it with an abrasive. The word you are looking for is "home", as in a homing missile, etc.
Yes, this is a criticism. Hopefully it's twice as effective as being nice. 8)
Multiple dictionaries recognize the usage of "hone in" to mean "sharpening" your focus on something rather than "home in" which is to move towards something.
I went down a slight rabbit hole for this: apparently both are correct, although "hone in" doesn't seem to have a ground source and has gotten institutionalized in our lexicon over time.
By the way, I don't mind the nit at all! English is not my first language and I slip up occasionally, so refreshers are welcome :-)
You knew what they meant, which is clear if you’re able to correct the use of language accurately. This isn’t a criticism per se, but an acknowledgment that language evolves and part of the way it does that is acceptance that “incorrect” usage, once common enough, is seldom reversed.
> dissatisfied customers will be able to hone in on exactly where the problem is
This sounds like a truism, but it isn't. The client may know something is wrong, but good luck getting them to identify it. Sometimes the client will convince themselves that something is wrong when it isn't. There were people complaining about lag in WoW; the developers responded by cutting the latency number in half... except it wasn't cut in half, it was just measured as time-to-server rather than round trip. The complaints died out immediately and they were hailed as "very savvy developers who listen to their customers."
> You can literally shove facts in someone's face, and they won't admit to being wrong or misunderstand, and instead continue to argue against some points whose premise isn't even true.
It's called a strawman fallacy, and like all fallacies, it's used because the user is either intellectually lazy and can't be bothered to come up with a proper argument, or there isn't a proper argument and the person they're using it against is right.
If an honest alien says "we don't want to convert humans to our religion" that means you can have whatever religion you want. If a dishonest alien says it, it might mean "we don't want to convert humans because we are going to kill all humans", it's selectively true - they aren't going to convert us - and leaves us to imagine that we can have our own religion. But it's not the whole truth and we actually won't be able to[1].
An honest "no one will be forced to use Rust in the Kernel" would be exactly what it says. A paltering reading could be "we want to make Rust the only language used in the Kernel but you won't be forced to use it because you can quit". i.e. if you are "literally shoving facts in someone's face" and they don't change then they might think you are not telling the whole truth, or are simply lying about your goals.
> Rust has a stability guarantee since 1.0 in 2015. Any backwards incompatibilities are explicitly opt-in through the edition system, or fixing a compiler bug.
Unfortunately OP has a valid point regarding Rust's lack of commitment to backwards compatibility. Rust has a number of things that can break you that are not considered breaking changes. For example, implementing a trait (like Drop) on a type is a breaking change[1] that Rust does not consider to be breaking.
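A minimal sketch of the Drop case (names invented): code that moves a field out of a type compiles today, and stops compiling the moment the library adds a Drop impl.

```rust
struct Handle {
    name: String,
}

// Compiles today: with no Drop impl, the field can be moved out.
fn into_name(h: Handle) -> String {
    h.name
}

// If a later library version adds this impl, into_name() stops
// compiling with E0509 ("cannot move out of a type which implements
// the Drop trait"), even though nothing about it looks "breaking":
//
// impl Drop for Handle {
//     fn drop(&mut self) {
//         println!("closing {}", self.name);
//     }
// }

fn main() {
    let h = Handle { name: "eth0".to_string() };
    println!("{}", into_name(h));
}
```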
I think we're mixing 2 things here: language backward-compatibility, vs. standard practices about what semver means for Rust libraries. The former is way stronger than the latter.
> language backward-compatibility, vs. standard practices about what semver means
I've read and re-read this several times now and for the life of me I can't understand the hair you're trying to split here. The only reason to do semantic versioning is compatibility...
I assume that they mean that you can use Rust as a language without its standard library. This matters here since the Kernel does not use Rust's standard library as far as I know (only the core module).
I'm not aware of semver breakage in the language.
Another important aspect is that Semver is a social contract, not a mechanical guarantee. The Semver spec dedicates a lot of space to clarifying that it's about documented APIs and behaviors, not all visible behavior. Rust has a page where it documents its guarantees for libraries [0].
The failure mentioned above wasn't a case of the language changing behaviour, but rather of a trait impl added in the standard library conflicting with a trait impl in a third-party crate, causing the build breakage.
The Rust compiler/language has no notion of semver. Saying "Rust is unstable b/c semver blah blah" is a tad imprecise. Semver only matters in the context of judging API changes of a certain library (crate).
> The only reason to do semantic versioning is compatibility
Sure. But "compatibility" needs to be defined precisely. The definition used by the Rust crate ecosystem might be slightly looser than others, but I think it's disingenuous to pretend that other ecosystems don't have footnotes on what "breaking change" means.
> But "compatibility" needs to be defined precisely.
Compatibility is defined precisely! Your definition requires scare quotes. You want to define it "precisely" so that you can permit incompatible behavior. No one who cares about compatibility does that; it's just an excuse.
Look, other languages do this differently. Those of us using C99 booleans know we need to include a separate header to avoid colliding with the use of "bool" in pre-existing code, etc. And it sort of sucks, but it's a solved problem. I can build K&R code from 1979 on clang. Rust ignored the issue, steamrollered legacy code, and tried to sweep it under the rug with nonsense like this.
I think you are trying very hard to disagree on basic stuff that works very similarly across different language ecosystems, and (looking at other responses) that you're very angry. Disengaging.
I'll point out again that C, the poster child for ancient language technology, has been able to evolve its syntax and feature set with attention to not breaking legacy code. Excusing the lack of such attention via linguistic trickery about "defining compatibility precisely" does indeed kinda irk me. And disengaging doesn't win the argument.
The fundamental issue here is that any kind of inference can have issues on the edges. If you write code using fully qualified paths all the time, then this semver footgun can never occur.
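A self-contained sketch of that edge (all names hypothetical): adding one more applicable impl flips a previously unambiguous method call into an error, while the fully qualified form keeps compiling.

```rust
#![allow(dead_code)]

struct Sensor;

trait Celsius {
    fn read(&self) -> f64;
}

trait Fahrenheit {
    fn read(&self) -> f64;
}

impl Celsius for Sensor {
    fn read(&self) -> f64 {
        21.0
    }
}

// If a later release adds this impl, the s.read() call below stops
// compiling with E0034 ("multiple applicable items in scope"):
//
// impl Fahrenheit for Sensor {
//     fn read(&self) -> f64 {
//         69.8
//     }
// }

fn main() {
    let s = Sensor;
    println!("{}", s.read());          // fine today; ambiguous after the new impl
    println!("{}", Celsius::read(&s)); // fully qualified: unambiguous either way
}
```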
I was hit by a similar thing. Rust once caused regression failures in 5000+ packages due to incompatibility with older "time" packages [1]. It was considered okay. At that point, I don't care what they say about semver.
The comment you linked to explicitly shows that a maintainer does not consider this "okay" at all. T-libs-api made a mistake, the community got enraged, T-libs-api hasn't made such a mistake since. The fact that it happened sucks, but you can't argue that they didn't admit the failure.
The way you word that makes it sound like "the maintainers" and "T-libs-api" do not consider this "okay." Reading just above the linked comment, however, gives a very different impression of the situation:
> We discussed this regression in today's @rust-lang/libs-api team meeting, and agree there's nothing to change on Rust's end. Those repos that have an old version of time in a lockfile will need to update that.
You're reading an artifact of a point in time, before the change hit stable and the rest of the project found out about it. T-libs-api misunderstood the impact because in the past there had been situations that looked similar and were unproblematic to go ahead with, but weren't actually similar. There were follow-up conversations, both in public and in private, where the consensus arrived at was that this was not OK.
What I'm hearing is that the nature of the issue was recognized - that this was a breaking change; but that the magnitude of the change and the scale of the impact of that break was underestimated.
TBH that does not inspire confidence. I would expect that something claiming or aspiring to exhibit good engineering design would, as a matter of principle, avoid any breaking change of any magnitude in updates that are not intended to include breaking changes.
Thanks for clarifying. I took a look as well, and the very first reply confirms your opinion and that of the GP's parent. Plenty of downvotes and comments that come after criticizing the maintainers, "I am not sure how @rust-lang/libs-api can look at 5400 regressions and say "eh, that's fine"."
You are sincere. I believe this is not a cover-up but more of a misunderstanding. Think of it this way: many people coming to that GitHub thread don't know who the core Rust devs are, but they can clearly see the second commenter is involved. That comment denied this being a major issue and concluded the decision was made as a team. To the public, and perhaps some kernel devs, this may be interpreted as the official attitude.
The change itself was very reasonable. They only missed the mark on how that change was introduced. They should have waited with it until the next Rust edition, or at least held back a few releases to give users of the one affected package time to update.
The change was useful, fixing an inconsistency in a commonly used type. The downside was that it broke code in 1 package out of 100,000, and only broke a bit of useless code that was accidentally left in and didn't do anything. One package just needed to delete 6 characters.
Once the new version of Rust was released, they couldn't revert it without risk of breaking new code that may have started relying on the new behavior, so it was reasonable to stick with the one known problem than potentially introduce a bunch of new ones.
But that is not how backwards compatibility works. You do not break user space. And user space is pretty much out of your control! As a provider of a dependency you do not get to play such games with your users. At least not, when those users care about reliability.
That was a mistake and a breakdown in processes that wasn't identified early enough to mitigate the problem. That situation does not represent the self-imposed expectations on acceptable breakage; we simply failed to live up to them, and by the time it became clear that the change was problematic, it was too late to revert course, because reverting would itself have been a breaking change.
Yes: adding a trait impl to an existing type can cause inference failures. The Into trait fallback, where calling a.into() gives you back a, is particularly prone to it, and I've been working on a lint for it.
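For readers unfamiliar with the fallback being described, a minimal sketch of the pattern such a lint would flag (illustrative only; the lint itself is described upthread, not shown here):

```rust
fn main() {
    let id: u64 = 42;

    // Resolves through the blanket `impl<T> From<T> for T`: a no-op
    // "conversion" back to the receiver's own type, whose resolution
    // leans on inference and can tip over when new From impls appear.
    let same: u64 = id.into();

    // The lint's natural suggestion is to drop the call entirely:
    let plain: u64 = id;

    assert_eq!(same, plain);
}
```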
TBH that's a level of quality control that probably informs the Linux kernel dev's view of Rust reliability - it's a consideration when evaluating the risk of including that language.
Your comment misunderstands the entire point and risk assessment of what's being talked about.
It's about the overall stability and "contract" of the tooling/platform, not what the tooling can control under it. A great example was already given: It took clang 10 years to be "accepted."
It has nothing to do with the language or its overall characteristics, it's about stability.
Maintaining backward compatibility is hard. I am sympathetic. Nonetheless, if the Rust dev team thinks this is a big deal, then clarify it in the release notes, write a blog post, and make a commitment that regressions at this level won't happen again. So far there has been little official response to this event. The top comment in the thread I pointed to basically thinks this is nothing. It is probably too late to do anything for this specific issue, but in the future it would be good to explain and highlight even minor compatibility issues through the official channel. This will give people more confidence.
> Nonetheless, if the rust dev team think this is a big deal, then clarify in release notes, write a blog post and make a commitment that regression at this level won't happen again. So far, there is little official response to this event.
There was an effort to write such a blog post. I pushed for it. Due to personal reasons (between being offline for a month and then quitting my job) I didn't have the bandwidth to follow up on it. It's on my plate.
> The top comment in the thread I point to basically thinks this is nothing.
I'm in that thread. There are tons of comments by members of the project in that thread making your case.
> It is probably too late to do anything for this specific issue but it would be good to explain and highlight even minor compatibility issues through the official channel.
I've been working on a lint to preclude this specific kind of issue from ever happening again (by removing .into() calls that resolve to its receiver's type). I customized the diagnostic to tell people exactly what the solution is. Both of these things should have been in place before stabilization at the very least. That was a fuck up.
It’s hard for me to tell if you’re describing a breakdown in the process for evolving the language or the process for evolving the primary implementation.
Bugs happen, CI/CD pipelines are imperfect, we could always use more lint rules …
But there’s value in keeping the abstract language definition independent of any particular implementation.
> At that point, I don't care what they say about semver.
Semver, or any compatibility scheme, really, is going to have to obey this:
> it is important that this API be clear and precise
—SemVer
Any detectable change being considered breaking is just Hyrum's Law.
(I don't want to speak to this particular instance. It may well be that "I don't feel that this is adequately documented or well-known that Drop isn't considered part of the API" is valid, or arguments that it should be, etc.)
Implementing (or removing) Drop on a type is a breaking change for that type's users, not the language as a whole. And only if you actually write a trait that depends on types directly implementing Drop[0].
Linux breaks internal compatibility far more often than people add or remove Drop implementations from types. There is no stability guarantee for anything other than user-mode ABI.
[0] AFAIK there is code that actually does this, but it's stuff like gc_arena using this in its derive macro to forbid you from putting Drop directly on garbage-collectable types.
> Linux breaks internal compatibility far more often than people add or remove Drop implementations from types. There is no stability guarantee for anything other than user-mode ABI.
I think that's missing the point of the context though. When Linux breaks internal compatibility, that is something the maintainers have control over and can choose not to do. When it happens to the underlying infrastructure the kernel depends on, they don't have a choice in the matter.
Removing impl Drop is like removing a function from your C API (or removing any other trait impl): something library authors have to worry about to maintain backwards compatibility with older versions of their library. A surprising amount of Rust's complexity exists specifically because the language developers take this concern seriously, and try to make things easier for library devs. For example, people complain a lot about the orphan rules but they ensure adding a trait impl is never a breaking change.
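A small sketch of the orphan rule mentioned above: implementing a foreign trait for a foreign type is rejected outright, which is one way the language keeps "someone else added an impl" from colliding with yours. The newtype below is the standard workaround.

```rust
use std::fmt;

// Rejected with E0117 if uncommented: both Display and Vec<u8> are
// foreign to this crate, so this impl could collide with one added
// elsewhere.
//
// impl fmt::Display for Vec<u8> {
//     fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
//         write!(f, "{} bytes", self.len())
//     }
// }

// Legal: a local newtype owns the impl.
struct Bytes(Vec<u8>);

impl fmt::Display for Bytes {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        write!(f, "{} bytes", self.0.len())
    }
}

fn main() {
    println!("{}", Bytes(vec![1, 2, 3]));
}
```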
The meaning of this code has not changed since Rust 1.0. It wasn't a language change, nor even anything in the standard library. It's just a hack that the poster wanted to work, and they realized it won't work (it never worked).
This is the equivalent of a C user saying "I'm disappointed that replacing a function with a macro is a breaking change."
Rust has had actual changes that broke people's code. For example, any ambiguity in type inference is deliberately an error, because Rust doesn't want to silently change the meaning of users' code. At the same time, Rust doesn't promise it will never create a type-inference ambiguity, because that would make any changes to traits in the standard library almost impossible. It's a problem that happens rarely in practice, can be reliably detected, and is easy to fix when it happens, so Rust chose to exclude it from the stability promise. They've usually handled it well, except that recently they miscalculated: "only one package needed to change code, and they've already released a fix," but they forgot to give users enough time to update the package first.
As long as you compile with the version specified (e.g., `-std=c11`) I think backwards compatibility should be 100%. I've been able to compile codebases that are decades old with modern compilers with this.
In practice, C has a couple of significant pitfalls that I've read about.
First is if you compile with `-Werror -Wall` or similar; new compiler diagnostics can result in a build failing. That's easy enough to work around.
Second, nearly any decent-sized C program has undefined behavior, and new compilers may change their handling of undefined behavior. (E.g., they may add new optimizations that detect and exploit undefined behavior that was previously benign.) See, e.g., this post by cryptologist Daniel J. Bernstein: https://groups.google.com/g/boring-crypto/c/48qa1kWignU/m/o8...
While not entirely wrong, the UB issues are a bit exaggerated in my opinion. My C code from 20 years ago still works fine even when using modern compilers. In any case, our plan is to remove most UB, and there is quite good progress. Complaining that your build fails with -Werror seems a bit weird. If you do not want it, why explicitly request it with -Werror?
To be clear: I mean UB in the C standard. The cases where there is UB are mostly spelled out explicitly, so we can go through all the cases and define behavior. There might be cases where there is implicit UB, i.e. something is accidentally not specified, but this has always been fixed when noticed. It is not possible to remove all cases of UB though, but the plan is to add a memory safe mode where there is no UB similar to Rust.
The warning argument is silly. It just means that your code is not up to par with modern standards. -Wall is a moving goalpost, and it gets new warnings added with every release of a toolchain, because toolchain developers are trying to make your code more secure.
I mean, yeah, I said it was easy enough to work around. But it's an issue I've seen raised in discussions of C code maintenance. (The typical conclusion is that using `-Wall -Werror` is a mistake for long-lived, not-actively-developed code.) Apologies if I overstated the case.
Not even close to 100%, the reason that it feels like every major C codebase in industry is pinned to some ancient compiler version is because upgrading to a new toolchain is fraught. The fact that most Rust users are successfully tracking relatively recent versions of the toolchain is a testament to how stable Rust actually is in practice (an upgrade might take you a few minutes per million lines of code).
Try following your favourite distro's bug tracker during a GCC upgrade. Practically every update breaks some packages, sometimes fewer, sometimes more (especially when GCC changes its default flags).
Every language has breaking changes. The question is the frequency, not if it happens at all.
The C and C++ folks try very hard to minimize breakage, and so do the Rust folks. Rust is far closer to those two than other languages. I'm not willing to say that it's the same, because I do not know how to quantify it.
But you can still use gets() if you're using C89 or C99[1], so backwards compatibility is maintained.
Rust 2015 can still evolve (either by language changes or by std/core changes) and packages can be broken by simply upgrading the compiler version even if they're still targeting Rust 2015. There's a whole RFC[2] on what is and isn't considered a breaking change.
That's not what backwards compatibility means in this context. You're talking about how a compiler is backwards compatible. We're talking about the language itself, and upgrading from one versions of the language to the next.
Rust 2015 is not the same thing as C89, that is true.
> packages can be broken by simply upgrading the compiler version
This is theoretically true, but in practice, this rarely happens. Take the certainly-a-huge-mistake time issue discussed above. I actually got hit by that one, and it took me like ten minutes to even realize that it was the compiler's fault, because upgrading is generally so hassle free. The fix was also about five minutes worth of work. Yes, they should do better, but I find Rust upgrades to be the smoothest of any ecosystem I've ever dealt with, including C and C++ compilers.
(side note: I don't think I've ever thanked you for your many contributions to the Rust ecosystem, so let me do that now: thank you!)
> You're talking about how a compiler is backwards compatible. We're talking about the language itself, and upgrading from one versions of the language to the next.
That's part of the problem. Rust doesn't have a spec. The compiler is the spec. So I don't think we can separate the two in a meaningful way.
C (and C++) code breaks all the time between toolchain versions (I say "toolchain" to include compiler, assembler, linker, libc, etc.). Some common concerns are: headers that include other headers, internal-but-public-looking names, macros that don't work the way people think they do, unusual function-argument combinations, ...
Decades-old codebases tend to work because the toolchain explicitly hard-codes support for the ways they make assumptions not provided by any standard.
For the purposes of the Linux kernel, there's essentially a custom superset of C that is defined as "right" for the Linux kernel, and there are maintainers responsible for maintaining it.
While GCC with a few basic flags will, in general, produce a binary that cooperates with the kernel, kbuild loads all those flags for a reason.
Superset. ANSI/ISO C is not a good language to write a kernel in, because the standards are way more limiting than some people would think, and leave a lot to the implementation.
The backwards compatibility guarantee for C is "C99 compilers can compile C99 code". If they can't, that's a compiler bug. Same for other C standards.
Since Rust doesn't have a standard, the guarantee is "whatever the current version of the compiler can compile". To check if they broke anything they compile everything on crates.io (called a crater run).
But if you check the results of crater runs, almost every release some crates that compiled in the previous version stop compiling in the new version. But as long as the number of such breakages is not too large, they say "nothing is broken" and push the release.
Can you provide an example for the broken-crater claim? As far as I'm aware, Rust folks don't break compatibility that easily, and the one time that happened recently (an old version of the `time` crate getting broken by a compiler update), there were a lot of foul words thrown around and the maintainers learned their lesson. Are you sure you aren't talking about crates triggering UB or crates with unreliable tests that were broken anyway?
C99 isn't a compiler version. It's a standard. Many versions of GCC, Clang and other compilers can compile C99 code. If you update your compiler from gcc 14.1 to gcc 14.2, both versions can still compile standard code.
There is also a very high level of backwards compatibility between versions of ISO C, because a gigantic amount of code would need to be updated if there were a change. So such changes are made only for important reasons, or after a very long deprecation period.
Right, which is basically the opposite of what backwards incompatibility means. Imagine if GCC 14.2.0 was only guaranteed to be able to compile "C 14.2.0".
> "which is actively hostile to a second Rust compiler implementation" - except that isn't true?
Historically the Rust community has been extremely hostile towards gccrs. Many have claimed that the work would be detrimental to Rust as a language since it would split the language in two (despite gccrs constantly claiming they're not trying to do that). I'm not sure if it was an opinion shared by the core team, but if you just browse Reddit and Twitter you would immediately see a bunch of people being outright hostile towards gccrs. I was very happy to see that blog post where the Rust leadership stepped up to endorse it properly.
> Hi folks, because threads on gccrs have gotten derailed in the past, a reminder to please adhere to the subreddit rules by keeping criticism constructive and keeping things in perspective.
The LKML quote is alleging that the upstream language developers (as opposed to random users on Reddit) are opposed to the idea of multiple implementations, which is plainly false, as evidenced by the link to the official blog post celebrating gccrs. Ted Ts'o is speaking from ignorance here.
I think it’s more pointed towards people like me who do think that gccrs is harmful (I’m not a Rust compiler/language dev - just a random user of the language). I think multiple compiler backends are fine (eg huge fan of rustc_codegen_gcc) but having multiple frontends I think can only hurt the ecosystem looking at how C/C++ have played out vs other languages like Swift, Typescript etc that have retained a single frontend. In the face of rustc_codegen_gcc, I simply see no substantial value add of gccrs to the Rust ecosystem but I see a huge amount of risk in the long term.
> opposed to the idea of multiple implementations, which is plainly false, as evidenced by the link to the official blog post celebrating gccrs. Ted Ts'o is speaking from ignorance here.
Why use such strong words? Yes, there's clearly a misunderstanding here, but why do we need to use equally negative words toward them? Isn't it more interesting to discuss why they have this impression? Maybe something in the communication from the upstream language developers hasn't been clear enough? It's a blog post that is a few months old, so if that's the only signal, it's maybe not so strange that they've missed it.
Or maybe they are just actively lying because they have their own agenda. But I don't see how this kind of communication, assuming the worst of the other party, brings us any closer.
I'm not going to mince words here. Ted T'so should know better than to make these sorts of claims, and regardless of where he got the impression from, his confident assertion is trivially refutable, and it's not the job of the Rust project to police whatever incorrect source he's been reading, and they have demonstrably been supportive of the idea of multiple implementations. This wouldn't even be the first alternative compiler! Several Rust compiler contributors have their own compilers that they work on.
The kernel community should demand better from someone in such a position of utmost prominence.
For whatever it's worth, I did believe that some of the Rust team was very hostile towards gccrs, but that behavior has completely changed, and it seems like they're receiving a lot of support these days.
These are often the same classification of individual who tends to shift toward the viewpoint "change is progressing rapidly in an area that I don't understand, and this scares me." Anytime an expert has their expertise challenged or even threatened by a new technology, it is perfectly human to react defensively toward the perceived threat. Part of growth as a human is recognizing our biases and attempting to mitigate them, hopefully extending this into other areas of our lives as well. After all, NIMBYs probably started out with reasonable justifications for wanting to keep their communities the way they are: it's comfortable, it works, and they're significant contributors to the community. Any external "threat" to this becomes elevated to a moral crusade against invaders encroaching upon their land, when really they're tilting at windmills.
I think confronting the volunteers who maintain open-source software with arguments such as "you just do not want to learn new things" or "you are scared of change" is very unfair. IMHO, new ideas should prove themselves, not be pushed through because Google wants them or certain enthusiastic groups believe they are the future. If Rust is so much better, people should just build cool stuff, and then it will be successful anyway.
Overall, this whole situation seems entirely weird to me. All this stuff, the Unix, Linux, and C ecosystem, was built by C programmers and maintained for decades mostly voluntarily, while most of the industry pushed in other directions (with a gigantic influx of money). It is completely amazing that Linux became so successful against all the odds. Certainly it also then had a lot of industry support, but I used it before most of this and witnessed all the development. But somehow, C programmers are now suddenly portrayed as the evil gatekeepers, not stepping aside fast enough, because some want to see change. In the past, people wanting to see something new in the open-source community would need to convince the community by building better things, not by pushing aggressively into existing projects.
I believe the Rust for Linux project was started by a Linux guy, rather than a Rust guy, and many of the Rust for Linux maintainers have come at this from a perspective of "we are Linux maintainers who want to use Rust" rather than "we are Rust users who want our code to be in Linux".
I think it's important to be wary of simplistic narratives (such as "C vs Rust"). Maintaining a complex piece of software comes with tradeoffs and compromises, and the fewer languages you have to worry about the better. On the other hand, the Asahi Linux team have been quite explicit that without Rust, they wouldn't have achieved a fraction of what they have. So clearly there is a lot of value in RfL for Linux as a whole, if implemented well. And that value is reflected in the decision from Linus that RfL should be supported, at least for now.
> many of the Rust for Linux maintainers have come at this from a perspective of "we are Linux maintainers who want to use Rust" rather than "we are Rust users who want our code to be in Linux".
This might be true, but do you have any actual quantifiable evidence for it? Because FWIW, from what I as an outsider see (mainly in threads like this), all the drama looks very much like "we are Rust users who want our code to be in Linux".
It is entirely unclear to me where the value actually is. It seems Google is funding it for some reason. And some people clearly have a lot of opinions that this is "the future". People had similarly strong opinions about various other things in the past.
It's worth reading Asahi Lina's posts about writing a driver in Linux - she is very explicit that what they've achieved would not have been possible without Rust.
In fairness, this is one team working on one project, but if they're attributing much of their success to Rust, it's probably worth listening to and understanding why, particularly as I don't believe they were particularly evangelistic about Rust before this project.
I have no idea about the Google funding, but Marcan's blog post is very explicit that they do not have any corporate sponsorship. If you believe that to be untrue, please explain your reasoning rather than spreading unsubstantiated rumours.
This isn't the first time a new language is proposed for the kernel though.
At some point there was some brief discussion for C++ in the kernel and that was essentially immediately killed by Linus. And he was essentially right.
Yeah I certainly don’t want to mischaracterize anyone here and I attempted to communicate how this is really a knee-jerk, human reaction to something new making inroads into a space people have extensive expertise in. New ideas additionally shouldn’t be derided based upon the poor behavior of some in the community.
Fair, I acknowledge I may have misrepresented this group who are against the Rust community as not being experts in this space; they certainly are. Rust doesn't have to be the answer, but if we treat others (namely Rust supporters) and their solutions as dead-on-arrival because they're implemented in a technology we're not entirely familiar with, how can we get to a point where we're solving difficult problems? Especially if we create an unwelcoming space for contribution?
May be a poor example, it’s what came to mind initially. I don’t think the end results are at all the same but I think the initial emotions around why you may balk at something new entering your community have parallels to the topic at hand.
> Any backwards incompatibilities are explicitly opt-in through the edition system, or fixing a compiler bug.
This is a very persistent myth, but it’s wrong. Adding any public method to any impl can break BC (because its name might conflict with a user-defined method in a trait), and the Rust project adds methods to standard library impls all the time.
This is true, strictly speaking, but rarely causes problems. Inherent methods are prioritized over trait methods, so this only causes problems if two traits suddenly define a single method, and the method is invoked in an ambiguous context.
This is a rare situation, and std strives to prevent it. For example, in [1], a certain trait method was called extend_one instead of push for this reason. Crater runs are also used to make sure the breakage is as rare as T-libs-api expected. The Linux kernel in particular only uses core and not std, which makes this even more unlikely.
That's just not true. If the user has defined the method on the struct and a trait also has a method with that name, the struct's impl is used. Multiple traits can have methods with the same name, too.
My comment might have been technically wrong as originally stated; I’ve since edited to try to correct/clarify.
What I really meant is the case where a method is added to a standard struct impl that conflicts with a user-defined trait.
For example, you might have implemented some trait OptionExt on Option with a method called foo. If a method called foo is later added to Option in the standard library, it will conflict.
You are always free to use fully-qualified paths to protect yourself from any change in your dependencies (including std) that would break inference (by making more than one method resolve).
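To make this concrete, here's a minimal sketch (all names hypothetical, e.g. OptionExt and describe) of the hazard and the fully-qualified escape hatch:

    trait OptionExt<T> {
        fn describe(&self) -> &'static str;
    }

    impl<T> OptionExt<T> for Option<T> {
        fn describe(&self) -> &'static str {
            "extension method"
        }
    }

    fn main() {
        let x: Option<i32> = Some(1);
        // If a later std release added an inherent Option::describe,
        // this call would silently start resolving to it, since
        // inherent methods take priority over trait methods:
        println!("{}", x.describe());
        // Fully-qualified syntax pins resolution to the trait and is
        // immune to such additions:
        println!("{}", OptionExt::describe(&x));
    }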
> which is actively hostile to a second Rust compiler implementation
Which is hilarious since Linux itself was actively hostile to the idea of a second C compiler supporting it. Just getting Linux to support Clang instead of only GCC was a monumental task that almost certainly only happened because Android forced it to happen.
Putting in the work is one thing, which is what the Rust-in-Linux people are also doing, but there's also the political requirement to force maintainers to accept it. Android was big enough, and happy enough to fork seeing as it had already done that before, that it forced a lot of hands with top-down mandates.
Rust, despite having Linus' blessing to be in the kernel, is still getting rejected simply because it's Rust, completely unrelated to any technical merits of the code itself.
The "thin blue line" is a term that typically refers to the concept of the police as the line between law-and-order and chaos in society.[1] The "blue" in "thin blue line" refers to the blue color of the uniforms of many police departments.
It's a motto used by American law enforcement to justify extrajudicial punishment. Since they are the "thin blue line" that separates the public from anarchy, they are justified in acting independently to "protect" us when judges and juries do not "cooperate".
Not just extrajudicial punishment, but overlooking corrupt acts and crimes from fellow officers. That it's more important to maintain the 'brotherhood' than to arrest an officer caught driving home drunk.
Directly, "the thin blue line" expresses the idea that the police are what separates society from chaos.
It doesn't inherently suggest police are justified in acting outside the law themselves, though, of course, various people have suggested this (interestingly, from both a pro-police and anti-police perspective).
It seems obvious to me that the post was using this phrase in the sense of being a thin shield from chaos.
That is a very strange take. The phrase isn't American and has no negative connotation. It has nothing to do with "extrajudicial punishment". It simply refers to the (obvious) fact that what separates societies from anarchy is the "thin blue line" of law enforcement.
Rowan Atkinson had a sitcom set in a London police station in the 90s called "The Thin Blue Line". Are you under the impression he was dogwhistling about extrajudicial violence?
This is what really confused me about the article. I read the mailing list post and had no idea what was controversial about thin blue line. In fact, I thought most of that post was fairly reasonable.
I'd never heard of the extrajudicial punishment aspect of the phrase (though I had heard the phrase itself) and it didn't show up when I googled, but I'm not American, so maybe there's some cultural differences.
"Thin blue line" is a popular phrase of the so called "American culture war". During the heyday of the Black Lives Matter movement, it was used as a self-identification by those who did not agree with criticisms of the nations policing and justice systems. A closely related symbol is the Punisher[0] skull from Marvel comics.
All in all, this could just be another instance of the "culture war" inflaming every other minor disagreement, with Ted playfully using the phrase and Marcan misinterpreting it. Or it could be Ted slipping up and revealing his politics. From what I know about Marcan and what can be inferred from his post, he does seem like someone the alt-right would persecute.
Wow, hadn't heard of the punisher skull association either! It seems that it hasn't really traveled that much outside of America.
I had a look, and it seems that Ted Ts'o is American, so I guess we should assume he understands the cultural significance of the phrase (even though I didn't).
All the extrajudicial stuff is pure political and ideological wank by a subset of ideological extremists. Pay no attention to any of it. It's an attempt to redefine the term for narrative creation purposes.
In the US, this is a reference to the belief that members of law enforcement should be loyal first to other members of law enforcement and only secondarily to the law. Or at least that is how I have always understood it.
It seems obvious that that’s not what Ted intended it to mean, since it wouldn’t even make sense in this context (the debate doesn’t really seem to be about whether maintainers should be loyal to other maintainers).
A more charitable interpretation would be “we’re the only line of defense protecting something good and valuable from the outside world, so people should give significant weight to our opinions and decisions”. Which, to be clear, I would still mostly disagree with WRT the police, but it at least doesn’t explicitly endorse corruption.
The thin blue line comes from the thin red line, where a line of British redcoats held back a heavy cavalry charge in the Crimean War. I've always taken it to mean that police officers consider themselves soldiers holding the last line of defence against wild enemies. Which is itself a controversial and probably unhelpful way to think about your job as a police officer.
There are many ways to state that without invoking corruption. I think Ted is telling the truth of who he is by choosing that phrase intentionally - we aren't talking about an idiot who just says stuff, he's a smart guy.
Given that "invoking corruption" is neither the plain meaning of those words, nor does it even make sense in this context, I don't think it's reasonable to claim Ted did so.
Ted Tso is an American, he was born in California, did his schooling in the US, and has worked here most (all?) of his career. As such he can be expected to know that "the thin blue line" is an idiom that carries with it a lot of connotation.
It's perfectly reasonable to assume he was aware of the implications of his words and chose to use them anyway.
I'm American, I was born in Arizona, I did my schooling in the US, and I have worked in the US for all of my career. I disagree with your assertion that "thin blue line" necessarily implies support for corruption.
And by the way, so does Wikipedia: https://en.wikipedia.org/wiki/Thin_blue_line doesn't mention this interpretation at all. The closest thing is this sentence, which is really not saying the same thing at all, and at any rate only presenting it as something "critics argue", rather than the settled meaning of the phrase.
> Critics argue that the "thin blue line" represents an "us versus them" mindset that heightens tensions between officers and citizens and negatively influences police-community interactions by setting police apart from society at large.
And yet I've never seen that phrase used other than when cops are defending their colleague who is on video murdering/raping/beating someone innocent, or by those calling for reform who are criticizing the cops covering for each other's crimes.
Even I have seen it used in other senses by Americans, and I've never been to America. AFAICT it has only acquired that sense, at least to the extent it currently has, after #BLM. Might be an age thing, that most of your cultural impressions are of a more recent date than the majority of mine? (And, say, Ted T'so's.)
And it's in a context where some group of people with special power is acting in bad faith to avoid having to follow the rules, and setting up "us vs them" arguments to do so!
I think you might be confusing the "thin blue line" concept with "blue/all lives matter" in this case; the thin blue line is neither new nor newly popular with BLM.
Certainly more popular since then; probably swept along by "blue lives matter". Have you seen that black-and-blue version of the American flag, with, what is it, six or seven blue stripes (or lines)? How old is that?
https://en.wikipedia.org/wiki/Thin_blue_line TLDR is it's the idea that the police are the one thing stopping society from instantly dissolving into chaos so they shouldn't be questioned (even when they kneel on someone's neck until they die)
> - "an upstream language community which refuses to make any kind of backwards compatibility guarantees" -> Rust has a stability guarantee since 1.0 in 2015. Any backwards incompatibilities are explicitly opt-in through the edition system, or fixing a compiler bug.
The most charitable interpretation I can imagine is that the Rust-in-Linux project needs specific nightly features, and those don't get stability guarantees. But I think this is still pretty unfair to complain about; my impression is there's a lot of appetite on the Rust side to get those stabilized.
I also think...
> we know, through very bitter experience, that 95+% of the time, once the code is accepted, the engineers which contribute the code will disappear, never to be seen again.
...that while there's truth in this, there's also a large extent to which it's a self-fulfilling prophecy. Someone might want to stick it out to get their work into mainline once, but then take a look at the process once it's in the rearview mirror and say never again.
...and:
> Instead of complaining about maintainers for who are unreasonably caring about these things, when they are desparately under-resourced to do as good of a job as they industry demands, how about meeting us half-way and helping us with these sort of long-term code health issues?
It's really hard for me to not see "let's actually write down the contract for these functions, ideally via the type system" as doing exactly that. Which seems to me to be the central idea Ted Ts'o was ranting about in that infamous video.
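For what it's worth, here's a minimal sketch of what "writing down the contract via the type system" can look like; count_chunks is a made-up example, not anything from the kernel thread:

    use std::num::NonZeroUsize;

    // A C version carries its contract in a comment:
    //   /* buf must not be NULL; chunk_size must be > 0 */
    //   size_t count_chunks(const unsigned char *buf, size_t len,
    //                       size_t chunk_size);
    // The Rust signature makes both rules machine-checked: a &[u8]
    // cannot be null (and knows its own length), and a NonZeroUsize
    // cannot be zero.
    fn count_chunks(buf: &[u8], chunk_size: NonZeroUsize) -> usize {
        buf.chunks(chunk_size.get()).count()
    }

    fn main() {
        let chunk = NonZeroUsize::new(4).expect("nonzero");
        assert_eq!(count_chunks(&[0u8; 10], chunk), 3); // 4 + 4 + 2
    }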
If comments as benign as "thin blue line" cause fragile entryists/activists to flee, I say Ted and the kernel team are doing the right thing. Projects as critical as the Linux kernel shouldn't be battlegrounds for the grievance of the week, nor should they be platforms for proselytizing. Marcan and others like him leave long paths of destruction in their wake. Lots of projects have been turned upside down by the drama they seem to bring with them everywhere. The salient point is that contributors need to be more than "drive by" submitters for their pet projects. This isn't specific to Rust in the kernel; look at how much of an uphill battle bcachefs was/is.
I didn't even know what the whole issue with the "thin blue line" comment was until I read this thread. I was never under the impression "thin blue line" was about corruption or brutality, I think people are conflating "thin blue line" with "blue lives matter", which is an entirely different subject.
Quite wild to see this being downvoted: by downvoting, surely one implies the inverse of the post to be the truth, i.e. that projects such as the Linux kernel should be battlegrounds for the grievance of the week, should be platforms for proselytizing, and so forth.
Very strange to see little to no empathy for kernel maintainers in this situation.
Look, I don't know what to say without just assuming you're approaching this discussion in bad faith.
Saying people "compulsively downvote" the stuff above is already a strong claim that you have no way to substantiate. I think more broadly what you're claiming is that the people downvoting you and anonfordays are emotional and doing so out of political zealotry, and... again, that a pretty strong claim.
People can downvote a post not because they strongly disagree with its claims, but because they strongly dislike its inflammatory tone ("fragile entryist", "Marcan and others like him leave long paths of destruction in their wake", etc).
People who strongly disagree with a post don't necessarily believe the exact opposite of its claims. They can disagree with some of the claims and agree with others, or disagree with the very framing of the post.
If I say "we should outlaw all guns because gun crimes are awful" and you disagree, that doesn't mean you think gun crimes are great.
> Rust has a stability guarantee since 1.0 in 2015. Any backwards incompatibilities are explicitly opt-in through the edition system, or fixing a compiler bug.
The community is more than just the language and compiler vendor(s). It's everyone using the language, with particular emphasis on the developers of essential libraries and tools that those users use and on which they're reliant.
In this sense, based on every time I've attempted to use Rust (even after 1.0), Ts'o's remark ain't inaccurate from what I can tell. If I had a nickel for every Rust library I've seen that claims to only support Rust Nightly, I'd have... well, a lot of nickels. Same with Rust libraries not caring much about backward-compatibility; like yeah, I get it during pre-1.0, or while hardly anyone's using it, but at some point people are using it and you are signaling that your library's "released", and compatibility-breaking changes after that point make things painful for downstream users.
> Here's the maintainer on the gccrs project (a second Rust compiler implementation), posting on the official Rust Blog
Same deal here. The Rust developers might be welcoming of additional implementations, but the broader community might not be. I don't have enough information to assess whether the Rust community is "actively hostile" to a GCC-based Rust implementation, but from what I can tell there's little enthusiasm about it; the mainstream assumption seems to be that "Rust" and its LLVM-based reference compiler are one and the same. Maybe (hopefully) that'll change.
----
The bigger irony here, in any case, is that the Linux community has both of these very same problems:
- While the kernel itself has strict backwards-compatibility guarantees for applications, the libraries those applications use (including absolutely critical ones like glibc) very much do not. The ha-ha-only-serious observation in the Linux gaming community is that - thanks to Wine/Proton - the Windows API is the most stable ABI for Linux applications. Yeah, a lot of these issues are addressable with containerization, or by static compilation, but it's annoying that either is necessary for Linux-native applications to work on old and new distros alike.
- As marcan alludes to in the article, the Linux community is at least antipathetic (if not "actively hostile") to Linux-compatible kernels that are not Linux, be they forks of Linux (like Android) or independent projects that support running Linux applications (WSL 1/2, FreeBSD, some illumos distros, etc.). The expectation is that things be upstreamed into "the" Linux, and the norms around Linux development make out-of-tree modules less-than-practical. This is of course for good reason (namely: to encourage developers to contribute back to upstream Linux instead of working in silos), but it has its downsides - as marcan experienced firsthand.
I am not a C person, or a kernel level person, I just watch this from the sideline to learn something every now and then (and for the drama). But this exchange is really stunning to me. It seems so blatantly obvious to me that systematically documenting (in code!) and automatically checking semantic information that is required to correctly use an API is a massive win. But I have encountered this type of resistance (by very smart developers building large systems) in my own much smaller and more trivial context. To some degree, the approach seems to be: "If I never write down what I mean precisely, I won't have to explain why I changed things." A more charitable reading of the resistance is: Adding a new place where the semantics are written down (code, documentation and now type system) gives one more way in which they can be out of sync or subtly inconsistent or overly restrictive.
But yeah, my intuitive reaction to the snippet above is just incredulity at the extreme resistance to precisely encoding your assumptions.
Your charitable reading is too charitable. One of the benefits of using types to help guarantee properties of programs (e.g. invariants) is that types do not get out of sync with the code, because they are part of the code, unlike documentation. The language implementation (e.g. the compiler) automatically checks that the types continue to match the rest of the code, in order to catch problems as early as possible.
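As one illustration of "types are part of the code", here's a sketch of the common typestate pattern (hypothetical names, not any actual kernel API): a call-ordering invariant that documentation could only describe becomes something the compiler rejects outright:

    use std::marker::PhantomData;

    struct Uninitialized;
    struct Initialized;

    struct Device<State> {
        _state: PhantomData<State>,
    }

    impl Device<Uninitialized> {
        fn new() -> Self {
            Device { _state: PhantomData }
        }
        // Consumes the uninitialized device, returns an initialized one.
        fn init(self) -> Device<Initialized> {
            Device { _state: PhantomData }
        }
    }

    impl Device<Initialized> {
        // Only exists on initialized devices; the compiler enforces
        // the ordering a comment could only suggest.
        fn transmit(&self, _bytes: &[u8]) {}
    }

    fn main() {
        let dev = Device::new().init();
        dev.transmit(b"hello");
        // Device::new().transmit(b"oops"); // compile error: no such method
    }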
I'm not a kernel developer, and have never done anything of the sort either. But I think the argument is that if they have two versions of something (the C version plus the Rust bindings), the logic/behavior/"semantics" of the C version would need to be encoded into the Rust types, and if a C-only developer changes just the C version, how are they supposed to update the Rust bindings if they don't want to write Rust?
At least that's my understanding from the outside, someone please do correct me if wrong.
Rust developers were saying it would be their job to do this. But then someone said Linus rejected something because it broke Rust. GKH backed the Rust developers and said that was an exception, not the rule, but didn't know Linus's stance for sure.
Then Linus chimed in because of one of Hector's replies, but at the time of my reading had not clarified what his actual stance is here.
Yeah it's not an easy discussion for sure, but he has to say something.
At the rate we're going here the existing kernel devs will alienate any capable new blood, and Linux will eventually become Google Linux(TM) as the old guard goes into retirement and the only possible way forward is through money.
You’re not wrong (in that I was insinuating something like that), but I’ll point out it’s an almost equally big assumption that we’re somehow going to find a trove of capable developers interested in devoting their careers to coding in ancient versions of C.
Do you really think there are no young people wanting to work on an operating system written in C? I'm very skeptical that all young people interested in operating systems see Rust as the future. I personally feel it's the other way around, it's Google and companies like that who really want Rust in Linux, the young kernel devs are a minority.
It's not that people think there are no young people wanting to work in C; it's that the number of competent programmers who want to use C, or do use C, is decreasing every year. That has been the trend for quite a while now.
So there will presumably be fewer and fewer programmers, young or old, that want to work in C.
C is one of the most entrenched and still-important languages in the world, so it probably has more staying power than Fortran, COBOL, etc. So the timeline is anybody's guess, but the trajectory is pretty clear.
There are a lot of languages that people prefer to C which aren't well-suited to OS programming (golang, Java) but Rust is one that can do the same job as C, and is increasingly popular, and famously well-loved by its users.
There's no guarantee that Rust will work out for Linux. Looks unlikely, to me, actually. But I think it's pretty clear that Linux will face a dwindling talent pool if the nebulous powers that actually control it collectively reject everything that is not C.
> how are they supposed to proceed with updating the Rust bindings if they don't want to write Rust?
If I've interpreted it correctly (and probably not, given the arguments), Linus won't accept merge requests if they break the Rust code, so the maintainer would need to reach out to the Rust for Linux (or someone else) to fix it if they didn't want to themselves.
And some lead maintainers don't want to have to do that, so said no Rust in their subsystem.
Which is a moot point because the agreement right now is that Rust code is allowed to break, so the C developer in question can just ignore Rust, and a Rust person will take care of it for them.
> Then I think we need a clear statement from Linus how he will be working. If he is build testing rust or not.
> Without that I don't think the Rust team should be saying "any changes on the C side rests entirely on the Rust side's shoulders".
> It is clearly not the process if Linus is build testing rust and rejecting PRs that fail to build.
For clarity, tree-wide fixes for C in the kernel are automated via Coccinelle. Coccinelle for Rust is still unstable and broken, which is why manual fixes are required. Does this help explain the burden that C developers are facing because of Rust, and how it comes on top of their existing workloads?
> Which is a moot point because the agreement right now is that Rust code is allowed to break, so the C developer in question can just ignore Rust
So the argument that even semantics encoded in the Rust types can be out of date compared to the actual code is a real thing? I read that somewhere else here in the comments, but didn't understand how the types could ever be out of date; this would explain that argument.
That's exactly what "types get out of date" looks like. I'm not sure what you're familiar with, but imagine in Python that a new version of a library is released where a function now has an extra required argument.
As I understand it everything Rust is seen as "optional", so a CONFIG_RUST=n build that succeeds means a-OK, then some Rust person will do a CONFIG_RUST=y build, see it's broken, fix it, and submit a patch.
I may be wrong, but that's how I understood it, but who knows how Linus will handle any given situation. ¯\_(ツ)_/¯
Yes, but generic code complicates the picture. The things I saw were like: The documentation says you need a number but actually all you need is for the + operator to be defined. So if your interface only accepts numbers it is unnecessarily restrictive.
Conversely some codepath might use * but that is not in the interface, so your generic code works for numbers but fails for other types that should work.
> Yes, but generic code complicates the picture. The things I saw were like: The documentation says you need a number but actually all you need is for the + operator to be defined. So if your interface only accepts numbers it is unnecessarily restrictive.
if you really need a number, why not use a type specifically aligned to that (something like f32|f64|i32|i64 etc...) instead of relying on + operator definition?
> Conversely some codepath might use * but that is not in the interface, so your generic code works for numbers but fails for other types that should work.
do we agree that if it's not in the interface you are not supposed to use it? conversely if you want to use it, the interface has to be extended?
For the first case you have it the wrong way around. My generic code would work on things that are not numbers but I prevent you from calling it because I didn't anticipate that there would be things you can add that are not numbers. (Better example: require an array when you really only need an iterable).
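A tiny sketch of that last point (helper names made up): the restrictive signature demands a concrete container, while the looser one states the real requirement and accepts anything iterable:

    // Overly restrictive: demands a Vec even though it only iterates.
    fn first_positive_vec(xs: Vec<i32>) -> Option<i32> {
        xs.into_iter().find(|&x| x > 0)
    }

    // Same logic, but the signature states the actual requirement.
    fn first_positive<I: IntoIterator<Item = i32>>(xs: I) -> Option<i32> {
        xs.into_iter().find(|&x| x > 0)
    }

    fn main() {
        assert_eq!(first_positive_vec(vec![-1, 2, 3]), Some(2));
        assert_eq!(first_positive(vec![-1, 2, 3]), Some(2));
        assert_eq!(first_positive(-3..5), Some(1)); // a range works too
    }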
It's practically impossible to document all your assumptions in the type system. Attempting to do so results in code that is harder to read and write.
You have a choice between code that statically asserts all assumptions in the type system but doesn't exist, is slow, or a pain to work with, and code that is beautiful, obvious, performant, but does contain the occasional bug.
I am not against static safety, but there are trade offs. And types are often not the best way to achieve static safety.
> And types are often not the best way to achieve static safety.
That’s a sort of weird statement to make without reference to any particular programming language. Types are an amazing way to achieve static safety.
The question of how much safety you can reasonably achieve using types varies wildly between languages. C’s types are pretty useless for lots of reasons - like the fact that all C pointers are nullable. But moving from C to C++ to Rust to Haskell to ADA gives you ever more compile time expressivity. That type expressivity directly translates into reduced bug density. I’ve been writing rust for years, and I’m still blown away by how often my code works correctly the first time I run it. Yesterday the typescript compiler (technically esbuild) caught an infinite loop in my code at compile time. Wow!
I’d agree that every language has a sweet spot. Most languages let you do backflips in your code to get a little more control at compile time at the expense of readability. For example, C has an endless list of obscure __compiler_directives that do all sorts of things. Rust has types like NonZeroUsize - which seem like a good idea until you try it out. It’s a good idea, but the ergonomics are horrible.
But types can - and will - take you incredibly far. Structs are a large part of what separates C from assembler. And types are what separates rust from C. Like sum types. Just amazing.
Encoding assumptions and invariants in the type system is a spectrum. Rust, by its very nature, places you quite far along that spectrum immediately. One should consider whether the correctness achieved by this is worth the extra work. However, if there is one place where correctness is paramount, surely it's the Linux kernel.
> [..]Attempting to do so results in code that is harder to read and write.
> You have a choice between code that statically asserts all assumptions in the type system but doesn't exist, is slow, or a pain to work with, and code that is beautiful, obvious, performant, but does contain the occasional bug.
I don't think you are expressing objective truth, this is all rather subjective. I find code that encodes many assumptions in the type system beautiful and obvious. In part this is due to familiarity, of course something like this will seem inscrutable to someone who doesn't know Rust, in the same way that C looks inscrutable to someone who doesn't know any programming.
> Encoding assumptions and invariants in the type system is a spectrum. Rust, by it's very nature, places you quite far along that spectrum immediately.
Compared to, say, dependent type systems, Rust really isn't that far along. The Linux kernel has lots of static analyzers, and then auxiliary typedefs, Sparse, and sanitizers cover a significant area of checks in an ad-hoc way. All Rust does is formalize them and bring them together.
And getting Rust into the kernel slowly, subsystem by subsystem, means that the formalization process doesn't have to be disruptive and all-or-nothing.
But if the info is something the user of your code needs in order to interface correctly, the point that you can't document everything is moot. You already have to document it in the documentation anyway.
> It makes sense to be extremely adversarial about accepting code because they're on the hook for maintaining it after that. They have maximum leverage at review time, and 0 leverage after.
I don't follow. The one with zero leverage is the contributor, no? They have to beg and plead with the maintainers to get anything done. Whereas the maintainers can yank code out at any time, at least before when the code makes it into an official stable release. (Which they can control - if they're not sure, they can disable the code to delay the release as long as they want.)
That's useful context because as a complete laymen I thought his message was largely reasonable (albeit I am not unsympathetic to the frustration of being on the other side)!
Here is what I learned the hard way: the request sounds reasonable. And that doesn’t matter (sucks, I know.)
Here is the only thing that matters in the end (I learned this an even harder way: I really worked the way the R4L people approach this, and was bitten by counter-examples left, right, and center): the Linux kernel has to work. This is even more important than knowing why it works. There is gray area, and you only move forward by rejecting anything that doesn't have at least ten years of this kind of backwards-compatible commitment. All of it. Wholesale. (And yes, this blatantly and callously disregards many good efforts, sounding like the tenuous and entitled claim "not good enough".)
But it’s the only thing that has a good chance of working.
Saying that gravity is a thing is not the same attitude as liking that everyone is subject to gravity. But hoping that gravity just goes away this once is wishful thinking of the least productive kind.
Rust is not "sufficiently committed" to backwards compatibility. Firstly, too young to know for sure and the burden is solely on "the rust community" here. (Yes, that sucks. Been there.)
Secondly, there were changes (other posters mentioned "Drop") and how cargo is treated that counter indicate this.
Rust can prove all the haters wrong. They will then be even more revered that Linux and Debian. But they have to prove this. That is a time consuming slog. With destructive friction all the way.
Can I say that I was immediately put off by the author conflating the "thin blue line" quote with a political orientation?
The full quote (from the article) being: "Later in that thread, another major maintainer unironically stated “We are the ‘thin blue line’”, and nobody cared, which just further confirmed to me that I don’t want to have anything to do with them."
The way I read it, "thin blue line" is being used as a figure of speech. I get what they are referring to, and I don't see an endorsement. It doesn't necessarily mean a right-wing affiliation or sympathy.
To me it seems like the author is projecting a right-wing affiliation and a political connotation where there is none (at least not officially, as far as I can see on https://thunk.org/tytso/) in order to discredit Theodore Ts'o. Which is a low point, because attacking Ts'o on a personal level means Martin is out of ammunition to back his arguments.
But then again, Hector Martin is the same person that thought brigading and shaming on social media was an acceptable approach to collaboration in the open source space:
"If shaming on social media does not work, then tell me what does, because I'm out of ideas."
To me, from the outside, Hector Martin looks like a technically talented but otherwise toxic person who is trying to use public shaming on social media and ranting on his blog as tactics to impose his will on the otherwise democratic process of developing the Linux kernel. And then, on top of everything, he behaves like a victim.
It's a good thing they are resigning, in my opinion.
Thank you for pointing this out—willfully, uncharitably misinterpreting “thin blue line” as used by Ts'o demonstrates a severe lack of empathy for people in his position.
Jumping to conclusions about police brutality and so forth (as many here in the comments are doing) is very frustrating to see, because, in context, the intent of his phrasing is very clear to anyone who doesn't needlessly infer Contemporary Political Nonsense in literally everything they read.
Perhaps contributors should have to go through a process of learning the codebase first, submitting increasingly complex fixes before jumping to really complex requests.
It can be hard when solving your own acute issue - doing so doesn't mean it is the only fix or the one the project should accept.
Even if it's beneath someone's talent to have to do it, it is an exercise of community building.
> This is par for the course I guess, and what exhausts folks like marcan. I wouldn't want to work with someone like Ted Tso'o, who clearly has a penchant for flame wars and isn't interested in being truthful.
I am acquainted with Ted via the open source community, we have each other on multiple social media networks, and I think he's a really great person. That said, I also recognize when he gets into flame wars with other people in the open source social circles, and sometimes those other people are also friends or acquaintances.
I can think of many times Ted was overly hyperbolic, but he was ultimately correct. Here is the part of the Linux project I don't like sometimes, which was described well in this recent thread: being correct, or at least subjectively correct by having extremely persuasive arguments, yet being toxic... is still toxic and unacceptable. There are a bazillion geniuses out there, and being smart is not good enough anymore in the open source world; one has to overcome those toxic "on the spectrum" tendencies or whatever, and be polite while making reasonable points. This policy extends to conduct as well as words written in email/chat threads. Ted is one of those, alongside Linus himself, who has in the past indulged in a bit of shady conduct or remarks, but their arguments are usually compelling.
I personally think of these threads in a way related to the calculus of infinitesimals, using the "standard part" function to zero away hyperbolic remarks the same way the math function zeros away infinitesimals from real numbers, sort of leaving only the real remarks. This is a problem, because it's people like me, arguably the reasonable people, who through our silence enable these kinds of behaviours.
I personally think Ted is more right than wrong, most of the time. We do disagree sometimes, though; for example, Ted hates the new MiB/KiB system of base-2 units, and for whatever reason likes the previous, more ambiguous system of confusingly mixed base-10/base-2 units of MB/Mb/mb/KB/Kb/kb... and I totally get his argument that a new standard makes something already confusing even more confusing, or something like that. Meh...
> Ted hates the new MiB/KiB system of base-2 units, and for whatever reason likes the previous, more ambiguous system of confusingly mixed base-10/base-2 units of MB/Mb/mb/KB/Kb/kb
Here's my best argument for the binary prefixes: Say you have a cryptographic cipher algorithm that processes 1 byte per clock cycle. Your CPU is 4 GHz. At what rate can your algorithm process data? It's 4 GB/s, not 4 GiB/s.
This stuff happens in telecom all the time. You have DSL and coaxial network connections quantified in bits per second per hertz. If you have megahertz of bandwidth at your disposal, then you have megabits per second of data transfer - not mebibits per second.
Another one: You buy a 16 GB (real GB) flash drive. You have 16 GiB of RAM. Oops, you can't dump your RAM to flash to hibernate, because 16 GiB > 16 GB so it won't fit.
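For concreteness, the arithmetic behind that hibernate example (a quick sanity check, nothing more):

    fn main() {
        let ram: u64 = 16 * 1024 * 1024 * 1024; // 16 GiB = 17_179_869_184 bytes
        let flash: u64 = 16 * 1_000_000_000;    // 16 GB  = 16_000_000_000 bytes
        assert!(ram > flash);
        println!("shortfall: {} bytes", ram - flash); // 1_179_869_184, ~1.2 GB
    }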
Clarity is important. The lack of clarity is how hundreds of years ago, every town had their own definition of a pound and a yard, and trade was filled with deception. Or even look at today with the multiple definitions of a ton, and also a US gallon versus a UK gallon. I stand by the fact that overloading kilo- to mean 1024 is the original sin.
> Another one: You buy a 16 GB (real GB) flash drive. You have 16 GiB of RAM. Oops, you can't dump your RAM to flash to hibernate, because 16 GiB > 16 GB so it won't fit.
Right, but the problem here is that RAM is marketed in different units than storage. It seems strictly worse if your 16 GB of RAM doesn't fit in your 16 GB of storage because you didn't study the historical marketing practices of these two industries, than if your 16 GiB of RAM doesn't fit in your 16 GB of storage, because at least in the second case you have something to tip you off to the fact that they're not using the same units.
> I can think of many times Ted was overly hyperbolic, but he was ultimately correct. Here is the part of the Linux project I don't like sometimes, which was recently described well in this recent thread. Being correct, or at least being subjectively correct by having extremely persuasive arguments, yet being toxic... is still toxic and unacceptable.
I want to say that I am thankful in this world that I am a truly anonymous nobody who writes code for closed-source mega-corp CRUD apps. Being a tech "public figure" (Bryan Cantrill calls it "nerd famous") sounds absolutely awful. Every little thing that you wrote on the Internet in the last 30 years is permanently recorded (!!!), then picked apart by every Tom, Dick, Harry, and Internet rando. My ego could never survive such a beating. And, yet, here we are in 2025, where Ted Ts'o continues to maintain a small mountain of file system code that makes the Linux world go "brrr".
Hot take: Do you really think you could have done better over a 30 year period? I can only answer for myself: Absolutely fucking not.
I, for one, am deeply thankful for all of Ted's hard work on Linux file systems.
There are plenty of "nerd famous" people who manage it by just not being an asshole. If you're already an asshole being "nerd famous" is going to be rough, yes, but maybe just don't be one?
I understand the challenges he's facing. But I think he put a bit too much of himself into the project, so everything is personal. When someone asks for functionality ("why no thund3rb01t OmG"), you should be able to say, without taking it personally, "a driver will take time, or a volunteer".
> I miss having free time where I can relax and not worry about the features we haven’t shipped yet. I miss making music. I miss attending jam sessions. I miss going out for dinner with my friends and family and not having to worry about how much we haven’t upstreamed. I miss being able to sit down and play a game or watch a movie without feeling guilty.
Honestly, I think working 10+ hour days, without doing other things that are less stressful and more enjoyable, took its toll (people being the biggest stressor in this regard).
They likely have PTSD at this point.
Whatever you need Marcan. I hope you find it. I'm rooting for your health and happiness.
PTSD seems a bit of a stretch in my (perhaps uneducated) opinion, but he's definitely been under a pile of stress that he's internalizing more than he should.
It seems like there's a balancing act between the benefits of writing drivers in Rust (easier, more maintainable), and getting those drivers mainlined (apparently soul-destroying, morale killing), I wonder if the Asahi team is considering simply abandoning linux in favor of something more rust friendly (redox being an obvious candidate, but maybe one of the BSDs?). Given the narrow set of hardware they're aiming to support and that they're writing many of their own drivers _anyway_ (and so are not relying as much on the large # of existing linux drivers), that approach might be more viable. I'd be surprised if the Asahi GPU work wasn't the largest problem by far that their team faces, and as such it would make sense to choose a kernel that lowers the difficulty on that aspect to the greatest degree possible.
The goal of Asahi Linux is to create a Linux distribution that is compatible with Apple devices. Using Rust is not a goal of the project, it's just something they decided to use due to personal preference, and is making the process of upstreaming anything much harder. If anything, it works against them in achieving their goal. Abandoning Rust is a possibility, abandoning Linux is not.
> Abandoning Rust is a possibility, abandoning Linux is not.
The Asahi developers have repeatedly and publicly asserted that were it not for Rust they would not have been able to achieve the level of quality required for the project, at the speed they did, with as small of a team as they have. From the article:
> Rust is the entire reason our GPU driver was able to succeed in the time it did.
Rust is just a better and more productive language than C (I guess this is a subjective statement, but obviously they would think so and I would agree with them).
Nobody ever claimed that it's impossible to write these drivers in C -- C is "Rust-complete" in the sense that you could in theory write a compiler that translates any Rust program to C.
They're just claiming that Rust allowed them to write much higher-quality code, much faster, which seems plausible.
I think neither is a possibility: there is zero appetite for rewriting what they've written in Rust in C. I think the most likely result of it not being upstreamed is that it becomes a long-lived fork.
It's sad that they chose Apple instead of investing time into the upcoming ARM laptops to make Linux more optimized on them. That talent should not be wasted on tech jewelry.
The MBP is the best laptop hardware that exists on the market, by far. Why wouldn't someone who prefers Linux over macOS want to run Linux on it?
The existence of other ARM laptops is irrelevant; the reason MBPs are so good has little to do with ARM. Yes x86 makes the processor frontend more complicated but this doesn't make a big enough difference to come close to accounting for how much better the MBP is than its competitors. I would guess the biggest factors are Apple's ability to buy the entire run of TSMC's best process node, and the fact that they have a high level of competence at designing CPU cores and other hardware. The instruction set the core uses is just not that important in comparison.
>The MBP is the best laptop hardware that exists on the market, by far.
Really?
What is so great about a locked down hardware, locked down software machine, that phones home to Apple all the time?
The only reason to get Macs is if you have a niche case of needing long battery life (most people don't, even if they say they do), but this is where the other ARM laptops are gonna also be good, without all the proprietary crap.
I expect you are rage-baiting, but just in case you are not...
Even if you consider the hardware "tech jewelry", isn't it strictly better to have a way to run Linux on it instead of sending it to landfill? Seems silly to exclude a particular set of hardware from consideration for arbitrary reasons?
Not rage baiting. I'm legitimately surprised by how many tech people hype up the MBP for no reason whatsoever. If the laptops were half the price, they would be worth it from a tech perspective considering what you get.
>isn't it strictly better to have a way to run Linux on it
In a perfect world, Apple would open-source the firmware, which would let people just compile the Linux driver for it. While the Asahi project is cool in terms of figuring stuff out, ultimately it's a lost cause because Apple will never be on board.
> Given the narrow set of hardware they're aiming to support and that they're writing many of their own drivers _anyway_ (and so are not relying as much on the large # of existing linux drivers), that approach might be more viable.
They are relying heavily on mesa. I'd also assume that GNU stuff is also pretty essential.
Perhaps Android would be possible? It has a HAL that might be easier to work with than the raw linux kernel. The android devs have put in a lot of effort to make downstream driver development not painful. With android, they'd also still have GNU stuff available.
The big issue is non-linux will mean every single open source tool may have a compatibility problem. You also end up dumping a huge amount of capabilities (like running docker containers).
Android would still come with the kernel development caveats which is where Asahi is having the most trouble. Android's HALs help abstract the userspace portion of drivers, but if you need to be in kernel space you're still stuck dealing with Linux. You could stick to just doing forks of the LTS releases, but then you're choosing between less-frequent-but-bigger merge conflicts every couple years vs. small-but-constant merge conflicts continuously.
Isn't mesa portable? Or are there parts that are OS-specific?
> With android, they'd also still have GNU stuff available.
I don't follow; Android is a non-GNU Linux distro. Or do you mean that being on Linux makes GNU stuff easy? (But then, GNU runs happily on BSDs and other unix-likes)
> Isn't mesa portable? Or are there parts that are OS-specific?
Even the OS-specific parts are at least permissively-licensed. OpenBSD is about as religious about "all new code must be under an ISC-compatible license" as it gets, and even they pull in Linux DRM/Mesa code for hardware graphics acceleration: https://man.openbsd.org/drm.7
> Isn't mesa portable? Or are there parts that are OS-specific?
IDK. I'm not familiar enough with mesa to know how portable it is. That said, I do know that it's primarily deployed on Linux. An issue with portability is simply that when big projects like mesa are developed, support for non-Linux environments is rarely a priority (no clue, for example, whether you can build mesa for BSD).
> Or do you mean that being on Linux makes GNU stuff easy?
Mostly this. I don't think, for example, those GNU tools will port over to redox. Building them targeting android is a snap.
> I wonder if the Asahi team is considering simply abandoning linux in favor of something more rust friendly
The entire point of Asahi is to run Linux on macOS (edit: on Mac hardware, not macOS). If they did what you’re suggesting it would be a completely different project.
Well, I wonder if this is a good time for people to reconsider what they actually want out of Asahi. Things that I'm sure are on the list are open source, able to run the tools they want (standard gnu userland?), docker, maybe gnome/kde? I am not convinced that the linux kernel specifically is on that list.
Redox would probably be the best option. With the BSDs, it would probably be an uphill fight, too. I believe it's been floated, but there's been no movement on incorporating Rust in the BSD kernels, so they would have to start from scratch. The benefit of Linux in this case is that the Asahi team isn't single-handedly doing all the Rust in the Linux kernel; there are other Rust people, and Rust for Linux was already getting somewhat bootstrapped before the Asahi project. With the BSDs, you would have to start with bootstrapping Rust in the kernels and build systems.
FreeBSD may be open to it? It's been a while, and I haven't kept up to date on it for a year or two. But once again, I think you'd have to start from scratch, so everything for R4L that was built before Asahi Linux would need to be done on the FreeBSD side.
NetBSD is probably a no-go. NetBSD supports architectures that Rust (due to LLVM) can't support. NetBSD's schtick is that it can run on anything, and they will do everything in their power to make sure NetBSD can run on any hardware and be maintained. Hardware portability matters to them.
The attitude I've seen from OpenBSD devs is, the answer is to 'git gud' at C and, not replace C code with Rust. Or in other words, they have no interest in Rust in the OpenBSD kernel.
I don't really know where DragonFlyBSD falls in this. Its the BSD I know the least about.
> For a long time, well after we had a stable release, people kept claiming Asahi Linux and Fedora Asahi Remix in particular were “alpha” and “unstable” and “not suitable for a daily driver”
This is still my position on Asahi Linux: that it is not something that I would use as a daily driver nor recommend to others for use as a daily driver.
> “When is Thunderbolt coming?” “Asahi is useless to me until I can use monitors over USB-C” “The battery life sucks compared to macOS” (nobody ever complained when compared to x86 laptops…) “I can’t even check my CPU temperature” (yes, I seriously got that one).
These would be dealbreakers for me, too. To be clear, I am not saying that it is anyone's job to fix these issues for me. And this isn't meant as an attack on the Asahi Linux team - I think it's incredible what they have been able to do.
But those comments, without any larger context to demonstrate harassment or anything like that, just don't seem too bad to me. The language could be softened a bit, sure, but the criticisms themselves resonate with me and would be valid reasons to not use Asahi Linux IMO.
I feel like Asahi is pretty nice as a lightweight daily driver for web browsing, web development, and stuff like that. Obviously, heavy duty tasks like gaming, CAD, photo/video editing, etc are not quite there yet. I bought an M2 Macbook Air and run Asahi Linux full time and it's surprisingly smooth and bug-free (even smoother than my Arch Linux desktop tbh).
i don't think anyone on the asahi team doesn't know that there are missing functionalities. if they're dealbreakers for anyone, fine.
what's out of line is incessant reporting (via issues, emails, whatever) of what you consider a dealbreaker. that's my impression of what he's complaining about. let the people work. no one likes to respond "not yet" a billion times.
Is it incessant? I would expect the average Asahi Linux user to know how to search GitHub issues and just upvote a request for a certain feature. If many people are creating noise that isn't great, however understanding which functionalities people want most is helpful for deciding priorities.
God bless. Asahi introduced me to fedora/gnome, it feels rock solid on my m1 MacBook, and it's now my daily driver on a 2014 Intel Mac mini
Wishing I had donated before, I'll sign up for opencollective now. I can only imagine the anticlimactic nature of releasing the emulation stack for gaming [0] and not seeing any increase in interest financially. One wonders what funding might have made it more worthwhile than simply passing the hat.
OT, Have you had any issues with WiFi setup with this Mac Mini? I tried multiple distributions and most of them have troubles with detecting WiFi chip on 2014 mini.
Oh, that could be; I just have it hardwired. This machine was basically a dumpster dive and was sitting in my closet for the last few years til I found out it had USB3 and gigabit ethernet. I did an SSD swap and it's working great, but indeed I get "No Wi-Fi Adapter Found".
Open source can be brutal, especially with larger and well established projects.
I contribute to several projects as a well recognized person in my field, not at their scale, but everything they say rings true.
Established developers often push back extremely hard on anything new, until and unless it aligns with their current goals. I’ve had maintainers shut me down without hearing out the merits, only to come back a year later when whatever company they work for suddenly sees it as important.
Project leads who will shift goalposts to avoid confronting the clear hostility their deputies show.
I’ve had OSS users call my personal number, or harass me over email, for not having their pet interest prioritized over everything else. Often that’s because I’m blocked by the maintainers.
Open source can be extremely brutal and it’s a battle of stamina and politics as much as it’s one of technical merit.
This is not something specific to open source. Unfortunately if you want to be well-known person who works on well-known project you must either ignore all the shit thrown at you altogether or you must be very very resilient. When you react to attacks on internet you will be attacked, often.
And while I appreciate Marcan's work a lot, he is also partially responsible, because he himself often jumped on the bandwagon attacking other people in exactly the same way.
There’s a significant difference between open source and proprietary software.
With proprietary software you usually have a corporate mandate, a goal etc to achieve. Any new tech is achieved as part of that drive. You can get people on board or not based on that, and once you’ve decided, there is someone to answer to if you can’t deliver.
Open source doesn’t have that. A project can go in twenty different directions at once, you can say you all agree to something and then have people sabotage it without being answerable to anyone.
Does that make open source worse? No. It’s the trade off for being open, which is extremely valuable but it is a very different push in terms of a product.
For me it wasn't about open source vs proprietary software. I just wanted to say that in areas like game development, online entertainment, or really anything that requires interacting with big communities of people on the internet, there is no way to avoid attacks on yourself.
So leading a well-known open source project is politics on a small scale, and there will be a lot of people who want to hurt or manipulate you.
If you decide to become a public person and want to have fans and supporters then be ready to have haters as well.
> primarily due to the very large fraction of entitled users
I think anyone working on serious open source projects just needs to learn to ignore those users. I definitely would have the attitude of "I'm perfectly fine if no one uses my product" and have a lot of fun banning entitled users left and right.
I see a lot of FOSS maintainers continue to engage and defend themselves against people who have demonstrated themselves as unwilling to contribute in any way, yet expect free work to be done for them. I wish more open source devs would keep in mind that FOSS work is a gift they're sending out into the world, and it's a common good that anyone can contribute to. That is not to say ignore all criticism or user requests, just that you hold absolutely no responsibility to placate emotionally draining people - the project is just as much their responsibility as yours.
Yeah, 100%. Large successful FOSS eventually needs someone like Linus who can just brush away arguments that can bog down the project. It's almost like a military operation -- armies need to move forward instead of bogging down. And in the case of Linux that's perhaps even more true, because the world literally runs on it.
I don't agree with Linus all the time (mostly because I don't have the technical knowledge to judge), but I 100% agree with his attitude. I hope other large FOSS project maintainers have the same mindset.
Easier said than done. If you care about users, and then realize most of them are jerks, it’s deflating. Maybe the secret is to not care about users, but the risk is that you end up doing self-gratifying work that leads nowhere.
Not exactly the same, but being a mod made me realize this too: people will 100% treat you like a whole other being, in a dehumanizing way. If you're just someone passionate about the topic, you get whiplash at first.
Yeah, I agree. Maybe that's why I'm not doing great work -- I simply don't care about users. But again, you don't have to care about all users. You only need to care about the ones that have the same mindset. That's why I think working in smaller communities is better. Asahi Linux and Rust in Linux are the worst projects from that perspective, because they both try to touch too many users.
But again, maybe they can hire someone like me, whose sole job is to block the very worst entitled users.
To me, unless there is an exchange of money and a signed statement of work, I have absolutely zero entitlement as a user. And the maintainers are well within their rights (by law and social contract) to tell me to go away. I wish this was a more adopted mindset.
It’s why things like CentOS being abandoned, the Terraform licensing change, et al. never bothered me. I’m not paying them, so :shrug:.
I read this entire post because the man deserves not only to have his say but also to have his say listened to.
That said, I detect a lot of one sided thinking in his post. He took on an incredibly difficult challenge, faced huge opposition, made incredible technical accomplishments and he feels entitled to a standing ovation. When what he receives is criticism, entitlement and obstructionism he takes it personally. If he did all of this work hoping to get accolades, fame, clout, influence then he did it for the wrong reason. There is a mismatch between his expectations and the reality of the world.
In the best of worlds, we do the right thing because it is the right thing, not because we hope for a pat on the back once it is done. In dharmic religions (e.g. Buddhism), one of the principal mental states one is to aim for is detachment from the outcomes of our actions. Suffering is attachment, and Martin is clearly suffering. The other thing in Buddhism is recognizing the suffering in others, and I see a distinct lack of that recognition in Martin's post here. He acknowledges his own abrasiveness, but not once does he show compassion for the maintainers who have suffered everything he has suffered, perhaps even from actions Martin himself has taken.
Martin has several outcomes he wants, mostly his changes (including the inclusion of Rust) being welcomed in the Linux kernel. He is attached to those outcomes and therefore takes it personally when those outcomes are not achieved. Taking a step away from this attachment is a very good step. IMO, his desire to push for these outcomes has been a significant contribution to the toxicity.
I've worked with plenty of talented engineers who behave like this. It takes a certain psychology to achieve incredible technical feats, but it often comes at the cost of difficulty playing well with others.
Add these personalities together, where everybody believes they are right and everybody else is wrong, and it's a recipe for disaster.
People have to learn to adapt to change, or they get burnt out continually hitting the brick wall.
This is unfortunate, but before pointing fingers, it's worth putting things into perspective.
The linked "thin blue line" message[1] also says this:
> One of the things which gets very frustrating from the maintainer's perspective is development teams that are only interested in their pet feature, and we know, through very bitter experience, that 95+% of the time, once the code is accepted, the engineers which contribute the code will disappear, never to be seen again. As a result, a very common dynamic is that maintainers will exercise the one and only power which they have --- which is to refuse to accept code until it is pretty much perfect --- since once we accept the code, we instantly lose all leverage, and the contributors will disappear, and we will be left with the responsibility of cleaning up the mess. (And once there are users, we can't even rip out the code, since that would be a user-visible regression.)
Which seems very reasonable. Maintainers shouldn't be expected to support a feature indefinitely just because a 3rd party is interested in upstreaming it. In the case of Rust for Linux and the Asahi project specifically, I imagine this would entail a much larger effort than any other contribution. So just based on this alone, the bar for entry for Asahi-related features should be much higher.
Perhaps this is ultimately a failure of leadership as TFA claims, but it would be foolish to blame the Linux maintainers or its development process, and take sides either way. Maybe what Asahi is trying to accomplish just isn't a good fit for mainline Linux, and they would be better served by maintaining a hard fork, or developing their own kernel.
I don't really follow what's objectionable about Ted's email, or why it's being singled out. It matches my experience as an open source maintainer pretty accurately. It's also pretty constructive (it goes on to lay out a plan on how to constructively proceed and make everyone happy).
One would hope it was just a particularly bad gaffe, but it could also be an insight into how he actually views himself as a maintainer which is not great.
It is most certainly not politically charged language. It's an anodyne statement referring to being a small force keeping bad things from happening. That's all.
The expression exists in only two colour variants, blue and (the original) red. If he'd used the other, you'd probably have pointed out what a colonialist Victorian asshole Ts'o is, for identifying with the 19th-century British army?
The expression itself is inherently quite generic; all the claims about it being "specifically about" anything are just people reading stuff into it.
While words change and it rapidly escalated over the last 10 years, it's been about police brutality/misconduct for at least 50 years now.
Using phrases whose meaning you're unaware of is generally risky behavior. Trying to claim it meant something other than what the common lexicon says it means is just asinine.
Maybe instead of assuming that the person who used the phrase intentionally wanted to politically charge their message with 50 years of US-centric baggage, it would be safer to assume that they were unaware of the political context and/or made a mistake. Or, you know, ask the person to clarify what they meant before jumping to conclusions.
Why this is being brought up in a discussion about Linux is beyond me. Context and nuance matter.
Tangentially: this is a problem with "cancel culture" in general. The mob is willing to lynch people at the mere mention of something they find objectionable. That's what's asinine.
Eh, right. The "far right" seem to appropriate every other thing these days. I can't keep up.
In the context of a long-term good faith maintainer in what is clearly a constructive good faith email, assigning bad faith meaning to a simple phrase is in itself a bad faith action IMHO.
> Eh, right. The "far right" seem to appropriate every other thing these days. I can't keep up.
Is it really appropriating anything, though, in this case? Seems this is more people ascribing the expression to a particular political camp in order to taint Ts'o by association with said camp.
(Sorry, if you were being ironic or sarcastic in your wording, I failed to pick it up.)
> Then 2024 happened. Last year was incredibly tumultuous for me due to personal reasons which I won’t go into detail about. Suffice it to say, I ended up traveling for most of the year, all the while having to handle various abusers and stalkers who harassed and attacked me and my family (and continue to do so).
This is _not_ ok in any form, what the actual hell?
It's hard to know what exactly he means by this. It sounds pretty bad, but this guy does seem to attract and develop a fair bit of drama, and this could be somewhat exaggerated.
It very likely is. By "abusers" he likely means "people that are rude to him online" and by "stalkers" the same. Nobody is following him around IRL. He probably is referring mainly to the Kiwifarms people that get a kick out of documenting the downwards spiral of someone like Hector.
For clarity, it isn't that. He's referring to a single specific person who engaged in stalking and personally harassing behavior unrelated to the projects. The receipts are all over, you can find them if you insist.
It's not okay but he's giving them ammunition by mentioning them and complaining about them. It proves that it's working.
When I started getting harassed in 2022 over an ill advised post on a site, the only thing that stopped it was to sandbag everything and rethink how I interacted on the internet.
I'm not blaming the victims for anything. But I am absolutely saying that you need to modify your behavior in order to get what you want oftentimes. Suggesting that people make even the smallest changes to their lives in 2025 is apparently victim blaming.
I should have been clearer, I'm specifically replying to this statement: "... he's giving them ammunition by mentioning them and complaining about them. It proves that it's working."
Are we supposed to not speak publicly about being harassed? I feel like a culture of silence only allows bullying to perpetuate, and puts the onus on the victim to change their behaviour, and not the harasser's.
In general, yes. But in the case of obsessive stalkers, I've heard that the standard advice is to "go radio silence", because their pattern if you ever break down and reply is to think "now I know how many times I need to harass to get a reply" and increase their efforts to get through that count faster. So it ends up escalating exponentially.
Of course, ideally some authority should stop the harasser. But you probably don't want to wait until it gets severe enough for that to happen.
They pretty much straight up confirm the identity of Asahi Lina, so enough with the gaslighting that merely mentioning this (and having a totally reasonable discussion on contributor persona policies and sockpuppeting, which is what this is) is somehow unethical doxxing.
I have to admit I tend to get drawn in by dramas like this; it is a fault of mine, but it is what it is.
So anyway, I dug into the other party involved in this, i.e. the accused, and it looks like there is more to this story than what Lina lets on in her Google Doc linked here; perhaps there is another side. I am not going to go into details here, but if you are interested, it's not difficult to find her on Twitter or Bluesky. (And if you do choose to do so, please don't bother her; I don't want to cause her any harm by mentioning her here.)
Asahi Lina is almost certainly marcan's vtuber persona. This has been discussed for quite a long time on HN, reddit, etc, especially after he started interacting with his own sockpuppet on GitHub.
> The project itself is popular, but HN allows people to talk about the fairly compelling circumstantial evidence that Asahi Lina is marcan's alter ego. Per previous discussion, they have /home/marcan and /home/lina on the same box [0], have the same hostname [1], and have similar accents and speaking patterns [2]. Marcan is free to do this, but it's completely bizarre behavior acted out in public which is now impacting the actual Asahi project. Doing v-tubing under a pseudonym is one thing, but maintaining a sockpuppet contributor on a major open source project and pretending to interact with it is a giant red flag.
I mean it was pretty obvious from the get-go when Marcan lent his Twitter account to Asahi Lina to promote their launch day. Who entrusts their account credentials to a stranger?
Hector, I know you deleted your account here, but on the off chance you browse by - thank you for all of your hard work and knowledge sharing on this over the years. Hope you find time to take a long, well-deserved 'mental health break' and destress.
Yes, I second that! Thank you for the unbelievable amount of work you did. I would have never thought that it was even possible to do without vendor support. You made history, and it won't be forgotten or disappear. Now, please take a rest and enjoy life and other hobbies. I wish you the best!
Marcan brings up plenty of good points regarding contributing to kernel.org being stuck in the 1990s.
However, he's got no social skills nor does he have what it takes to man up and understand he won't get his way.
Additionally I doubt that he really is dealing with stalkers to the degree that he is implying; real people don't talk about their stalkers so much. When I was stalked and harassed I kept the details light and didn't provide much in the way of actual community details because I went to the FBI and local police to deal with it. And yes a few people not only got no contact orders, but lost jobs, families, and more over their exposure.
Marcan is extremely talented, but talent doesn't equal "I get to get my way" all the time. This idea that conflicts need to be resolved quickly and in favor of a golden boy is a millennial/zoomer issue.
I don't doubt him about the stalking problem (by online trolls, not necessarily IRL). He was a prominent voice in the trans supporter community and was regularly attacked by Kiwifarms.
Well, if anyone broke the law or harmed him he needs to go to law enforcement or the FBI. Outside of that, just get used to be thrown abuse of all kinds. If you get tagged by a group like that the more you talk about them the more you give them ammo.
We don't know if he's gone to the cops, but more often than not the police will not be able to help you in these sorts of situations. The way that Kiwifarms target people is incredibly hard to stop and I don't think they particularly need ammo; the way to not "give them ammo" would be to stop standing up for trans rights, and the fact that he hasn't is commendable.
I replied to another commenter about this, but these sorts of things are basically part and parcel of being a public figure.
If you can't handle that, you need to sandbag. There's no other way around it. It is unlikely that Marcan will stop being the center of attention wherever he goes.
If he wants to stand up for what he believes to be right, he shouldn't have a problem with the consequences of dealing with people who disagree with him, sometimes virulently.
Trans issues are very contentious right now, and in my opinion the only way to win that type of situation is to not play. That doesn't mean I don't respect those people, but taking large public stances just puts a target on your back.
> If he wants to stand up for what he believes to be right, he shouldn't have a problem with the consequences of dealing with people who disagree with him, sometimes virulently.
This sounds like victim blaming. I suspect very few people who take a stand are truly prepared for years of abuse, even if they think they are. No one has perfect knowledge of the future.
It's intended that you understand the consequences of your actions. If you're a first-order thinker who can't think past the first step, then you've got to be way more cautious with your life.
The cops are fucking useless. I got death threats, bomb threats and a host of other shit and it's rare that the cops ever find the actor behind it. There is one particular "bulletproof" email hoster (the one with a bunch of very offensive and trolling domain names) that does absolutely zero cooperation with the police, and even if they did, it's likely that the other side uses Tor and the investigation ends there.
And if the other side is Kiwifarms and its associated offspring, the cops can't do anything at all. These guys are the utterly perfect storm - technically extremely competent but socially they're highly deranged narcissists.
> real people don't talk about their stalkers so much. When I was stalked and harassed I kept the details light and didn't provide much in the way of actual community details because I went to the FBI and local police to deal with it
You actually provided more details here than he did, so I guess that's not true.
>> Suffice it to say, I ended up traveling for most of the year, all the while having to handle various abusers and stalkers who harassed and attacked me and my family (and continue to do so).
If anything, I am more doubtful you had stalkers since you are also in this thread saying this[1]. What an unsympathetic reply. You certainly don't come off like you're aware of the stress and fear that can bring if your answer to other people being stalked is "rethink your life choices."
> Sounds to me like you just need to take a break from the internet friend. If you're that high profile that you're getting hit over and over, maybe you need to rethink your life choices. If it bothers you that much especially.
And getting emotionally involved. As Johnny Knoxville said: when you get emotional you become irrational, and when you become irrational, everything is real to you.
> This idea that conflicts need to be resolved quickly and in the favor of a golden boy is a millennial/zoomer issue.
This is a strawman with no support behind it in the actual blog post. Marcan's issue wasn't that he wasn't "getting his way", his issue (from his viewpoint) was with Linus and co. claiming to support Rust on Linux while doing nothing to aid in adoption and in the case of some maintainers, actively sabotaging its implementation.
If you're going to do something, then do something. Don't say you're going to do something, then do nothing while your underlings actually do the opposite.
If you actually follow the exchange, he threw temper tantrums until he got suspended off Mastodon. I don't know, but that's not what I would expect from a professional developer.
I'm not sure why they are upset about the issue of upstreaming the changes into the kernel.
They want to upstream drivers for a device whose creator clearly has no interest in allowing others to use it outside of their walled garden. The knowledge around it comes from a massive, albeit impressive, RE effort.
Who is going to support it? Where is the demand for it? It would be different if Apple were to provide patches and drivers for their own hardware, at least then you know there is a vested interest in supporting the hardware from the people who know it better than anyone else and have the means to continue supporting it.
I applaud Hector and everyone else who contributes to Asahi; it's genuinely a cool project, and the progress they have made is insanely impressive given the lack of any official documentation about the hardware they are working on. But it's one of those things that will remain in the realm of a cool curiosity, much like running Linux on a games console.
You should click through some of the links to see where the clashes actually happened. It didn't have anything to do with the actual hardware support. Rather, it was over stupid shit like the kernel DMA team throwing a hissy fit over the idea of there being Rust bindings for the DMA interface: https://lore.kernel.org/lkml/20250108122825.136021-3-abdiel....
When the response to such a small contribution is just "No rust code in kernel/dma, please", with a complete shutdown of any attempt to discuss alternatives, it's kinda pointless. Even though Rust is supposed to be an allowed language in the kernel now, with blessing from Linus himself, there are apparently submaintainers of critical, highly shared infrastructure who outright refuse it.
So this has nothing to do with "who will own the Apple drivers?!" but just the rest of the kernel going "your integration layers are an affront to our Holy C, begone religious heretics!"
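To make concrete what was actually being proposed (for anyone who hasn't clicked through): the contested code was an abstraction layer living on the Rust side, calling into the unchanged C API. What follows is a hypothetical, heavily simplified sketch, not the actual patch; `CoherentBuffer` is an invented name, and `bindings` stands in for the bindgen-generated bindings Rust-for-Linux code calls C through. Only `dma_alloc_coherent`/`dma_free_coherent` are real kernel functions here.

    // Hypothetical sketch (not the actual patch): a Rust RAII wrapper
    // around the existing C DMA API. The C side is untouched; the Rust
    // side forwards to it and guarantees cleanup via Drop.
    pub struct CoherentBuffer {
        cpu_addr: *mut core::ffi::c_void,
        dma_handle: bindings::dma_addr_t,
        size: usize,
        dev: *mut bindings::device,
    }

    impl CoherentBuffer {
        /// Returns None on allocation failure instead of a NULL pointer.
        pub fn alloc(dev: *mut bindings::device, size: usize) -> Option<Self> {
            let mut dma_handle: bindings::dma_addr_t = 0;
            // SAFETY: `dev` must be a valid device pointer; this merely
            // forwards to the existing C allocator.
            let cpu_addr = unsafe {
                bindings::dma_alloc_coherent(dev, size, &mut dma_handle, bindings::GFP_KERNEL)
            };
            if cpu_addr.is_null() {
                None
            } else {
                Some(Self { cpu_addr, dma_handle, size, dev })
            }
        }
    }

    impl Drop for CoherentBuffer {
        fn drop(&mut self) {
            // SAFETY: frees exactly what alloc() obtained, so safe code
            // cannot leak or double-free the buffer.
            unsafe {
                bindings::dma_free_coherent(self.dev, self.size, self.cpu_addr, self.dma_handle)
            }
        }
    }

The disagreement was never about changing kernel/dma itself; it was about whether a Rust caller like this may exist in-tree at all.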
"No rust code in kernel/dma, please" is a totally reasonable request though, once again it comes down to who supports it? When 99% of the kernel is written in C it makes sense to keep it C.
If you start introducing new languages that most maintainers are not anywhere near as familiar with, you create an unmaintainable mess.
Linux isn't a hobby project anymore; it's critical infrastructure. You can't just introduce changes because a few people think they're cool.
Agreed, supporting Apple devices is going to be a maintenance nightmare, as it goes against the wishes of Apple. At best, Apple inadvertently makes backwards-incompatible changes which break the drivers, and at worst Apple specifically sabotages the drivers to keep people in the walled garden.
This may be controversial, but you also don’t have a right to merge code into the kernel. If the maintainers don’t want Rust code, then you should write your drivers in C. And if you don’t like that, you can maintain your own kernel tree in Rust and take on the maintenance burden.
> Agreed, supporting Apple devices is going to be a maintenance nightmare, as it goes against the wishes of Apple. At best, Apple inadvertently makes backwards-incompatible changes which break the drivers, and at worst Apple specifically sabotages the drivers to keep people in the walled garden.
Apple explicitly chose to provide a way to boot third-party operating systems when designing how the M-series SoC boots. Parts of their SoC design date back, AFAIK, to the very first iPod SoCs.
I would understand that attitude if someone wished to, say, upstream code for PlayStations or other game consoles because that is a bunch of fights waiting to happen, but Apple hasn't made any move directly against FOSS OSes on their computers in the past and there is no reason to believe that will change.
Great, so you boot your custom OS, and you can... I guess display a terminal and maybe talk to the disk? How do you use the hardware without drivers? Did Apple document their hardware interfaces?
The fact that Apple re-uses components from a very long time ago in their SoCs is what enabled Asahi Linux to get where it is in the timeframe it took with the staff they had. Obviously marcan, lina and the others involved are damn geniuses, but even they built on the research of others, like iPodLinux, which laid foundational groundwork.
But the comment you replied to argues that maintaining those drivers is a nightmare because of lack of cooperation from the manufacturer. What you're saying is that Apple hasn't had the inclination to rip up the floor from under them, but that's an inherently unstable situation that could change at any time.
> Apple explicitly chose to provide a way to boot third-party operating systems
And they explicitly chose against a UEFI interface like prior Macs, which would have actually enabled proper Linux support. Now you have poor souls trying to reverse-engineer a devicetree from scratch to get basic features to kinda work, emulating hardware features in software and working with no documentation from Apple. They "explicitly" chose to expose iBoot because otherwise you wouldn't be able to reinstall macOS in a full data-loss situation.
By comparison - reverse-engineering an unsupported AMD or Intel CPU would at least give you some basis to work off of. You have UEFI as standard, ACPI tables provided by hardware, even CPU documentation and open-source drivers to work off of most of the time. Asahi shot themselves in the foot by trying to support hardware that doesn't support them back. You can argue that Apple was conspiring to help, but we have no actual evidence of that.
> Their SoC stuff dates in some components AFAIK back to the very first iPod SoCs in its design.
And none of those platforms ever got proper Linux support either. I love Linux as much as the next nerd, but it doesn't seem wild to suggest that Apple Silicon will never have feature-complete Linux support. And for many people, maybe that's okay!
> a device that the creator of clearly has no interest in allowing others to use outside of their walled garden
This doesn’t seem accurate.
> In macOS 12.1, Apple has added the ability to directly boot a raw image instead of a Mach-O, even though Apple has absolutely no use for this functionality. According to Hector Martin (Asahi Linux developer), making things easier for Linux developers is the only known reason Apple would have added this.
"I'd absolutely love to have one, if it just ran Linux.. [...] I've been waiting for an ARM laptop that can run Linux for a long time. The new Air would be almost perfect, except for the OS."
Seems like the same person has even used Asahi to make a Linux kernel release[1][2].
But Linux presumably doesn't have the resources to go chasing hardware platform support based on the whims of a singular Linux kernel developer/maintainer/creator.
I also think it would be cool to run Linux on my MacBook. But that alone isn't reason enough to add support for it.
Linux runs on over 90% of all production servers on the planet; think about that. If you introduce a change, it needs to work and be maintained. If you add something but later realise you don't have the resources to maintain it, you can't remove it. You have created more work for yourself. Relevant xkcd: https://xkcd.com/1172/
This is quite literally the story of nouveau. The driver exists because people need it, and the driver is maintained by the people who wrote it. I don't see why the same doesn't apply to Linux on Apple Silicon.
Linux on game consoles might be a small part of the market, but it's a reality. There's a fair number of manufacturers, devices, and people buying them.
Marcan is a legend, it's a shame it ended like this. The communities around certain software technologies can act incredibly hostile and entitled towards developers, paradoxically even more so for free and open source projects. I've seen this many times happen in audio production and game emulation software.
But Marcan clearly has true hacker spirit; I'd wager we'll see him again in the future with an equally cool project. It's often best if the visionaries just spend their efforts getting the ball rolling and then let the project evolve on its own as they move on to their next challenge.
Hard to call what's going on a "minor disagreement" though. And that's the problem - some people just stonewall and the leadership refuses to do anything about it.
One aspect of all of this which I haven't seen directly addressed[0] is the question of what might be going on with Torvalds and the Rust Foundation from a longer term, dare I say "political" standpoint. Torvalds seems to usually position himself as a kind of anti-political, code-is-code type, but then, perhaps there's more to him and to this story?
There's an awful lot of money and power associated with operating systems and programming languages (obviously), and the resulting "realpolitik" of situations like these seem to get swallowed up in these discussions.
It makes sense for technical people to think that the technical debate is what essentially matters, but it usually never actually is.
I've found the way Linux has approached Rust in the last couple of years to be a tad confusing. Torvalds usually cuts a hard line, yet here his opinion is quite wishy-washy. Oh, we'll try it, who knows, what's the worst that can happen, type thing? What induced this change, one wonders.
[0] Well-written blog posts on the subject are very welcome, please share if you know one!
I've heard "Linus Torvalds has refused to decide whether to accept or reject Rust, and this is a leadership failure".
Maybe I don't know the situation well enough, but I think he's not deciding because he has no idea. The "wrong" option may create far more consequences than the "right" one (so he can't e.g. flip a coin), but he has no idea which is "right".
Torvalds has spent so long working with C that even if he went out of his way to learn Rust, he'd never have enough experience in both to take an unbiased view. Perhaps he's hoping people who are younger and have more balanced experience, but who are still smart and experienced, will shift the project towards the correct decision. Unfortunately that's not happening, because as a solid BDFL, he's the only one who can make a shift of that size. But either shift could be a huge stain on Linux's history, and he has no idea which, so he's stuck in a horrible conundrum.
If that's the case, keeping in mind that Linux is Torvalds's life's work, even if doing nothing is itself a bad choice, I don't blame him.
Regardless, since Torvalds can't decide, I think the community has to come together and 1) agree on a consensus mechanism then reach a consensus (e.g. vote on it), or 2) come up with a magic third option which Torvalds can accept*.
* e.g. "integrate Rust into the kernel, but ensure that all future Rust compatibility issues are handled by Rust developers, even if they are caused by C code changes, through the commitment of a team of general-purpose Rust developers large enough relative to the number of C developers". I don't know if one can really ensure there are enough Rust developers to not block the C developers, but Rust is very popular and maybe growing faster than C, so maybe it's possible.
> "but ensure that all future Rust compatibility issues are handled by Rust developers"
But how? One of these:
- The C developer needs to know enough Rust to know which changes will affect Rust, contact the Rust developers and explain the change and wait. Extra work and delay they do not want.
- The C developer does not need to know any Rust, but must be doing Rust builds, and if something breaks then contact the Rust developers and explain the change and wait. Extra work and delay they do not want.
- The C developer needs to explain every change to the Rust developers in case it might break, before they can do it. Extra work they do not want.
- The C developer ignores Rust, and the Rust developers must work with unpredictable shifting sands. This works for a while and the Rust codebase becomes larger and new Rust developers join. This is an unstable configuration. The larger the Rust codebase, the more disruptive any change could be and the new Rust developers will feel the changes are malicious, capricious, and will demand this stops and the changes are planned and communicated and the ground stabilises, or they get fed up and quit. Leading back to one of the above points - C developers either need to maintain the abandoned Rust code, or they need to do extra work of tracking, coordinating, explaining, running changes through another team and waiting for them, extra work they do not want, or they can't make certain changes in the C code base, a limitation they don't want.
Rust developers saying "we will do all the work" isn't enough because of the meta-work of organising and communicating the work.
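To make that meta-work concrete, here's a minimal hypothetical sketch; `foo_dev` and `foo_enable` are invented names (not real kernel APIs), `bindings` stands in for the bindgen-generated bindings over the C headers, and `Error::from_errno` follows the Rust-for-Linux kernel crate's convention:

    // Suppose the C side declares:
    //     struct foo_dev { u32 flags; /* ... */ };
    //     int foo_enable(struct foo_dev *dev);
    // and the Rust side wraps it:
    pub struct FooDev {
        // Pointer type generated from the C header by bindgen.
        raw: *mut bindings::foo_dev,
    }

    impl FooDev {
        /// Safe wrapper that turns the C errno convention into a Result.
        pub fn enable(&mut self) -> Result<(), Error> {
            // SAFETY: `raw` is a valid, exclusively owned device pointer.
            let ret = unsafe { bindings::foo_enable(self.raw) };
            if ret == 0 { Ok(()) } else { Err(Error::from_errno(ret)) }
        }
    }

If a C developer renames foo_enable(), changes its return convention, or reshuffles the struct, this wrapper stops compiling (with generated bindings) or silently drifts (with hand-written ones). Someone still has to notice, understand the C change, and update the Rust side; deciding who that someone is, and when, is exactly the coordination cost described above.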
Yeah, I don't think there's a solution, otherwise someone would've thought of it.
I'm now thinking the solution is "no Rust in the kernel, but we promise to revisit in X years" (then if in X years the picture isn't clearer, revisit in X more years). As someone who greatly prefers Rust, it's unfortunate, but the alternative adds lots of complexity (IMO more than Rust's type system removes) and that's too big an issue.
Moreover, the would-be-contributors who use Rust can (and IMO should) unite and fork the project, creating a "Rusty-Linux" which would be the base for many distros like Linux is the base for all distros. If the fork ends up higher-quality than Linux, especially if Rust adoption keeps growing and C usage starts shrinking, in X years (or 2X, or 3X, etc.) Rust will be so clearly beneficial that nay-sayers will be overpowered (or even convinced), and Rusty-Linux will become Linux.
I can't agree more. Linus' position here is not dissimilar to a CEO's. He's trying to deal with R4L using his (allegedly) 100% technical and 0% political perspective, but in this case it simply doesn't work. R4L is a political challenge that requires political discussions: who decides? What's the bar? Who maintains what? Etc.
By not answering these questions and saying he doesn't want to have anything to do with the arguments, Linus has simply decided not to solve the problem that only he can solve. The result is clear: R4L will fail if Linus decides that any maintainer can stop the "cancer" from spreading by blocking Rust changes.
R4L implies that Rust will be present in the kernel and will need to be maintained. If Linus is ok with maintainers that have a deep/fundamental problem maintaining/coordinating the maintenance of Rust code, R4L will never happen.
Reading this I cannot help thinking about what Linus Torvalds said more than 20 years ago[1] in an interview.
When asked if he feared competition for Linux his answer was that few liked writing device drivers and as long as no one "young and hungry" came along that could write device drivers and liked it he'd be safe.
There you have him, Hector Martin, young[2] and hungry and loves writing drivers. No surprise he clashes with the old guard.
[1] I don't remember the date, but it was still on analog TV, so definitely more than 20 years ago.
[2] At least a different generation from Linus and could easily be his son. Young enough for generational conflict at any rate.
But this conflict isn't about competing with Linux, it's about having a vision for what Linux should be that clashes with the core maintainers' vision.
First, maybe Linux is just bound to always be a C-only project. Linus Torvalds infamously dislikes C++; it's sorta odd he didn't shut down Rust for Linux in the first place. Redox is on its way...
Second, there are multiple types of compensation. I think the author was probably looking to be compensated in validation from others. Maybe if Linus Torvalds had replied to his email, the author would be more inclined to continue.
However, I can't be mad at someone for deciding how they want to spend their time. You only have so many hours in the day.
Would be cool if Qualcomm hired Marcan and worked with an OEM to roll out a series of Arm Linux laptops. That's what we ultimately want.
Marcan had a whole rant[0] in the thread that started all of this about kernel people being paid by corporations instead of being freelance like him. I'm not sure he wants to work for a corporation.
I don't think that "rant" indicates in any way that he wouldn't want to be paid for his work (even by a corporation). In fact, he directly calls it a luxury. He is simply pointing out that people put up with different things based on whether they get paid or not, and that the current kernel workflow may work for people whose job it is to interact with it, but not necessarily for people who aren't being paid.
C++ garnered a similarly enthusiastic following in the 1990s as Rust has in recent years. Not only was it “a better C”, but it made it possible to build safe and zero-cost abstractions with a rich type system in a way that C couldn’t. (At least that’s what many C++ programmers believed then. It turned out to not be quite zero-cost after all, and exception safety proved to be highly non-trivial.) Details aside, there is a large overlap in mindset between those favoring Rust today and those favoring C++ in the 1990s. And the complexity of many Rust libraries today is eerily reminiscent of the complexity of C++ template-based libraries back then. So I can see why someone might find it odd.
Okay, that is a fair point. I think the difference though is that while C++ indeed mitigated some of the drawbacks of C, it did so while also introducing a huge amount of additional complexity and dubious features, which you could also argue Rust does, but I think not nearly to the same extent.
Hindsight is 20/20. It wasn’t perceived that way back then. Every C++ feature had a well-reasoned rationale of solving an important problem. (I found the book “The Design and Evolution of C++” worth reading, although that was some time ago.) The only handicap requiring compromises was backwards compatibility with C, but that was also a major reason why it became so successful.
I’m sure that Rust will continue to become more complex, and in 20–30 years another systems language will likely come along that is at least as safe while being easier to work with again.
I can't find the exact link now (very well might have been a video without a searchable transcript), but I recall someone asking Linus that (specifically re his previous comments on C++), and his answer was something like he saw Rust solving a problem that C genuinely does not. It's controversial, but I do think of C++, Zig, etc as solving the "same problems" as C (perhaps in a much nicer way)
His particular opinion doesn't matter if the rest of the old guard only wants C in the kernel.
A lot of pain and drama could have been prevented if he had put his foot down and said only C code is allowed. Instead he left it ambiguous, and a lot of well-meaning people have been damaged by this. I know I'd be upset if I had been part of the Rust for Linux team and, when I actually wanted to get my code in, was told my contributions weren't welcome.
I wouldn't want to mix and match languages in a project that's so vital to really the entire world. It just seems like a good way for a funky rust bug to cause billions in issues...
Staying away from the emotional part of this blog post.
I was about to write a question asking why, if these downstreams are forked, it is such a big deal to be gatekeeping the upstream, and I think I got my answer from this:
"In fact, the Linux kernel development model is (perhaps paradoxically) designed to encourage upstreaming and punish downstream forks. While it is possible to just not care about upstream and maintain an outright hard fork, this is not a viable long-term solution (that’s how you get vendor Android kernel trees that die off in 2 years). The Asahi Linux downstream tree is continuously rebased on top of the latest upstream kernel, and that means that every extra patch we carry downstream increases our maintenance workload, sometimes significantly. "
Is it wrong for Linus to take the side of the kernel and not of the various distros? Serious question. I don't completely understand all of the background here.
"But it goes deeper than that: Kernel/Mesa policy states that upstream Mesa support for a GPU driver cannot be merged and enabled until the kernel side is ready for merge. This means that we also have to ship a Mesa fork to users. While our GPU driver is 99% upstreamed into Mesa, it is intentionally hard-disabled and we are not allowed to submit a change that would enable it until the kernel side lands. This, in practice, means that users cannot have GPU acceleration work together with container technologies (such as Docker/Podman, but also including things like Waydroid), since standard container images will ship upstream Mesa builds, which would not be compatible. We have a partial workaround for Flatpak, but all other container systems are out of luck. Due to all this and more, the difficulty of upstreaming to the Linux kernel is hurting our downstream users today."
I think we are dealing with a maintenance problem of downstream forks and trying to make their lives easier by convincing the kernel maintainers to accept the changes upstream.
Does Linux have a standards committee or some sort of design committee? I thought they had something to decide what goes in and what doesn't. If it doesn't, then is it necessarily gatekeeping? It seems like someone has to make the hard technical choices about whether something becomes part of Linux or not, and that is what Linus is doing.
I am trying to understand the real issue here. It seems like the difficulty in upstreaming changes to help the downstream folks is the issue not necessarily that the downstream folks are blocked.
I was thinking the same on this. It is a classic case of division of responsibility. The argument appears to be: should we force a small number of downstream maintainers to suffer a big maintenance burden, or should we make upstream handle a smaller one?
Martin's position rests on his claim that the big maintenance burden he would be forced to bear is unfair. I mean, he is the one who chose Rust. No one forced that on him. It is kind of hard for me to see his claim that the maintenance burden of his choice is somehow the responsibility of the upstream maintainers who are all C developers, no matter how small he insists such maintenance would be. My own opinion is that he made his bed, he needs to sleep in it. Or wait for the tides to change in his favor over Rust inclusion in the kernel. All of this crying foul just doesn't sit well with me.
Agree, these are not the real issues. He has some personal issues going on and is blaming this thing or things as the source of his problems which they are not. It seems he needs some emotional maturity and resilience too.
>Is it wrong for Linus to take the side of the kernel and not of the various distros? Serious question. I don't completely understand all of the background here.
Linus is pro-R4L. He wants to see Rust drivers in the kernel upstream, and has expressed frustration that it has been such a slow process, and that he doesn't understand why some maintainers turn it into a religious issue.
The problem is that he hasn't done much in the way of actually standing up against arbitrary stonewalling by those maintainers. This ensures that everyone gets pissed off and creates a giant mess of arguments. Rust people get pissed off because they were told their contributions were welcome and feel like their time investment is being wasted on bullshit, and C maintainers because the lack of a clear policy leads to ruminating and imaginations running wild.
> I cannot work with those who denounce calling out misbehavior on social media to thousands of followers, while themselves roasting people both on social media and on mailing lists with thousands of subscribers.
That person is someone called `Sima`, and their posts on Mastodon are pure gaslighting. These are the worst abusers.
>But then also came the entitled users. This time, it wasn’t about stealing games, it was about features. “When is Thunderbolt coming?” “Asahi is useless to me until I can use monitors over USB-C” “The battery life sucks compared to macOS” (nobody ever complained when compared to x86 laptops…) “I can’t even check my CPU temperature” (yes, I seriously got that one).
>And, of course, “When is M3/M4 support coming?”
This is awful framing. It isn't entitled to ask when something is happening or to say what makes something unsuitable for you. Marcan seems to take every single social media comment about Asahi Linux as a direct personal attack. No wonder he is burnt out; anyone with such a habit would be...
To be fair, nothing in human history has psychologically prepared us for semi-real-time two-way interaction with thousands or millions of people at once. Once I realized this, I began to become much more empathetic toward popular public figures who “crash out” (as the kids say these days) on social media. I've never experienced this myself, firsthand, but after seeing it happen time and again over the course of the past two decades or so—especially since the advent of smartphones, which dramatically increased the possible frequency of such communication—it's the logical conclusion I've reached.
I think if I was indicted by Linus and told I'm a problem over spreading awareness about a position on social media, I too would burn out pretty quickly. That's how you crush motivation. There's a deeper issue in open-source culture where harshness and gatekeeping drive away passionate contributors.
"spreading awareness about a position" isn't a very accurate way to describe what happened. This is the guy who said he wanted to use social media to create a "hall of shame" for kernel developers. Of course Linus told him to knock it off, that's ridiculously unprofessional behavior.
1. I think the DMA maintainer is correct. Don't intertwine implementation languages; that is a bad idea and a maintenance hell.
2. Social media "hall of shame"
3. Torvalds is forced to make a statement because of 2. Not 1.
"Behold, a Linux maintainer openly admitting to attempting to sabotage the entire Rust for Linux project (...) Personally, I would consider this grounds for removal of Christoph from the Linux project on Code of Conduct violation grounds, but sadly I doubt much will happen other than draining a lot of people's energy and will to continue the project until Linus says "fuck you" or something. (...)"
"Thinking of literally starting a Linux maintainer hall of shame. Not for public consumption, but to help new kernel contributors know what to expect. Every experienced kernel submitter has this in their head, maybe it should be finally written down."
"Okay I literally just started this privately and the first 3 names involved are all people named variants on "Christ". Maybe there's a pattern here... religion secretly trying to sabotage the Linux kernel behind the scenes???
Edit: /s because apparently some people need it."
The issue is that Linus put the Rust developers in an impossible position: on the one hand he approved Rust in the kernel, but he has never had the balls to enforce that decision.
Then, the fanatical C developers openly sabotage and work against all the Rust developers' efforts. So the last option for the Rust developers is to take it to social media. Otherwise, the C developers get away with creating a self-fulfilling prophecy: sabotage all Rust efforts, then claim the Rust experiment failed.
Linus didn't seem to ever have the time to actually take a stance, except of course on the social media issue. Fully ignoring all context. It's the equivalent of a school principal suspending the bullied victim for finally snapping and punching their bully.
The thread didn't really have drama before marcan stirred the pot. There was a disagreement, but the individuals pushing for the merge were not attempting to escalate, only try to find a path forward in a way that might make both parties happy with the compromise. The drama and social media outrage arguably did nothing to help, and as far as I can tell, simply makes for good entertainment for onlookers who like to gossip. While it would be nice to have Linus help out here with a clear resolution after escalation, it's clear to me that the behavior marcan displayed is the higher priority problem to address.
I think this isn't the right take. The "disagreement" was a kernel maintainer saying "Rust in the Linux kernel is a mistake and I will do everything in my power to sabotage Rust adoption" (as feedback on version 8 of a patch). The fact that open undermining of Rust for Linux receives no pushback from Linus or anyone else with power in the kernel is shocking.
100% this. Yes, Hector went nuclear, but he begged Linus to step in and provide leadership (either merge or reject) and instead Linus ignored the whole technical issue with regards to rust being totally blocked.
Even now with Hector out of the picture, there’s still no suitable path forward for rust in Linux. No wonder people are giving up (exactly what the blockers want).
>Even now with Hector out of the picture, there’s still no suitable path forward for rust in Linux
The suitable path forward is to submit the patch series like normal to Linus, where it will be merged regardless of CH's NACK. CH isn't able to actually stop this process, he's just being a jerk.
However, I agree with you that it would have been nice to actually publicly clarify the situation rather than ignore it and deal with the whole thing behind closed doors. It shouldn't need to be explained that letting this sort of thing fester is a great way to kill motivation in a project and ensure it's less likely that new people will get involved.
The reasoning Linus himself gives for greenlighting Rust is, among other things, to avoid stagnation.
In practice, CH is doing everything he can to stonewall and sabotage efforts to subsequently bring Rust to solve problems.
This explosion, now, was plainly a consequence of months and months of failure. And the fact that anyone resorts to name-calling, blaming that failure on “a drama-seeker”, makes me wonder about the future of Linux 20 years from now.
Counterpoint: prior to going nuclear, is there any evidence that Marcan directly tried to get Linus's attention, given the huge quantity of mailing-list mail Linus is sure to get every day?
I only see Danilo doing that in that thread. And admittedly Linus didn't respond (and Greg KH only minimally responded). But even CC probably means a lot of mail for top maintainers, and at that point I don't see anything that would've gotten in the way of "send a PR despite the Nacked-by", which has been done in the past.
He literally says in the post that he reached out to Linus directly and to this day hasn't gotten a response. He also spent years (trying to) upstream patches himself, usually getting similarly stonewalled.
I don't see the word "reach" or any relevant mention of "Linus" in either the "shaming" post or in the resignation post.
Even if there was, I'm not sure I trust the word of such a drama-seeker directly, so it's reasonable to ask for evidence of on-mailing-list appeals adding a CC (as Danilo did) and, if that fails, a mention of contacting Linus off-list in that specific subthread.
The Rust project is not stalled. There is pushback from maintainers, but as trust is established, things will hopefully change and it'll get easier. Escalating erodes trust and simply makes the process take longer. Anything that fuels us-vs-them narratives is simply damaging. Everyone needs to focus on the shared objectives of the overall Linux project. Ignoring the maintainers and pretending their opinions are wrong or don't matter won't help. I'm not sure a top-down directive is necessarily the right way to establish a healthy space for Rust in Linux.
I agree. Everyone here seems to be criticising Marcan for not being professional, but it’s very difficult to remain professional when the people you’re working with gloat in public that they intend to completely sabotage your work product despite it being given explicit support by the CEO. Why are you the only one criticising the coworkers? I feel like I’m taking crazy pills reading this thread.
Because this thread is about Marcan's behavior, not others'. I think it's perfectly fair to claim that everyone behaved badly in this situation, but the person I replied to was insinuating that Marcan didn't do anything wrong. That is not true, and why I highlighted his behavior.
> So, the last option for the Rust developers is to take it to social media.
Social media is an amplifier of interpersonal problems, not a place to seek resolution for them - unless your intended "resolution" is to beat down the other side, the people you have to work alongside by necessity, via potshots from random strangers who hardly ever bother to inform themselves fully of the situation. That is never going to be a true resolution, and I think Linus, for all his faults, recognizes that, and that's why he draws the line there.
The C maintainer in question had no power to stop the code from being merged, it wasn't in his directory. He was tagged as a courtesy in case he wanted to do a drive-by review since the code was wrapping his subsystem. The Rust code being reviewed wasn't written by marcan, and the other Rust developers called him out for taking the argument to social media when the code was likely going to be merged anyway (see https://lore.kernel.org/rust-for-linux/Z6OzgBYZNJPr_ZD1@phen... and https://lore.kernel.org/rust-for-linux/CAPM=9tzPR9wd=3Wbjnp-... ).
The fact is, you need buy-in from other devs, and if a dev won't buy in, you need to work out a way to avoid them or avoid conflict. It sucks, it slows things down, but frankly, making it a "them vs us" thing is a sure-fire way to make them oppose any change you want to make.
Public shaming is even more disastrous, as there's no better way to entrench someone in a position.
I'm not entirely convinced they meant to truly make a public hall of shame.
It sounded to me like a list of "friends who want to get more involved, I'll let you know who to avoid". Then, I read the interactions that sparked that post, and I could totally understand the frustration from OP's part.
Linus being unwilling to take a real stand on maintainers blocking Rust just because doesn't really help.
> However, I will say that the social media brigading just makes me not want to have anything at all to do with your approach.
> Because if we have issues in the kernel development model, then social media sure as hell isn't the solution. The same way it sure as hell wasn't the solution to politics.
To me, it sure sounds like Marcan is making the case that they tried other venues and didn't feel like it worked, so they resorted to using their social media following to shame kernel developers if they didn't stop.
But the point is that the Rust developers have tried literally everything else.
If the C developers make it a "Them vs Us" thing, there IS NO ALTERNATIVE for the Rust developers.
Linus' reaction is quite literally the equivalent of a parent only punishing the loudest child, not the child that's been silently bullying that kid for months.
Don't know what to tell you. The C developers have the keys to the kingdom. It's up to the Rust devs to appease them. When you are a newcomer to an old project, a big part of that is working with the current gatekeepers to get your changes through in a way they'll accept. That can sometimes mean doing things suboptimally in your view.
In particular, the DMA maintainer didn't want rust code in their DMA subsystem. That sucks, but it means you need to relocate your dma bridge code out of their subsystem. It does mean your driver will be a second-class citizen in the kernel tree (which was always going to be the case for rust).
Linus' reaction was to someone who started a public campaign against another kernel developer and tried to use that following to pressure the maintainers of the kernel to bend to the will of the newcomer. I'm sorry, but I'd also have a pretty negative reaction to that.
The workplace equivalent is you publishing a whistleblowing article against a team in your company because they wouldn't accept a pull request you worked very hard on. You don't do that. You handle things internally and privately, and sometimes you tell the boss "sorry, I can't get this done because another team is blocking the change and they are unwilling to work with me".
And do not mistake my post. I'm not siding with the C dev just because I'm critiquing the Rust dev. The guy sounds like he's too stuck in his ways. The problem is you don't get a big, well-working, long-running project like the kernel without these sorts of long-term maintainers who make the calls on what to reject.
>In particular, the DMA maintainer didn't want rust code in their DMA subsystem. That sucks, but it means you need to relocate your dma bridge code out of their subsystem
The code was never in the DMA subsystem. At no point was there ever any Rust code in the DMA subsystem.
CH didn't even look at the patch before throwing the wall up. When it was pointed out that the patch already was the way he claimed he wanted it, he came up with a second excuse. When that avenue was shut down too, he said he would do anything to stop Rust from being put in the kernel, period; he wouldn't work with any Rust developers, and he wouldn't accept adding a second maintainer for his subsystem to do that engagement either.
From that point it's pretty clear that all previous engagement was just in bad faith.
> The workplace equivalent is you publishing a whistle blowing article against a team in your company because they'd not accept a pull request you worked very hard on.
The workplace equivalent is your CEO making a public statement that your work is to be supported, then not firing people who openly gloat about their intent to sabotage your work.
I mean, I hope that you'd get fired for trying to publicly shame people who you see as trying to "sabotage" you in any normal corporation, regardless of how much your vision aligns with the CEO's lol.
Not that it even makes sense to call it sabotage. Most of the people involved in the original debate (on the Rust for Linux side) didn't see it that way, the normal kernel development processes were on their way to actually making the change happen anyway, and Marcan's actions probably did more to sabotage actual support from other maintainers and Linus himself than the original NACK that started all of this ever did.
(Not that Linus ever even gave a blank check to Rust in Linux, so I don't think that disagreements and even NACKs somehow go against what Linus decided.)
> If the C developers make it a "Them vs Us" thing, there IS NO ALTERNATIVE for the Rust developers.
There is always an alternative. Exit the project quietly and gracefully if Linus won't show proper leadership. Don't engage in poor behavior back at the C developers, that is just as wrong.
It's deeply ironic that he's complaining about kernel maintainers supposedly forming secret cliques.
However... this is the same man who made a sock puppet V-Tuber account and acts in every way like they are two people, even though he has accidentally shared the system username on-stream, revealed they have exactly the same kernel version, same KDE configuration, same login, and same time zone, and even (if I recall correctly) accidentally made GitHub commits as the other person once in a while. He also did this on the Linux kernel mailing lists, where he still maintains the charade.
Point out that's weird, or that it's weird for a maintainer to have a fake persona as a walking female stereotype, and you're the one he shreds and mocks, all while never denying it. For me, I caught on immediately when I saw the supposed "hijacking" of his stream on April Fool's Day, which was her first appearance, and I stopped donating. I don't pay people to support stereotypes about women in STEM.
Having an alter ego is one thing, but I strongly suspect that he had at least one sock puppet here during the drama with HN [0]:
* a brand new account suddenly appears, defending Marcan's behavior (the only comment/post ever of this account) with a very similar writing style
* Marcan immediately "notices" the new comment while doing a "random search" (how? he claims he doesn't browse HN, and even posted a screenshot of news.ycombinator.com being routed to 0.0.0.0 to block his own access to it the day before)
* Marcan highlights the comment in question on his social media account [1], praising them: "at least [this commenter] gets it"
Only circumstantial stuff, but sure smells very fishy to me.
Reminder that he did this on the Linux kernel mailing lists. If I were a Linux maintainer who found out that two of the people I'm talking to are, with high likelihood, actually the same person secretly maintaining a charade, I wouldn't be far from banning both.
I also certainly wouldn't take any of his complaints about cliques or brigading with any seriousness or self-reflection afterwards.
This subject also gets inexplicably downvoted and flagged every time you bring it up on Hacker News (look at that, it just happened to my original post); but again, nobody can prove otherwise, and Marcan himself has never denied it, only thrown flames.
Wow, what an uncharitable read. Are you aware of what that term means? He said it was not about literally shaming people, but showing what contributing to the kernel is like, and even clarified it wouldn't be for public consumption. It's a colloquialism for a resource where peers can learn from each other's mistakes. My high school Spanish class had a hall of shame.
It's an order of magnitude more professional than the extremely over-the-top and public emails that Linus shares, which HN jerks off over. I too would be burnt out if people were picking apart what I said so closely while clapping when Linus says "this code is retarded".
Yes, I'm aware of that quote, but it doesn't make sense to link it to the original one, because his intentions with the "hall of shame" are different from what that quote describes.
This message brings up a lot of valid complaints about talented developers being stonewalled, and you're homing in on one word that is not being used the way you think. Again, there are dozens of emails from Linus that are vastly more unprofessional than this.
My complaint is not that this maintainer would be charitable in their reads or should stay on the project, but that they are being examined unevenly because they are not one of the greybeards.
If you don't want a maintainer, that's fine, but claiming it has anything to do with professionalism is dumb when Linus's far harsher style is held up as communication to admire.
"hall of shame" inherently means it's about literally shaming people. If that isn't what he meant, then he shouldn't have used those words.
> It's a magnitude more professional than the extremely over the top and public emails that Linus shares
Since when do two wrongs make a right? I think it's perfectly fair to say Linus hasn't shown the best leadership here. But that doesn't excuse Marcan's behavior.
Brigading has no place in open source communities. There are some members of the Rust community who believe C is obsolete and that C programmers should either switch to Rust or get out of the way. This is an extremely toxic attitude that has no place in the Linux kernel!
The fact remains: Rust doesn’t solve all of C’s problems. It trades them off for a whole lot of new problems, many of which are challenging to address in a kernel development setting (and much less of a problem for userspace software).
This makes the “C is obsolete” position even harder to defend and ignoring the concerns of long-term kernel maintainers is not going to get anywhere! I think these folks ought to learn the lesson of Chesterton’s Fence [1] before continuing on their journey to promote Rust, which does a lot of great things!
> Brigading has no place in open source communities.
Agreed.
> There are some members of the Rust community who believe C is obsolete and that C programmers should either switch to Rust or get out of the way. This is an extremely toxic attitude that has no place in the Linux kernel!
Would you care to share some examples of the Rust for Linux community saying this? I'm unaware of Hector or anyone else saying anything similar. Or is this just a fear of yours?
I think we should be very clear -- believing the future of systems programming is mostly memory safe isn't the same thing as saying "C programmers should...get out of the way".
I didn't say Rust for Linux community, I said Rust community. Here's an example [1]. You don't have to search online forums and mailing lists very long to find countless others like this.
The problem with the brigading (which has been done by the Rust for Linux community) is that it invites these zealots into the conversation. It's totally inappropriate and not at all constructive towards a compromise.
Plus the stated goal of Rust for Linux is to enable people to write drivers in Rust, not to rewrite the whole kernel in Rust. Yet there are countless people in the wider Rust community that believe Rust is the future and every line of C code still in use should be rewritten in Rust. It's gotten so prominent that "Rewrite it in Rust" has become a meme at this point [2]. There are now many developers in other languages (C and C++ especially) who reject Rust simply because they don't like the community.
> You don't have to search online forums and mailing lists very long to find countless others like this.
So -- you're bothered by people on the internet, but not specifically the Rust for Linux people or the Rust project people? I guess -- I'm sorry people are saying mean things about a programming language on the internet?
There are also just as many (more!) anti-Rust partisans out there, who say plenty of crazy stuff of their own. I'm not sure there is much to be done about it.
> Yet there are countless people in the wider Rust community that believe Rust is the future and every line of C code still in use should be rewritten in Rust.
So what? Does your C code still run? I'm struggling to understand what the problem is. People are free to think whatever they want, and if they want to rewrite things in Rust or Swift or Hylo or Zig or Java, that's how many of them learn!
Yes, they're free to rewrite their own projects in Rust. They aren't free to force others to do the same to their projects. That's what this is all about: a prominent R4L community leader tried to use brigading and shaming to force a Linux kernel maintainer into accepting and maintaining Rust code (along with the entire toolchain to support it). The maintainer refused, Linus got involved, and marcan stormed out of the room.
This isn't a debate about technical merits. It's a debate about maturity and what's appropriate for collaborating with others (and what's not). The Rust community has been going through a lot of growing pains over this issue for a while now.
>Yes, they're free to rewrite their own projects in Rust. They aren't free to force others to do the same to their projects. That's what this is all about: a prominent R4L community leader tried to use brigading and shaming to force a Linux kernel maintainer into accepting and maintaining Rust code (along with the entire toolchain to support it).
Nobody tried to force Christoph into accepting or maintaining Rust code. This was stated repeatedly.
I don't see how you can possibly have actually read the discussion and come to this conclusion. At this point you're just making false accusations and contributing to the flamewar.
They're offering to maintain it themselves but that's not good enough for long-term maintainers. It's like when a teenager brings home a puppy and promises to take care of it. The parents know that they will be the ones looking after it eventually.
I wish I knew of a less condescending analogy but I think it gets the point across. The list of former kernel maintainers is extremely long. Anyone who leaves the project, as marcan did, leaves all of their code for someone else to maintain. This is not a problem for drivers which can be left orphaned. For all other code it is a problem!
You're imposing your own rationales on top of CH, not expressing his own.
He expressed complete opposition to having Rust anywhere in the kernel at all, including places he doesn't maintain. He was opposed to having any other maintainer deal with Rust for him, even though Robin Murphy (who is already a reviewer on the DMA side) expressed willingness to do so. His initial replies were an exercise in goal-post moving.
The kernel is not CH's project. It's not his call to reject things he doesn't like anywhere in the kernel, including places he doesn't personally maintain.
Since Linus backed him up on this issue I’m left with the impression that Christoph is not a lone maintainer standing in the way of the inevitable march of progress; that his concerns are valid and shared by the founder and leader of the project and represent the views of other maintainers who preferred not to step into the ring on this debate.
Furthermore, the Rust code depends on his C DMA code. That automatically makes it Christoph's problem when something breaks, regardless of how many R4L maintainers come and go from the project.
If you're forking the Linux kernel then it becomes your own project, de facto, since you're taking over maintenance of the fork. You're free to rewrite it in Rust when you do that!
Where is anyone forcing anyone else to do a rewrite in Rust?
When Hellwig likened the R4L project to a cancer, he was implying exactly this. He saw this one patch as a Trojan horse (in the original Greek sense, not in the computer virus sense) to get Rust into the main kernel tree. This brings all of the toolchain and language issues with it. By relegating Rust to drivers only, the kernel maintainers avoid the issue of having to maintain a cross-language codebase and toolchain, whether they like it or not.
Being a maintainer of a project that accepts patches from contributors is like operating an orphanage. Allowing anyone to just drop off their unwanted babies results in an unmaintainable nightmare. You can say that the Rust for Linux team have been acting in good faith but the very public actions of one of their (now former) leaders contradicts this. The stated goal of the project was to allow drivers to be written in Rust. Adding Rust bindings to the kernel oversteps that goal. It's a legitimate concern.
> The stated goal of the project was to allow drivers to be written in Rust. Adding Rust bindings to the kernel oversteps that goal. It's a legitimate concern.
You do recognize that all drivers will need to bind to some C interfaces? So -- your argument (or the argument you suppose Hellwig has) is that it is better that each driver author recreate each such interface for themselves? Now, when these interfaces break as a result of a change in the underlying C code, instead of fixing that breakage at possibly a single place, that one true binding, now a maintainer might have to fix that breakage in a dozen such places? And this is preferable? This will cause less work for the overburdened maintainer?
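To make that maintenance argument concrete, here is a minimal sketch of the "one true binding" idea (all names here are hypothetical; this is not the actual kernel patch). One shared Rust module binds the C interface once, and every Rust driver goes through it, so a change on the C side is fixed in one place rather than in each driver:

    // Hypothetical shared binding module ("the one true binding").
    // If the C prototype changes, only this module needs updating;
    // the drivers below are untouched.
    mod dma_binding {
        extern "C" {
            // Assumed C interface; a stand-in for the real DMA API.
            fn c_dma_alloc(len: usize) -> *mut u8;
            fn c_dma_free(ptr: *mut u8, len: usize);
        }

        pub struct DmaBuffer {
            ptr: *mut u8,
            len: usize,
        }

        impl DmaBuffer {
            pub fn alloc(len: usize) -> Option<DmaBuffer> {
                let ptr = unsafe { c_dma_alloc(len) };
                if ptr.is_null() { None } else { Some(DmaBuffer { ptr, len }) }
            }
        }

        impl Drop for DmaBuffer {
            fn drop(&mut self) {
                unsafe { c_dma_free(self.ptr, self.len) }
            }
        }
    }

    // Two hypothetical drivers sharing the binding. The alternative being
    // argued against is each driver re-declaring the extern "C" block
    // itself, multiplying the places a C-side change can break.
    fn driver_a_init() -> Option<()> {
        let _buf = dma_binding::DmaBuffer::alloc(4096)?;
        Some(())
    }

    fn driver_b_init() -> Option<()> {
        let _buf = dma_binding::DmaBuffer::alloc(8192)?;
        Some(())
    }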
You are aware this patch introduced no code into the main kernel tree?
It doesn't have to. By becoming a single point of failure for all Rust drivers that depend on it, it becomes the responsibility of all maintainers of the kernel to avoid breaking it when they change the C interfaces. It's a foothold into a world where all kernel maintainers need to run and test Rust builds, something Christoph does not want the headache of dealing with.
When your teenager brings home a puppy and promises you he'll never let the puppy leave his room, you know that's not true and it won't be long before you're the one taking care of it.
Ultimately it's about motivations. Long-term kernel maintainers are motivated to protect and promote the kernel as a maintainable and successful project. R4L developers, on the other hand, seem more interested in promoting Rust than promoting Linux.
>> You are aware this patch introduced no code into the main kernel tree?
> It doesn't have to.
Ah, it's one of those other kinds of Trojan horses that don't enter the city walls.
> By becoming a single point of failure for all Rust drivers that depend on it, it becomes the responsibility of all maintainers of the kernel to avoid breaking it when they change the C interfaces.
So -- I'll ask what the Rust for Linux people asked Hellwig -- what is your suggested alternative? Where do we go from here? Is it that Rust drivers should never be allowed to bind to common interfaces? In that case, what are the Rust for Linux team doing?
Or is it that you would like Linus to rethink his decision re: adding Rust to the kernel? And if so, why didn't Hellwig make that case directly to Linus? What's with all this performative bellyaching on the LKML?
> Ah, it's one of those other kinds of Trojan horses that don't enter the city walls.
The kind that have to be invited in, yes.
> So -- I'll ask what the Rust for Linux people asked Hellwig -- what is your suggested alternative? Where do we go from here? Is it that Rust drivers should never be allowed to bind to common interfaces? In that case, what are the Rust for Linux team doing?
That's not the kernel team's problem. They provide a common C interface. The fact that there's an impedance mismatch with binding to them from Rust code is a Rust problem.
> Or is it that you would like Linus to rethink his decision re: adding Rust to the kernel? And if so, why didn't Hellwig make that case directly to Linus? What's with all this performative bellyaching on the LKML?
I don't know what Linus's goals are, apart from keeping his maintainers happy and keeping the kernel rolling along smoothly. That's not a small thing. From what I can see, Christoph has been a maintainer for over 25 years.
Does Linus want to have his cake and eat it too? Sure. But I think he earned that right by building Linux into what it is today. The R4L team hasn't paid their dues. As someone else mentioned, it took 10 years for Clang to become a supported compiler for the kernel.
>Would you care to share some examples of the Rust for Linux community who have said this? I'm unaware of Hector or anyone else saying anything similar?
In fact, he said that as his very first reply to that thread:
>Everything else is distractions orchestrated by a subset of saboteur maintainers who are trying to demoralize you until you give up, because they know they're going to be on the losing side of history sooner or later. No amount of sabotage from old entrenched maintainers is going to stop the world from moving forward towards memory-safe languages.
> In fact, he said that as his very first reply to that thread:
I think it's clear from the surrounding context that you are likely over-interpreting some of Hector's comments.
What is the losing side of history here? There is simply too much C code in the Linux project to say "stop this ride, I want to get off and only use Rust" right now. This is a fight about some new code: Rust drivers in the kernel, and perhaps, in the future, Rust in other places where it makes sense. I believe Hector is arguing Rust drivers are inevitable, because they are already here!
What did I say above:
> I think we should be very clear -- believing the future of systems programming is mostly memory safe isn't the same thing as saying "C programmers should...get out of the way".
As I read it, "the losing side of history" refers to insisting on using C, possibly at all. The last part about the "world moving forward towards memory-safe languages" doesn't suggest a limited scope for the statement.
The thread was not about Rust drivers; it was about adding Rust code to the DMA module, i.e., mixing two different languages in a single module, which requires being knowledgeable in both languages to maintain it, making the module less maintainable. In fact, a few developers were saying that they didn't mind Rust drivers if they used the C ABI as-is. Someone wanted to expose new Rust-specific interfaces to support cleaner abstractions for Rust drivers.
> The thread was not about Rust drivers, it was about adding Rust code to the DMA module. I.e. about mixing two different languages in a single module
AFAIK this is false. The patch was CCed to the maintainer as an FYI, but all the code was in a Rust module binding to the C DMA interface. If I'm wrong, show me the code.
>I'm just going by what was mentioned in the thread. If that interpretation is wrong, the thread makes no sense.
You've now discovered why this blew up in the first place. All of the excuses used to reject the code were not just petty but also outright false, and trivially so.
My impression is the average Zig programmer is more interested in making a better Linux than trying to prove Zig can be used in Linux.
There are _already_ dozens of hobby OS projects and embedded groups doing systems work in Zig. Everyone knows Zig is a systems language. Its community doesn't have a chip on its shoulder.
Aside from hatred of Rust and Rust developers there is a bigger problem. The Rust guys are twisting the C developers' arms to iron out API semantics because there is so much behavior and API usage that can't be defined in C and it's driving C devs insane. The Rust people are doing the right thing but doing the right thing is extremely annoying for the C devs.
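To illustrate with a toy example (all names hypothetical, not taken from the kernel): a C prototype alone says nothing about whether a returned pointer can be NULL or who must release it, while a Rust wrapper has to commit to an answer. This is exactly the kind of semantics that binding work forces out into the open:

    #![allow(non_camel_case_types)]
    use core::ptr::NonNull;

    // Hypothetical C API: `struct buf *get_buf(int id);`
    // The header alone doesn't say whether the result may be NULL,
    // or whether the caller is responsible for releasing it.
    #[repr(C)]
    pub struct buf {
        _opaque: [u8; 0], // opaque C type, never constructed from Rust
    }

    extern "C" {
        fn get_buf(id: i32) -> *mut buf;
        fn put_buf(p: *mut buf); // assumed release function
    }

    // Owned handle: the wrapper pins down "may be NULL on failure,
    // caller must release on success" as an explicit contract.
    pub struct Buf(NonNull<buf>);

    impl Buf {
        pub fn get(id: i32) -> Option<Buf> {
            // SAFETY: assumes get_buf returns NULL or a valid pointer owned
            // by the caller -- the invariant the C side is asked to confirm.
            NonNull::new(unsafe { get_buf(id) }).map(Buf)
        }
    }

    impl Drop for Buf {
        fn drop(&mut self) {
            // SAFETY: we own the pointer per the contract above.
            unsafe { put_buf(self.0.as_ptr()) }
        }
    }

Whether the C function may return NULL, and who frees the result, are questions the C code can leave implicit for decades; the wrapper cannot.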
The last drama wasn't about the guy not liking drama; he just doesn't like, nor want to maintain, a codebase with two languages. But I think they really need to say that directly, instead of calling it cancer and leaving people to think he was calling Rust itself cancer rather than a multi-language codebase.
I'm pretty sure he specifically said the cancer was trying to interoperate two languages in one codebase, not Rust. He even went on to say that he thinks Rust is good and recommends people implement new greenfield projects in it.
> What does social media have to do with bad code, though?
Nothing. That's why this was said:
>> *There's a deeper issue* in open-source culture where harshness and gatekeeping drive away passionate contributors.
The gatekeeping is a separate issue.
I entertained getting involved in the kernel for about 3 days, in college. The process is so complex, I just went on to do other fun things. The drama will turn off others. Organizational apathy is much worse, imo. I have quit jobs for this reason and that was when I got paid to stay.
I disagree. Whipping up a mob on social media will not endear you to anyone. You're just making the gatekeepers' jobs harder, and since their job is hard enough, they will simply gatekeep you to simplify things.
Regardless of whether you think the project should be maintained differently, that's not your call, that's their call. Fork it if you want different policies.
Isn't that also what Linus is doing, but on a professional forum, which is even worse? The issue comes down to de-escalation, and there wasn't enough of it on either side. It's also not unreasonable to expect more from a figurehead who is a role model for open-source development in general.
A maintainer's job is to keep contributors on track and in line so the project moves in the right direction, and he did so on the forum in which it's supposed to happen. Not sure what the issue is.
Linus said no brigading. Hector resigns twice, and the second time, despite saying he wouldn't elaborate on Rust vs Linux, proceeds to blame Linus and start another social media brigade.
Your comment has nothing to do with someone trying to ironically gatekeep me from commenting on Linus' public behavior in response to my comment about gatekeeping.
Not to mention stalkers? It doesn't matter how much you love a community; one or two psychopaths can make it simply not worth it.
It's hard enough in physical spaces to remove abusers (usually the abused just stop showing up); I can't imagine there's an answer for preventing this kind of behavior in online spaces.
If you want your code merged in the kernel, you have to think about things from Linus' perspective. You cannot in any circumstances try to shame someone into adopting an enormous and unsustainable workload.
The linked article doesn't say the submitted driver code was awful. In fact, it says Paragon submitted the driver after Linus suggested they submit it.
What the article quotes Linus complaining about is a process issue. Paragon apparently used GitHub's GUI to merge some of their branches rather than the git CLI. Linus would prefer they use the CLI to merge branches because the GitHub GUI reportedly omits important metadata from merge commits, such as the developer email, and encourages uninformative commit messages.
As that kernel maintainer clearly stated this was not because the code was awful, but because the code was written in Rust and it was therefore cancer.
From the horse's mouth (lkml; Hellwig's headers chopped for brevity):
On Thu, Jan 16, 2025 at 02:17:24PM +0100, Danilo Krummrich wrote:
> Since there hasn't been a reply so far, I assume that we're good with
> maintaining the DMA Rust abstractions separately.
No, I'm not. This was an explicit:
Nacked-by: Christoph Hellwig <hch@lst.de>
And I also do not want another maintainer. If you want to make Linux
impossible to maintain due to a cross-language codebase do that in
your driver so that you have to do it instead of spreading this cancer
to core subsystems. (where this cancer explicitly is a cross-language
codebase and not rust itself, just to escape the flameware brigade).
---
Hellwig was abrasive and unreasonable. But there is no need to perpetuate, repeat, and repost absolutely one-sided, self-serving misrepresentations of the words he used.
You don't need to paraphrase. You don't need to guess. You don't need to distill or simplify.
He wrote English so we could read it; stop paraphrasing. It's unhelpful at best and nefarious at worst.
Edit: I think it's very telling that there is a crowd here that would literally downvote the actual quote. Actually it's more sad than anything.
You're right, it's mostly any open source project too. I've tried to contribute to projects where they ignore my merge request, then take my code and merge it in under their own account. I call this behavior "demonstrating the moat": the maintainers are so concerned with maintaining a moat around their project that they actively go out of their way to prevent contribution, under the guise that your contribution did not correctly cross the moat. Even if the moat is mostly decorative and ceremonial.
Then turn off merge requests if you don’t want your project accepting contributions. Remove your CONTRIBUTING.md. Stop being welcoming if all you want to do is show off your source code. Don’t have a document explaining that I need to sign an agreement to contribute.
Two HUGE roadblocks were self-inflicted: adopting a hardware platform whose vendor doesn't care about external development (Apple Silicon), and a programming language that had almost zero support in the Linux kernel (making this the first real attempt at multi-language Linux kernel development).
The odds were set against the Asahi Linux project from the beginning.
Totally agreed on the hardware platform piece. I've owned Macs since the 90s, first installed Linux on one around 2003, and have tried it at some point on every Mac I've had since, but I switched to a desktop Linux machine full time in 2020. I'm generally not one to crap on ambition, but Linux is just so much more enjoyable on hardware that isn't totally closed/undocumented/hostile/unsupported by the manufacturer. With the Macs there was always something slightly unsupported, regularly broken by updates, or otherwise not quite right, and that was before the architecture switch.
Apple makes great hardware (I have an M1 laptop I use away from home), but if I'm intending to run Linux as my primary OS, I'm buying from a company that is more open to it.
I have no dog in the Rust-for-Linux fight, but it seems fairly obvious that is the major reason for the burnout here.
The financial situation sucks. I just threw a small donation their way, but funding a project of this scale purely from end users is rarely a viable long-term solution... I feel like they need to find some high-level corporate sponsors.
My best to Hector, what he managed to pull off with the other Asahi developers is remarkable.
Drama needs to be worked against, not expanded upon.
A lot of what I’m reading seems to make me feel that drama was… not avoided by this person, putting it charitably.
And I'm someone who I believe still sponsors them! Asahi Linux is an awesome, and dare I say necessary, project.
There is value in learning how to relinquish a constant "defensive posture" mentally. (I have struggled with, and am still working through this, personally, btw.) Heading a project like this surely challenges everyone's stoicism, though.
Agreed. There's such a victim mentality throughout this post.
While it's clear that marcan ran into real headwinds, they're also definitely not somebody I'd want around me in any kind of leadership position.
You're not concerned about somebody who incites the internet hate mob against you when they don't get their way in a technical/leadership disagreement?
The amount of drama that this has created for Linux when the entire situation was being handled in a non-dramatic way is staggering.
At its root, nothing about the entire situation had anything to do with marcan anyway and yet somehow he has focused an enormous amount of attention on himself and negative opinions at those he disagreed with. He habitually does things that draw attention to himself (in situations that aren't about him specifically) and then points to _any_ form of criticism he receives and cries harassment.
Of course anyone reasonable would never want to work with him.
> nothing about the entire situation had anything to do with marcan anyway
I agree with you, but just a point of clarification: Martin was writing Rust drivers for the Asahi Linux project. Those drivers would use the Rust API wrappers that were the subject of this debacle. If they were included (as he wanted), his life would have been easier; if they were rejected, his own driver would become more complicated. So he had a legitimate stake in this, and when he suspected it wasn't going the way he wanted, he lashed out.
The irony is, if he hadn't blown this all up, the change could very well have snuck in under the radar.
Avoiding drama is a two-way street. I'm not saying marcan wasn't at least somewhat at fault here, but the reason the drama happened in the first place is a leadership crisis and Linux folks calling Rust a cancerous religion without any pushback. Might be valuable to resolve these problems instead of blaming a scapegoat, don't you think?
marcan wasn't even involved directly in the leadership crisis and made himself the martyr/scapegoat.
That's the entire problem.
He threw a shitfit after the situation was already being handled (not ideally, but handled), got the slightest bit of pushback from Linus, and then threw all of his toys away, with maximum publicity.
I don't think it's the first time, nor the last, we'll see maintainers and otherwise really great people being turned off from FOSS because of the massive weight that entitled outsiders and toxic "collaborators" seem to add.
So on that: are there any guides for FOSS maintainers out there about how to deal with the emotional toll of FOSS, with a focus on self-care, how to say "no", and generally the people/human part of FOSS, without focusing on the technical details?
We have a ton of guides from companies and individuals how to make the technical, infrastructure, product, software parts work, but I don't remember ever seeing a guide how to deal with emotions one might feel when doing this sort of work.
I think I'm lucky that it's relatively easy for me to tell people to fuck off when I find them not contributing or being toxic, but judging by the number of people who feel a real emotional toll, bordering on burnout, while working on their dream projects, this doesn't seem to come as easily to everyone, so having a guide/manual for this would be amazing.
I have a few projects that I've abandoned because they only made sense as a FOSS project, and I saw how FOSS maintainers were being treated. I love FOSS, I love the philosophy, and I would love to "give back" one day by making the ecosystem richer, but I've not yet found a project that I love enough to be abused over.
If Marcan (Hector Martin) is serious about quitting Linux development, he will stop contributing to the project using his Asahi Lina persona as well. Until then this is just empty posturing trying to elicit a reaction from the community.
Is he really still pretending that his vtuber persona is a different person? That makes this way more of an embarrassing attention-grab than it already was.
"we were still stuck without DP Alt Mode (a feature which required deep reverse engineering, debugging, and kernel surgery to pull off, and which, if it were to be implemented properly and robustly, would require a major refactor of certain kernel subsystems or perhaps even the introduction of an entirely new subsystem)."
Few maintainers care about the platform in question (to whom it's more a curiosity like maybe 68k), and don't have the hardware to test any submitted patches. It's painful to have to accept code that you can't test (though it may be common in certain parts of the kernel). It's painful to see a bunch of changes for just one random feature on a random platform. It's unclear how the code will affect other platforms, etc.
Now throw in some controversial stuff. The vendor of the platform is Apple and some patches are written in Rust... oh em gee!
Are you saying that PCs do not usually have the ability to plug into a monitor, to charge, and to connect to a USB hub for the rest of your devices from a single USB-C port?
You guys still plug three cables each time you sit at a desk?
It's not ever coming to Apple Silicon on Linux since, post-Thunderspy, Thunderbolt is dangerous to implement even in the best of circumstances. You'd have to reverse-engineer and update Apple's IOMMU, write software drivers for the port since it doesn't have firmware, and test it across a variety of vulnerable devices to see how secure it is.
That sentence carries a lot of weight. How many millions of users are left in the dust? Last time I touched a PC, the USB-C port could charge the laptop, unless the battery was empty; then only the barrel plug could be used. It. was. infuriating.
> Few maintainers care about the platform in question (to whom it's more a curiosity [...]) and don't have the hardware to test any submitted patches.
Yeah, from some very brief research I could only find a singular Linux kernel developer/maintainer/creator who said[0] "I'd absolutely love to have [the new 2020 Air], if it just ran Linux".
Who knows if that one person has even used Apple hardware before or has access to the necessary hardware to put toward a practical use such as a "development platform" while travelling, or, "doing test builds and boots and now the actual [Linux kernel] release tagging"[1][2], let alone be supportive of experimenting with Rust in the Linux kernel[3].
The history of Linux demonstrates the project doesn't have the resources to go chasing support for a hardware platform just because Linus cares about the platform in question...
...even if at least "173 people, and many more" "contributed both to the Linux kernel side but also to the upstream Rust side to support the kernel's needs" in the initial merge[3].
One thing that I notice often - people just do not take 5 minutes to appreciate someone or send 5 dollars to a project that they use daily etc. Not just in the software world, but life in general. Someone I know works full time, takes care of her two kids, takes care of her husband. Kids are almost teens now, but she gets zero support from them or her husband. Or even an occasional thanks. I see this everywhere, this is just one example.
Are we so busy or egoistic or ignorant that we can't stop and say thanks? What is even worse is the entitlement. People who wouldn't lift a finger to help anyone (even their own families) are usually the loudest and the most entitled ones.
I don't know if this is the case around the world (probably is?) and I don't know what the solution is. It just sucks
> Kids are almost teens now, but she gets zero support from them or her husband
This is a good analogy. Children are the people who they've been raised to become, so it stands to reason that people will give money and appreciation in the same ways that these things are originally given to them. These things are social constructs; we are inevitably taught how to use them by way of social dynamics. This is all to say that people love to support their darlings... but they've been socially conditioned to expect reciprocity in all transactions. That's how the sausage is made in content-based monetization -- you produce the actual product at a loss and then try to claw it back selling high-margin merchandise that nobody'd ever buy otherwise. The merch acts as social permission to finally do your part and pay the creators.
To risk stating the obvious: this is not a good thing and I think the majority of people would likewise agree. People should be fairly rewarded for their work and we should desire a culture which openly and freely encourages doing so. Culture, however, reflects society. The society we've created is transactional, so that's how people frame the spending of their money and efforts -- indeed, "spending" and "transaction" are practically interchangeable in our collective lexicon. Effort isn't strictly scarce in the same way eggs are, however, so we fail to value it.
> I don't know if this is the case around the world (probably is?) and I don't know what the solution is. It just sucks
It's not a total disaster... so we'll undoubtedly continue to ignore the cracks in the foundation. Martin still got paid good money for his efforts, and it was good for him for a time. That podcast you like will sell enough t-shirts and get the rent paid on time, at least for a little while longer. This seems to be about as good as we've collectively agreed to make the world for the time being. A local maximum, so to speak: we've gotten stuck asking for more when less might do better. With a bit of luck and effort, however, we can still catch that pendulum when it eventually begins swinging in the other direction. That's my hope, anyway! For the time being I try to do the things I'd like to see become normal in a more decent world -- sharing generously, paying for the things I like, etc. -- because hopeless accelerationism is for chumps.
I'm sad to see this happen. I bought an M2 Air a year ago and I've been running Asahi on it since the day I got it. It hasn't been without hitches, but that was something I signed up for.
I'm incredibly grateful for all the work Hector and the others have done on this project. The Air is my dream hardware (I'm a sucker for sleek fanless laptops) and getting to run Linux on it is quite amazing.
With all due respect to the effort... aren't projects built around unsupported ways to use someone else's products doomed from the very start? Isn't this race you can't win a recipe for imminent burnout? I don't even mean that Apple has, in the best case, no interest in this project; why would Linux maintainers want to accept changes that are necessary to support such a marginal use case?
Stuff like this is fairly common. Drivers for the Wiimote and other Nintendo controllers were not put in the kernel by Nintendo employees or with their blessing. Sometimes you want something to work and the only way is to handle it yourself.
First of all, yay Asahi — one of the great modern hacking / hacker stories in my opinion.
Second, what's the drama? I read the blog, and I'm guessing that on top of being burned out, which sucks, Marcan didn't like a kernel developer using the phrase "we are the thin blue line", which implies Marcan is politically liberal, in the US sense. He then says he may have been toxic on Mastodon, which might have gotten him secretly canceled?
All that said, I found his assessment of downstream vs. upstream economics (if you can't successfully upstream, you're doomed to rebase off a massive patch list) pretty interesting. I think the way it is now is the only way that's good for users: if downstream forks could somehow maintain viability longer term, we would be splitting, e.g., security budgets, performance budgets, etc. I get that it sucks for a group working to upstream, and I am in no way shocked to hear personal politics plays some role in success upstreaming (open source is largely a personal/social capital economy). All that said, hopefully the new Asahi maintainers can work well across what seem like ideological bounds. Maybe?
My understanding from afar is that a Rust dev wanted to interact with C code in a way that the C maintainer didn't like. This led to the C dev saying that while he likes Rust, he believes multi-language codebases are cancer and would stonewall all Rust code that touches his code.
Marcan watched it unfold on the mailing list wanted Linus to step in and force the C dev to play nice. Since nothing happened, he went to social media and lashed out as a last resort. That's when Linus finally chimed in and pretty much said "you might be right, but this isn't the way to handle things".
> My understanding from afar is that a Rust dev wanted to interact with C code in a way that the C maintainer didn't like.
He didn't want any Rust code at all touching his turf. He outright NACKed it without any technical reason and refused to negotiate, even when the Rust team offered to take full responsibility for any Rust build failures caused by breaking changes in the C code.
> This led to the C dev saying that while he likes Rust, he believes multi-language codebases are cancer and would stonewall all Rust code that touches his code.
I believe that was an attempt at damage control to save face, and that he actually meant to call Rust itself "cancer".
Anywhere really, it's just the phrase 'there's a thin [or fine] line between...' modified to say the police are that line, between lawful order & disorder.
Apparently it's political in the US, I have no idea, but as I understand it the maintainer just means 'I am here reviewing the change to keep the kernel in good order'.
The maintainer might mean that, but words have meaning. That particular phrase is overly charged and carries a specific connotation surrounding the idea that police are the sole line keeping society in shape.
It’s a poor choice of words for such (relatively) public communication.
These words have the meaning you're implying to at least some people in the USA; it was new to me.
I don't know his full biography (he seems to be Chinese-born and went to MIT), but he signs off with 'Cheers', and I think it's a reasonable possibility that he doesn't intend whatever politically charged US meaning the phrase carries.
You're, with your US perspective, saying 'hey words have meaning you know, don't downplay murdering homosexuals' while millions^ of people smoke fags in the UK every day.
(^probably? Maybe not any more, a lot of fag-smoking relative to murder at any rate.)
Thank you for pointing out the blue line comment; I didn't grasp it and read on. If you are right, I'm saddened to see this type of politics being played in kernel development, especially after the Russian maintainer incident.
Well written. I think the accusations of drama-hunting are unfounded. What else are you supposed to do in the face of persistent "non-technical nonsense"? Linus doesn't seem to care.
Definitely a shame. I wonder if it would be in Apple's interests to actively support Linux on Mac. It would make Macs more attractive as developer machines, and I don't see how it would disadvantage them.
That's not really Apple's game anymore. They don't really care about selling Macs; what they want to sell you is their ecosystem. That way you'll get a Mac and an Apple Watch and an iPhone and whatever else they push out, once they get their hooks in and lock you in.
Things don't happen instantly. You could have doubted it was in Microsoft's interest to support Linux within Windows for years... until they released WSL.
Windows machines already support Linux on the hardware level. Getting an Intel or AMD CPU to virtualize a Linux kernel is simple. Worst case scenario is an Nvidia GPU, but since you only need the compute drivers in WSL you avoid Nvidia's graphical issues. There isn't any work required since the OEMs already did it, the only thing required is Hyper-V.
Apple is selling a custom CPU core that has no driver support for anything but XNU with a BSD userland. It doesn't support UEFI, it depends on Devicetree bindings and would demand constant updating and support to render a "first class" Linux experience. Once again, anyone with a protracted interest in staying supported by upstream Linux should not be using a Mac and praying the community cares enough to make it good. Apple knows it's a novelty, and they're not going to take it seriously because that's just what they do. MacOS and the App Store is profitable, Linux is not.
The Rust criticism is valid: hidden "breaking changes" have been a hallmark of Rust releases, and anyone maintaining a serious codebase in Rust needs to really be on top of testing and validation if they want to stay current -- more so than with any other language in the same stable as Rust.
You can't, however, claim the right to refuse PRs and code changes in the name of maintenance while simultaneously claiming that the Rust community is "actively hostile to a second rust compiler implementation" -- you can't have it both ways.
The entire narrative is very indicative of the state of open source unfortunately - incredibly adept programmers getting chewed up by the externalities of maintaining code (and by extension an application). Sometimes it's unappreciative users asking for handouts and sometimes it's other developers (or an organization's development resources) causing contention for a project's trajectory.
I think the entirety of it can be summed up as: forking is there for a reason.
Perhaps it is no longer realistic to push such a huge changeset into Linux. Could this be solved with some hypervisor layer? That is, a hypervisor doing most of the work (in Rust) and a small support layer upstreamed into the kernel? Of course, no actual virtualization is even necessary; just some kind of ABI to the kernel running underneath.
His two major complaints are demanding consumers and the rejection of upstreaming his Rust driver patches. He has some legitimate complaints, since users depend on kernel features like Mesa integration and Docker support for GPUs, which are critical now that people are training and doing inference.
I still think the best outcome is to fork and recruit some lieutenants for community management. To me, the community is losing a lot with his departure. His complaints are legitimate, and hopefully the Linux kernel team can better accommodate his patches. Many distros and corporations derive tremendous value from their forks, and it's a better solution than quitting.
In an effort like Linux it's difficult for everyone to have their way - there are many "ways" but only one kernel.
I have had to do maintenance on a distributed filesystem driver at one point. This was outside the kernel. I can see why no kernel maintainer would have wanted to look after it even if it was open sourced because the file server was a mixture of C++ and an interpreted language and working out if you'd broken the driver somehow was a miserable job. You would need obscure expertise to understand it all.
Anyone with a good idea can still fork Linux. If their idea is so great it may end up that their branch gets adopted - if they bother to maintain it.
Supporting and developing Rust is a nice-to-have, but too often its proponents try to force it deep inside important stacks built in other languages, as with Linux, or with what is going on with major things in Python as well.
Here we see almost a form of blackmail: the claim that the Linux community is not nice and will die if it doesn't make Rust support core and mandatory.
My point is: if you like Rust and Rust is so nice, just go all-in, build your own kernel, build your own stuff, and we will see the result in the end. But don't ruin existing good stuff that was working well on its own.
I found it interesting, because what the writer calls "entitled users" I would call "honest feedback". The features they need may not be what you need. Sure, they are "entitled" in that they are hoping someone else will just hand them what they want, but that's just a human thing.
I'm sorry that it burned him out. But I think it is intrinsic to putting anything in the public sphere that it won't match everyone's hopes/desires/needs, that you will get such feedback, and that you have to find a way to be OK with that. (Or step back from such roles, as the author did.)
As a 20-year Mac enthusiast who knew a thing or two about Hackintoshing, Marcan got me excited about Linux. The idea that someone could just say, “we can solve this difficult problem and make it work, with little external help”? That is the core of FOSS right there, and he helped so much work happen, so quickly.
Loved following his various social feeds. I was sad when he stepped away from the fediverse. I hope he comes back as just a regular hacker without this massive weight on his back.
The modern Linux kernel is a corporate project that happens to be open source.
It seems impossible even for very talented folks to carry such a workload without corporate backing, both in terms of monetary compensation and other resources like non-public documentation. What the Asahi team has achieved is very impressive, but the more hardware advances, the less realistic it is to support it on enthusiasm alone.
Don't let the door hit you on the way out, I guess. It sounds mean, but the scene can do with fewer toxic types like this, who try to bully people who know better in order to get their way. Even worse in this case, since he actually tried to whip up a social media shitstorm against those he disagreed with.
He even has defenders calling for yet more CoCs in order to better "deal with" people who "get in the way." It doesn't get much more toxic than that.
Damn shame, both for the burnout and for the project. I was talking with the team about which laptops to get next, and pretty much everyone wanted something ARM-based with good battery life and performance, and there are only MBPs (except for the damn notch) out there. Except no one wants macOS. And then there's Asahi. I said it might come down to exactly this, what's happening now.
Buying MacBooks and doing your development work in a full screen Linux VM (with e.g. UTM) is a surprisingly good solution, that’s what I’ve done for the past year with no regrets.
> My personal Patreon will be paused, and those who supported me personally are encouraged to transfer their support
That seems silly; just assume people know about the change in circumstances and are trying to give you money for whatever you are doing at the moment, or in gratitude for what you've done before. The person still exists, so why pause the personal support?
The Patreon was specifically started to fund the Asahi development, so it's fairly disingenuous to continue taking those donations as not everyone will keep track. He can always start a new patreon.
Thank you for your efforts. Unfortunately people are people and the loudest are usually the complainers. I wish we, your supporters, were able to muster as loud of applause when the complaints come in, but we are too busy enjoying your work. Thank you again.
Sidenote: thank you also to all those project leads that put themselves out there. Not easy being the $h1t magnet.
> For a long time, well after we had a stable release, people kept claiming Asahi Linux and Fedora Asahi Remix in particular were “alpha” and “unstable” and “not suitable for a daily driver” (despite thousands of users, myself included, daily driving it and even using it for servers).
> "If you are interested in hiring me or know someone who might be, please get in touch. Remote positions only please, on a consulting or flexible time/non exclusive basis"
I read the post and I am thankful that these people exist, most of my life is made possible through the selfless work of wonderful open source developers like this person.
Sometimes I wish I was so passionate, whereas my philosophy in life towards strangers is a much simpler “fuck you, or pay me”. It allows me to sleep fairly well at night.
I have a few toy projects on GitHub, a couple of which gained a tiny bit of popularity, and I simply ignored every new feature request that I didn’t need, and especially those large PRs “here, I refactored your code to make it functional and it’s so much better now”. I simply said, with no regret: “I won’t merge this, feel free to fork the project, if it’s better I might even switch myself to your project!”. Some got mad, but I truly and genuinely couldn’t care less.
Was any of the Corellium port [1] usable for mainline Linux? In theory, they have a commercial interest in Linux for Apple Silicon, plus emulation tools.
I actually helped with some of the beta testing for it. They worked on it very early on in the Apple silicon Mac lifecycle, and it hasn’t been updated in nearly half a decade. I consider it a proof of concept.
> We kept playing a cat and mouse game with the manufacturer to keep the platform open, only to see our efforts primarily used by people who just wanted to steal other people’s work, and very loudly felt entitled to it
#2 is the cause of #1.
Running commercial games for free has always been the killer application for jailbreaks/custom firmware for game systems. And game system emulators, for that matter.
It's sort of a perfect storm of gaming enthusiasts (particularly those with more free time than money) who want to run commercial games for free vs. companies like Nintendo who have an affinity for (overly) vigorous copyright and trademark enforcement, with jailbreak/custom firmware and emulator developers caught in the crossfire.
Thank you, Marcan! Asahi is a heroic effort, but way larger than a hobby. To be sustainable, it would indeed require full buy-in from Linus and the kernel maintainers.
Hope you enjoy well deserved better hobbies and family time.
This is the functional equivalent of a self-pitying LiveJournal post by a moody teen who's been called out by his friends for being a bit of a dick.
Marcan wants to use social brigading to get his way, Marcan wants the entire Linux kernel dev flow to bend for him, and, when none of his poorly presented demands get him what he wants, he is - of course - the victim here.
Asahi is neat, but it clearly isn't a working daily driver yet, and it's not abusive to make feature requests and bug reports. Discussions around Rust in the kernel are not, and can never be, an 'injustice'. In Marcan's world, everything other than vehement agreement and immediate compliance is abusive, hostile, toxic, etc. But of course, the only toxic person here is the one threatening to wield the mob to get his way.
Honestly, I'd query whether the benefit is worth the cost. I'll take average code from well-adjusted anons over clever code from bullying, hyper-online micro-influencers any day of the week.
My perspective is it might be a chicken-or-egg problem. I would willingly donate to the Asahi project...if/when I were to buy a Macbook for the purpose of running Asahi Linux on it.
And I've been watching from the sidelines, waiting for Asahi Linux to become "stable" enough to consider buying a Macbook and putting Asahi Linux on it.
I was an early donator to marcan and never expected anything delivered from the Asahi project; I just thought the idea of having Linux on Apple Silicon was awesome and worthy of my support.
But then marcan told his supporters to fuck off unless they committed to supporting his political ideas, which I was not willing to do.
I guess this comment will be seen as abuse from the HN crowd. Oh well...
He seems to take a lot of things that are fairly neutral or merely undiplomatic as being in bad faith, like the "thin blue line" comment by Dr Greg. Sure, the whole thin blue line thing misrepresents the police's role in society (police in most Western countries are fairly useless at best and actively harmful at worst), but Occam's razor suggests that Dr Greg was just making a tactless remark rather than being super pro po-po.
It sounds weird, and somehow I don't think it was communicated that way, even though I wasn't there. You say he said "I don't want your money unless you follow my ideology"? Did he then send you the money back after checking your social media?
Out of curiosity what are his political views? It has been mentioned a couple of times here already and it seems to be part of the story.
There is really not much to elaborate: I was supporting the idea of Linux on Apple Silicon. I would support that idea even today (without really even installing it), but I stopped because of marcan's twitter/mastodon (I don't remember which) posts. Btw, those accounts are no longer active; I presume marcan deleted them (go to marcan's About page and see for yourself).
Now, if you want me to explain what political ideas those were: I don't care. Whatever they are, I don't want to support them, even if I hold those same ideas myself. Yes, I do think that open source communities should move away from politics.
> But then marcan told his supporters to fuck off unless they committed to supporting his political ideas, which I was not willing to do.
Is this about Marcan’s outspoken support for transgender people? If so, why not simply say that in your comment, rather than framing it in such vague terms?
…So it is about the fact that you object to his support for transgender people?
Surely you see why this is, actually, directly relevant and important context for your statement. It’s not some general political leaning you’re talking about - lumping this (prejudice against a minority group) into the same category as something like banal disagreements over taxation policy amounts to deliberately obscuring what you’re saying behind innuendo.
If you’ve got something to say about his political views in a public forum like this, at least do the people around you the courtesy of being upfront about what you’re actually saying.
And here we go again, there is never a neutral position with some people.
I support the freedom of people choosing their sex or gender. At the same time, I'm not willing to fight their wars. And if they force me to go to war, then I pass.
Comments like this are why people find trans activists so irritating, whether they happen to agree with them or not. No one had even mentioned transgender issues, yet here you are bringing them up to start an argument. Not everything is about trans issues, you know.
Given that this is seemingly the thing for which he’s most known, politically, it doesn’t seem remotely unreasonable to infer that the person I was replying to was talking about it. It’s absurd to take this evasiveness as anything other than bad faith.
If the group in question were gay people, or a racial minority, would you still treat the issue this way?
What a shame. Seems like such a bright and articulate person.
Working/contributing in FOSS is already slave labor in itself (literally; billion-dollar companies depend on FOSS and many do not contribute back to the ecosystem they depend on). Then the abuse from other FOSS developers and the community is just cruel.
Hope the guy is able to recover mentally and physically.
It's awful that someone who is doing this kind of work as a volunteer experiences harassment for not making proprietary hardware work with open-source quickly enough.
"... I ended up traveling for most of the year, all the while having to handle various abusers and stalkers who harassed and attacked me and my family (and continue to do so)."
Hmmm. Seems to me they took on a little too much technical risk at the start of an already complex project, to wit, buying into Rust on the kernel side.
Rust in the Linux kernel was always going to be a long game. You don't want that to be a blocker when what you really want is to make larger kernel changes.
I'm really sorry about this and I've seen your magnificent work bring so much joy to people. The harassers and abusers should be named and shamed so we can strengthen the resilience of the open source community. Let's see how they react when they get a taste of their own medicine.
This was a heartbreaking and terrible read. I've been feeling a bit of disillusionment about linux quite a bit lately, mostly due to bad/weird decisions being made at the distribution level. Reading this extends that disillusionment down to the kernel level.
The problems cited are portrayed as sociological problems, but I really wish people could recognize that all of them can be mitigated, either substantially or entirely, with a single purely technical solution: microkernels.
* Almost nobody needs to upstream code to the kernel
* Trusted codebase size becomes negligibly small
* Maintenance burden for drivers, subsystems, etc., falls on the users of the subsystems affected, and not the entire community
* Broad language compatibility by service interface instead of ABI compatibility. The need for a singular compiler is reduced in scope down to the size of the subsystem instead of the entire ecosystem.
The biggest problem that can't be solved purely technically is the entitled user problem, but even that is partially solved. This is because the barrier to contribution is substantially lower:
* I can write code in Rust, but I don't know C.
* I can easily write simple drivers for some hardware features like battery managers and fan controllers and temperature sensors, but I don't know anything about kernels.
* I have a limited but non-zero understanding of security, and would not feel comfortable writing code that runs in ring 0, but wouldn't feel inhibited writing code that benefits from process isolation.
Those attributes about myself inherently mean that for a microkernel OS, I can be a contributor, but for Linux, the best I can be is an entitled user.
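To make that concrete, here's a minimal sketch of the idea in plain Rust. The socket path, the one-byte command protocol, and the sensor read are all invented for illustration; a real microkernel would supply its own IPC primitives instead of the Unix socket used as a stand-in here. The point is only the shape: a "driver" as an ordinary, isolated process exposing a service interface.

```rust
// Toy sketch: a "temperature sensor driver" as a plain user-space
// process. Nothing here runs in ring 0; a crash kills this process,
// not the machine. All names and the wire format are hypothetical.
use std::io::{Read, Write};
use std::os::unix::net::UnixListener;

// Stand-in for real hardware access (e.g. reading a register the
// kernel has mapped into this process's address space).
fn read_temp_millicelsius() -> i32 {
    42_000
}

fn main() -> std::io::Result<()> {
    // The service interface: clients connect and speak a tiny
    // protocol instead of linking against a kernel ABI.
    let listener = UnixListener::bind("/tmp/tempd.sock")?;
    for stream in listener.incoming() {
        let mut stream = stream?;
        let mut cmd = [0u8; 1];
        stream.read_exact(&mut cmd)?;
        if cmd[0] == b'T' {
            // Reply with the temperature as 4 little-endian bytes.
            stream.write_all(&read_temp_millicelsius().to_le_bytes())?;
        }
    }
    Ok(())
}
```

A client in any language only needs to open the socket and speak that protocol, which is the "compatibility by service interface instead of ABI" point from the first list above.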
Do the demigods of the Linux kernel - Linus and the core maintainers - personally want the kind of code Asahi is developing to be merged into the kernel? The author writes as if part of his drive was that Linus himself showed enthusiasm for getting Linux on Apple Silicon.
If there is interest in the work Asahi has done, then the Linux team needs to describe what they see as the gap between today's code quality and support model and what they want to see before upstreaming.
It sounds like the Linux team has been wishy-washy and needs to draw a line in the sand on their needs rather than handwaving about being part of the "community".
It would be fair to say "we don't like your attitude or trust you to work with us kindly over the years and don't want to deal with you", if that's the case. Just don't dance around it.
> The author writes as if part of his drive was that Linus himself showed enthusiasm for getting Linux on Apple Silicon.
Perhaps the author jumped to conclusions after Linus himself started using Asahi Linux on his own laptop for Linux kernel development[0]. Note the praise for the Asahi team in the commit message.
And it sounds like you read just one side of the story, with no background in Linux kernel internals. This is about how C APIs are typically inherently unsafe, Rust people wanting to build safe(r) abstractions on top, and a question of who is responsible for changing what code when it's time to refactor the underlying C API. And yes, marcan is being overly dramatic, though he is not alone in that.
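For readers without that background, here's a hedged miniature of the pattern being argued over: a safe Rust wrapper around a C-style API whose contract lives only in documentation. Everything here is invented for illustration; real kernel bindings are far more involved.

```rust
// Stand-in for a C API: the caller must supply a valid pointer and an
// accurate length, or behavior is undefined. (Hypothetical function,
// not a real kernel interface.)
unsafe fn c_style_fill(buf: *mut u8, len: usize, value: u8) {
    for i in 0..len {
        unsafe { *buf.add(i) = value };
    }
}

// The safe abstraction encodes that contract in the type system: a
// &mut [u8] is always valid for writes and carries its own length, so
// safe callers cannot violate the C-side requirements.
fn fill(buf: &mut [u8], value: u8) {
    unsafe { c_style_fill(buf.as_mut_ptr(), buf.len(), value) }
}

fn main() {
    let mut data = [0u8; 4];
    fill(&mut data, 7);
    assert_eq!(data, [7, 7, 7, 7]);
}
```

The dispute is about maintenance: when the C side's contract changes, someone must update the wrapper so its safety promise still holds, and who owns that work is exactly the question.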
I do not claim a background in Linux kernel internals. I'm a human who has seen teams miscommunicate in the past, and see it now. I'm not sure what I said that drew hostility.
It sounds like you agree with me though. The Linux team needs to clearly define the expectation they have for code maintenance from the team trying to upstream Rust code (edit) and the Asahi team needs to acknowledge how/if they can meet those expectations.
Well, stop accusing people whose work you don't know of being wishy-washy.
The challenge is not dictating from high above some criteria; the challenge is discovering the criteria that will let the Linux project continue development as well as can be arranged. This is why you'll hear Linus say it's a learning experience, and not just make proclamation of how things shall be (at this stage).
I don't think it's reasonable to read my original comment as an "accusation". I said "it sounds like" as part of a casual conversation about the subject. There was no hostility in my comment.
Would you say that the Asahi team wasn't receptive to the pace at which the needed criteria were being developed?
My point is that between these two groups there seems to be a misunderstanding of expectations. And since the kernel team is the upstream org (granted, I haven't read every mailing list thread), I would expect them to have built a framework for accepting this kind of code. Or a framework for building the framework.
The framework is "we will discuss it and a consensus may emerge". Linux is an open source project, not a company trying to remain profitable for the next quarter. If some outsider stumbles in and expects something else, well, they didn't understand what they were getting into.
Having structure to decision-making doesn't have anything to do with having a profit motive. Neither Asahi nor I, as an outsider, have brought up money.
It sounds like the implicit answer to "Does Linux want Asahi contributions" is "low priority". Which is fine if that's communicated.
I sense you have been involved in these discussions already and have a strong opinion about the specifics of this topic. I don't mean that in a bad way.
You continue to misunderstand open source. "Linux" is incapable of wanting things. Linus, as the quality gatekeeper of the main repository, cannot tell anyone what to do; his power is purely saying "I won't merge that". The project is what the project contributors do, and nobody can tell them as a group what to do. It's herding cats, not a business meeting.
I have very little personal interest in Asahi, and I am not really part of that "conversation", but I dislike outsiders coming in and expecting to dictate how something that predates them should work. Everyone is entitled to their opinion, but that doesn't mean anyone else has to listen to it. If you want to understand, read the linux-kernel mailing list and watch people like Al Viro work (minimum realistic time allotment: multiple months).
From just reading the post, it sounds like a shit situation for him, and there is blame on the Linux kernel maintainers for not handling it well, especially if the claim about them going behind his back and shit-talking him is true.
What I will say is that I get the impression he is quite sensitive and takes things too personally, which is a death wish online. People are savage online, and if you don't have thick skin they will destroy you. Not justifying it, but it is what it is.
Either way, I hope all goes well for him. Sad to see this happen; I bought an M2 Mac because of Linux support :(
They would have to play catch-up for a few years, rebasing off the latest, and then eventually lose interest, and it will all stop working. The same thing happens with most ARM devices that aren't upstreamed.
I was contributing a (very small) amount of money each month until 2024, when it seemed like Marcan had stopped working on the project.
It was hard to decide to stop the financial support. On the one hand, I want maintainers to be able to take breaks without worrying about their livelihood. On the other hand, it was very difficult to tell what was going on and whether Marcan would ever get back to working on the project.
It's a shame to see but inevitable when trying to get open software working on such a proprietary platform with so many hardware cut corners and incomplete/non-standard implementations. Just an insane amount of work of the most frustrating kind. What they did manage to do was incredible... but Apple is Apple.
My sense from the article was that it seems less of an Apple is Apple issue, and more of a Linux maintainers are Linux maintainers issue. The big problems listed are all interpersonal conflicts and a sense that the maintainers were making his life upstreaming Rust changes hell.
Sure, Apple isn't great to develop drivers for, but ~99% of the article is about the displeasure of working with Linux maintainers.
EDIT: Here is the breakdown by paragraphs.
| Paragraph # | Tone |
| ----------- | -------------------------------------------------------------------------- |
| 1 | History recap |
| 2 | History recap |
| 3 | mentions Apple and M1 in positive light. |
| 4 | Mostly positive, slight negative towards Apple not having good docs. |
| 5 | Negative Linux kernel development (upstreaming) |
| 6 | Negative user focused |
| 7 | Negative user focused, complaints about M3/M4 support. |
| 8 | Negative user reviews |
| 9 | Money troubles regarding support. |
| 10 | Linux maintainers, mostly negative |
| 11 | Unknown 2024 event |
| 12 | Negative about users (demanding more features and support) |
| 13 | Stress about Kernel development |
| 14 | Negative about Linux kernel development roadblock |
| 15 | Negative about Linux kernel development (Linus leadership failures) |
| 16 | Positive about Rust |
| 17 | Negative about Linux kernel development (Why Rust can't wait) |
| 18 | Negative about Linux kernel development (downstreaming) |
| 19 | Negative about Linux maintainer (thin blue line) |
| 20 | Negative about Linux kernel maintainers (two faced) |
| 21 | Negative about Linux kernel maintainers |
| 22 | Negative about Linux (disappointment in refusing invitation) and Linus |
| 23 | Negative about Linux maintainers (being corporate) |
| 24 | Negative about Asahi, and dreading to turn on Apple |
| 25 | Negative about burnout |
| 26 | Resignation |
| 27 | Positive about Asahi Linux team |
| 28 | Hiring proposition. |
18 negative paragraphs
2 about Apple (11%)
4 about users (22%)
12 about Linux/Linus/Maintainers (67%)
I was wrong about it being 99% about the Linux kernel maintainers, but it's still more accurate than saying 50% of the issues are about Apple.
All those "Negative user focused" are actually "Apple proprietary problems" (problems that exist only on apple because of their weird proprietary stuff with no standards following, ie, getting temp on any other system is dead simple). Not sure if you actually believe in your list or you just used an AI that made the mistake.
I counted paragraph topics myself: History (3), Proprietary Hardware Problems (8), Kernel/Rust problems (8), Other/Quitting (7). Could be off by one or two because I'm not a machine.
Also, I note that you did not disagree with or object to my initial response (the one made before your edit) that it is indeed halfway down the page before rust/kernel stuff is even mentioned.
I think it is safe to say that both Apple and the kernel/rust issues matter here and trying to derail any discussion of Apple's role into even more rust ragebait threads in a HN topic full of them is counterproductive.
No. Those are issues caused by users. You could have the most open hardware platform, and they would still persist. See any OSS maintainer complaining about unrealistic user expectations.
How are you counting those? I made a table; point me to the exact paragraphs. Also, it's deceptive to pull Rust into this story. marcan had nothing but praise for it; without it, he wouldn't have been able to write those drivers.
The main complaint of marcan is the horrible experience you have as a hobbyist Linux contributor. You can't blame Rust or Apple for that. That's on Linus, and Linux maintainers.
It seems that at the heart of the issue is the vision for the future of the Linux kernel.
One group believes it is Rust (progressives), one group doesn't believe that and wants to continue with C (conservatives).
If they cannot find a way to live at peace with each other, I think the only solution is for the Rust folks to start building a kernel in Rust rather than trying to "convert" the existing kernel to Rust piece by piece.
The sticking point for living in peace seems to be: finding a way for the C kernel folks not to have to deal with Rust code.
At the core, the story is not that different from introducing new languages to a project.
You are introducing a new tax on everyone to pay for the new goodies you like, and those who are going to be taxed and don't like the new goodies are resisting.
It’s absurd that the C/C++ community is so hard-headed about Rust. Is the Linux kernel in fifty years still going to be written in the same language it is now?
The people in the thread were complaining about having mixed languages in a core kernel module, not about the language being Rust. Replace Rust with any other language and their complaint would have remained. The Rust people complain about not getting technical arguments against their proposals, but how is "a module that's coded in two languages is more difficult to maintain than one coded in a single language" not a technical argument?
Unfortunately it is not the first time a good developer has left the project over a famous "Linu(s)x shitshow", and it will not be the last...
I haven't believed in the Linux project for a few years now, especially as "the bearded ones" are not interested in moving the project toward a certain future, but only in jerking around with their own old code.
Good luck for the future, Hector, and thanks for what you and your team have managed to do so far.
> I haven't believed in the Linux project for a few years now, especially as "the bearded ones" are not interested in moving the project toward a certain future, but only in jerking around with their own old code.
I personally lost my confidence in it when they stopped properly triaging security issues and flooded everyone interested with just noise.
> I ended up burning out, primarily due to the very large fraction of entitled users.
I stopped reading after this sentence. The God complex is simply too much for me to digest. TL;DR: "I am a programming God; they are newbie peons who refused to bow at my altar of greatness." On other posts about this person, I read multiple comments from people that could be summarised as: "This person is an (amazeballs) amazing programmer, but also a total drama queen." I say, with respect, "You will be missed. Thank you for your contributions." <sigh of relief>
The Rust situation was handled badly. Two languages (one of which has panics and all sorts of version issues) in one kernel are clearly not viable. Linus should have put his foot down and mandated C instead of stringing people along.
That of course was difficult in the corporate environment of 2014-2024. Perhaps he was forced to do it.
In many areas, sanity has returned, so perhaps we can get clearer messaging again in the future.
> Hi! It looks like you might have come from Hacker News.
> Asahi Linux developers are frequent targets of abuse on Hacker News. Despite our pleas, the moderators have not taken effective action to improve the situation.
> Overtly hateful content is often flagged on HN and not immediately visible. Unfortunately, when a comment is flagged and killed, its child subthread is not. That preserves the 'clean' image of the website, but the reduced moderation activity enables abuse to continue. Although you don't see those threads, search engines do. HN uniquely has a high page rank and low moderation, making it a prime target for bad actors to poison search results with abuse, bigotry, and nastiness. This isn't low-level trolling, but an organized attempt to destroy lives, including of developers in our communities.
> Please demand change within your community.
This is an unfair and gross assessment. I've lost some respect for Asahi over this.
They're calling for extreme moderation of opinions they don't agree with, which is the opposite of open discourse.
Asahi: deal with it. You're Streisand Effecting this. Your inability to handle drama is actually causing more drama. Just turn the other cheek and ignore it.
seconded. even if it was completely true criticism, which it categorically is not, putting up a half-page banner is extremely gauche and immature.
saying things like "an organized attempt to destroy lives, including of developers in our communities" is patently not true. trolls get flagged. honest nice people who don't agree with you aren't trying to destroy anything and nor do they hate you.
I'm unsure whether the most charitable reading of your comment is to assume you missed that these phrases are linked on the original site but the links were not included in the text copied into the comment above, or something else:
While you may be correct that initially "trolls get flagged", the statement on the Asahi site notes that while the initial comment may be flagged & killed, the other comments in the subthread are still indexed, visible & tend not to get moderated/flagged:
"Unfortunately, when a comment is flagged and killed, its child subthread is not. [...] but the reduced moderation activity enables abuse to continue. Although you don't see those threads, search engines do."
Based on other remarks about the content of such subthreads it seems surprising to claim that follow-on comments are made by "honest nice people".
I'm as much of a fan of adverbs as the next person but using words like "categorically", "extremely" & "patently" doesn't seem to leave much room for nuance of interpretation when written by someone who I'd have assumed was a third party observer?
While I could understand someone describing JWZ's HN-tailored "banner" (I wouldn't suggest researching this if anyone is not already familiar) as gauche and immature, it feels like somewhat of a stretch in relation to a plain text message whose last sentence starts with "Please".
disagree, strongly. kiwi-farms has nothing to do with hn. if kiwi-farms starts brigading and spamming/trolling on hn, it gets flagged.
> the other comments in the subthread are still indexed, visible & tend not to get moderated/flagged
indexed: please complain to google.
visible: not unless you turn on show-dead. so don't do that.
don't get moderated: they are already dead.
> I'm as much of a fan of adverbs
i mean what i said. i'm extremely tired of seeing histrionics, exaggerations, misplaced blame, etc. turned into loud, unfair criticism toward what is probably the best-moderated group i can think of.
JWZ's banner is at least recognizable as satire, and his opinions are well known. i can disagree with him, but still find it a little bit funny (and immature). but if you do the same thing (yes, with a please), then you are just exactly as mature. and if you are serious, less grounded in reality and not nearly as funny.
If the opinions they don't agree with exist on Hacker News, and they do (check the dead comments in just about any thread where Asahi Linux comes up), then it isn't an unfair assessment at all.
What is the "it" that you're insisting they "deal with," here? What is the "drama?"
Also what value does bigotry, homophobia and transphobia have in open discourse that it must be preserved? None of that is on topic for Hacker News, why must it be on topic for the Asahi Linux community?
Turn the other cheek. Ignore it. It's 2025 and we're learning lessons from USENET all over again, having to rein in the over-sensitive, dysregulated behavior of some people.
I'm gay, on the spectrum, and my wife is trans. What certain people in "my" community do from places of relative comfort makes life for those of us in more moderate / conservative-leaning places worse. The screeching from our community [2] has turned our little demographic into a major culture war topic, and it's all because of the bad attention and friction you manufacture.
Conservatives let LGBT and trans issues slide for over two decades of my adult life. But by being loud and attempting to silence them -- by harassing them -- you've become the nail that sticks out and have now created a tidal wave of opinion against us.
It's easy for some European or SF trans person to call for universal outlawing and censoring of speech, but you have to realize your message is being read all over the world. It's interpreted by an overwhelming number of people as attempting to memory hole conservatives and flush away their culture.
Simultaneous to your harmful messages, folks are also being inundated with social media rage/engagement bait to make them think liberals are literally attempting to destroy and annihilate conservatives [3].
Your message adds weight to this perception, and all you accomplish here is making the majority of voters angry at us. It even turns moderates and would-be supporters sour.
I hate that you represent me by association and think that this is acceptable behavior.
As another anecdote, when I talk to my friends about Rust, the subject of "drama" frequently comes up. Why is that? Suddenly my work becomes harder for an entirely unrelated and unmerited reason. That's just me as an LGBT person - imagine how straight people feel.
We shouldn't have to keep reading about this over and over. It's orthogonal, childish, dysfunctional behavior.
Take one more look at that loud disgusting banner on the top of the Asahi page. That's neener-neenering in front of everyone. Even the moderates you hope to be your allies. Please, for god's sake, put yourself into different shoes. You're asking them to do it for you, but it's your turn.
I think you'll see that your behavior is also harassment.
Please calm down, slow down, and behave like adults. Not everything warrants a response or attention. Chances are, it'll just go away and get totally ignored. When you engage, you shift the conversation and bring yourselves down to their level. You create a firestorm of drama that everyone watches like a burning wreck.
Stand above that.
[1] I only wanted to talk about the very public, inflammatory resignation and the immature handling of this by certain parties.
> Conservatives let LGBT and trans issues slide for over two decades of my adult life. But by being loud and attempting to silence them -- by harassing them -- you've become the nail that sticks out and have now created a tidal wave of opinion against us.
In the US, DADT was repealed in 2011. Obergefell was 2015. The idea that they let LGBT and trans issues slide for over 20 years is fundamentally wrong and not supported by history.
I reject the rest of your post and the defense of those that would take rights away from individuals and myself because they have to be coddled.
I would like to first acknowledge the feelings of what I read to be anger, frustration & pain you expressed in your comment. (If I've misinterpreted what you've written, I am open to reading further clarification if that's something you felt like investing effort into.)
While my life experience has been different to yours, from what you've written about how you've been treated by others in your community, as a consequence of who you are, it seems understandable to me that you might experience those feelings--and, even if they didn't seem understandable to me, it is more important to me that you feel heard and your feelings acknowledged as valid and not dismissed.
I hope I have been able to communicate that intent effectively.
----
At the risk of falling into the stereotype traps of "straight white male thinks every rhetorical invitation is a literal invitation for him to say what he thinks" & "straight-splaining" I did want to provide an answer to the question in the last sentence here:
> "As another anecdote, when I talk to my friends about Rust, the subject of "drama" frequently comes up. Why is that? Suddenly my work becomes harder for an entirely unrelated and unmerited reason. That's just me as an LGBT person - imagine how straight people feel."
(I preface the following with an acknowledgement that it's bullshit that you have had to deal with the impact of this rather than the predominantly straight white males who don't want to be made to feel uncomfortable.)
TL;DR:
FWIW, from my perspective as a straight white male I feel the subject of "Public Interpersonal Conflict" attributed to Rust is directly related to values rightfully espoused/embodied by the Rust project/community/language that are at odds with values held by other groups.
Specifically, groups consisting of predominantly straight white males believe that the comfort of predominantly highly skilled straight white males should be prioritized over the physical well-being of other humans; and, also over the security and stability of the software other humans use.
They are also unlikely to agree with this characterization.
Unlike the above group however, rather than targeting resentment at the people whose physical well-being is at risk I choose to direct my resentment at the predominantly straight white males who choose to dismiss important issues as unimportant "drama" because they resent being "made" to think about issues that impact people other than themselves.
----
For anyone who disagrees with my characterization, I would point out that we do not know what other contributions Alan Turing might have made beyond "Turing Completeness" & "the Turing Test" to current in-demand fields such as AI if he hadn't been persecuted for not being a straight white male.
I would also remind them the ARM CPU attached to that unified memory on which they're running their latest AGI & LLM models is thanks to another person some people in the present day think should be persecuted for daring to exist.
But equally people shouldn't have to trade advancements in the field of Computer Science for the right to exist without persecution.
----
I will acknowledge that it's entirely understandable to want to avoid the associated discomfort, because from personal experience it is very uncomfortable to have to re-evaluate one's place & responsibility in the world after a lifetime of being told something different.
----
The other ~2,500 words I wrote on the topic were certainly more nuanced but pretty much said the same thing, with more beating around the bush and additional personal context.
For any straight white males who may be confused why someone might think as I do, all I can say is that time spent reading/listening to this (unfortunately, archived) resource is likely to be worthwhile, if temporarily uncomfortable: https://geekfeminism.fandom.com/wiki/Geek_Feminism_Wiki
"Although you don't see those threads, search engines do. HN uniquely has a high page rank and low moderation, making it a prime target for bad actors to poison search results with abuse, bigotry, and nastiness. This isn't low-level trolling, but an organized attempt to destroy lives, including of developers in our communities."
Funny how this aged, now. Trying to shame people on Mastodon? Totally valid. Trying to chew someone out on a private forum? Now it's an organized attempt to destroy lives.
Always funny to me to read something like that. HN has high moderation, one of the highest I've seen on modern fora, only a few steps below r/AskHistorians for example. I don't see this "abuse, bigotry, and nastiness" here and even if there are comments like that, they are quickly downvoted and dead.
That's not correct; the point is that while the comment might be flag-killed, the subsequent posts in that thread are not, and they remain visible to search engines. For example: if you go to this post [1] from the prior thread in private/incognito/whatever mode, you can see the posts underneath a flagged comment even though you can't see the comment itself. And there are some comments there by other users that, despite being flagged, are still indexable and visible.
That's true, but those comments would be flagged too if they were bad enough. Just because a flagged comment exists does not mean the entire subthread is bad.
Right, but that gets into the exact argument the team is making. When a thread is flagged, fewer users are going to open it and flag subsequent content or flamebait. It creates a lower-moderation environment where those kinds of comments can thrive, which is evident in the same link I posted, where you can see what I'm referring to.
If they are flagged, how would they thrive? Fewer people are seeing them, as you mention (and even then, many people do have showdead turned on and flag those comments). None of the non-flagged comments on that thread (at first glance, on a quick skim anyway) seem "abusive" or "toxic" or whatever other word Asahi wants to use. Indeed, people are pointing out that you can't censor opinions on a forum you don't control, such as HN.
As a general point I agree that it would be better that a flag disables replies to the entire subthread (although it shouldn't [dead] it), just because 99% of the time they're just not good discussions. However, the claim that this is somehow "destroying lives" is rather unserious. Whatever may or may not be going on on Kiwifarms has little to do with HN, and the occasional idiotic comment on HN is ... just the occasional idiotic comment on HN. There are also not really that many of them.
Also, I'll add that whenever I've seen an unflagged hateful comment I've emailed hn@ycombinator.com, and the success rate in getting the comment killed and people told off (or banned) is thus far exactly 100%. This usually happens if someone leaves a comment a few days after the discussion dies down, so few see (and flag) it.
HN is pretty low-moderation along the axis of personal attacks. If you politely say an ad hominem or racist or *-phobic thing here, it's unlikely you'll be moderated for it, for instance.
I can dig up many such examples, but I suspect the response would be, "of course that's not moderated" because this community has a different set of values than some others.
Moderation is always an editorial action, and as such we tend to view it as strong when it aligns with our own values and weak when it doesn't.
That doesn't match my experience of what we do, so I'd like to see those "many such examples".
IMO, if you're going to make charges like this, which would be serious if they were true, you should include links so readers can make up their own minds.
My methodology:
Search for any of the following terms: woman, biological, Black, Latino, gay, trans, woke, dei, or virtue signal
Set to "Comments" and "30 days". You'll find plenty of people saying things that are pretty awful. Yes, they are not the majority of posts, this place isn't a cesspool, it's just a place that permits "just asking questions" or "it's up for debate" as a defense for behavior
Of the remaining 3 of the 10, I disagree with you about saagarjha's comment: https://news.ycombinator.com/item?id=42907076. That one seems thoughtful and in keeping with the site guidelines. It does use a lot of sort-of trigger words (I counted "trans", "vegan", "left wing", "Democrat", "progressive", "conservatism", "Republican"), but surely we're not going to punish people just for using words like that.
The other two seemed borderline to me, although I confess that one was so long that I couldn't read it before becoming le tired.
> I could find many, MANY more examples.
I'd be interested in seeing them, and I hope it's clear that I mean that. I don't want to argue about this—I want to see what you're seeing.
I think saagarjha's comment is probably a transcription error on my part, copying the wrong link out of a thread that had questionable stuff in it.
I'll dig up some more. They tend to be a bit stochastic, and on various topics. (There was an article about pg that made the front page a while back, written by a trans woman, that was an absolute lightning rod for this, iirc.)
I don't think HN has high moderation at all. High moderation would imply stricter and quicker punishment for making rancid remarks.
There were a number of remarks on the prior thread by people making conspiracy claims, harassment, insults, etc. Some of them got flag-killed, some just downvoted, but ultimately those users still remain on the site.
Of course I'm not one to be above such a thing in terms of insulting people occasionally but HN is really quite permissive in terms of what you can post and get away with. It takes consistent and repeated bad behavior to get a warning, and even more to get banned. And if you're an expert in being politely venomous you can get away with even more. That's why the outside perception of HN tends to be a lot worse than the inward one.
>M2 bring-up was like 10 hours on-stream; M3/M4 would not take any longer.
You have NFI what you're talking about. There were major architectural changes in M series chips between Avalanche/Blizzard (M2) and Everest/Sawtooth (M3).
That is not true. Apple claims this for PR purposes, but Marcan and Lina said on stream that the changes are not that big at all; Apple never completely changes their architecture. There may be bigger changes on the GPU side, but that does not affect the base bring-up (earlier Asahi versions did not have any GPU support at all, only software rendering), and in any case that would be work for Alyssa, not for Hector.
> Then 2024 happened. Last year was incredibly tumultuous for me due to personal reasons which I won’t go into detail about. Suffice it to say, I ended up traveling for most of the year, all the while having to handle various abusers and stalkers who harassed and attacked me and my family (and continue to do so).
Regardless of whether Hector is Asahi Lina or not, the whole vtuber-doing-work-that's-important-to-a-lot-of-people thing has a kinda off vibe and makes me a bit uncomfortable.
> Regardless of whether Hector is Asahi Lina or not
This isn't up for debate. There's tons of evidence out there, including the stream where his VTuber software failed briefly and he "doxxed" himself. It's not a fake. I was there watching, and rooting for him to succeed.
Rather than further clutter up this thread with the same links, yet again, I refer you to:
He likes C and hates C++; allowing Rust in the kernel is trying something new, maybe to attract new maintainers who don't care too much about C, and as he said, "maybe it works, maybe it doesn't; if it doesn't, we learn".
Trying to push Rust code into Linux and, after a year, calling it a "failure of leadership" when people are reluctant and don't welcome your code with open arms is not very professional. It has taken other projects decades of out-of-tree maintenance until their code finally got in (RTLinux).
The correct way would have been to maintain the Rust code out-of-tree, for as long as it would take, which would also somewhat prove that you are ready to maintain that code over a longer time period.
Sad that this led to him stepping down, but maybe others in the Asahi Linux circle are ready to keep maintaining the code out-of-tree until everyone is ready for it.
I checked into Reiser4, and they are indeed still working on that out of tree. Hats off to people who stick with it, though I think renaming the file system would have been a smart idea.
That's the only other project I can think of that's out of tree besides the usual suspect (ZoL).