
I like Rust and I'd love to see it in the Linux kernel, but not before the GCC+Rust issues are all fixed. You shouldn't need non-GCC compilers to compile Linux and from what I can tell there isn't any GCC Rust compiler that's fully equivalent to the standard Rust compiler just yet.

They say that Rust support is optional, but when the first big driver gets written in Rust it'll become mandatory. So either the Rust integration will fail and nothing important will get written in it, or it won't be optional. I'm not sure what comfort that's supposed to bring.




The GCC story isn’t that far behind - there are two complementary efforts to get GCC support into the Rust ecosystem.

One is a Rust frontend to GCC (https://github.com/Rust-GCC/gccrs) while the other is a GCC backend to the existing Rust compiler (https://blog.antoyo.xyz/rustc_codegen_gcc-progress-report-12). It might take a while for these to bear fruit, but it’ll be awesome when they do. That should address the very valid concern that the full Linux kernel won’t compile for all supported architectures.

And it’s not really a problem that the GCC effort will take time. The headline of this article implies that the Rust integration is nearly done but it’s only the beginning. Based on the issue tracker there’s years of work left to get everything working perfectly. That’s enough time to get GCC supported well.


> It might take a while for these to bear fruit, (...)

That's the problem. There are no GCC-based Rust compilers, nor is there a fixed timeline for one to be delivered, especially in the short term.

Therefore, as the OP pointed out, as things stand either Rust in the Linux kernel is dead in the water or it poses a major problem for the Linux community.


I don’t know who this “Linux community” is that doesn’t include Linus or Greg or other senior developers that have been working with the Rust for Linux folks for more than a year. They don’t think it’s “dead in the water”.

Let’s assume for a second that Linus knows what he’s doing. Now look at his mail from April 2021 on what he views as show stoppers (https://lkml.org/lkml/2021/4/14/1099) - OOM panics and 128-bit floating point. The former has been fixed now, which is why Linus appears bullish on adding Rust support to the kernel. Check that mail for the number of times Linus mentions GCC - 0. It’s not a priority for him.

It’s easy to see why. He (and everyone else involved) has been very clear that this will only be used for driver code on architectures that are already supported, so there is no need for a GCC-based compiler in the short term.

I pointed out two long term efforts to get GCC Rust support. These could unblock more widespread use of Rust in the kernel, but that’s jumping the gun. Let’s see even one real driver merged before thinking about that.

Whatever happens, the Linux developers won’t let the project break for millions of users overnight. That’s not how they operate. Your “Linux community” should know that.


The initial work is just scaffolding that allows using Rust for some drivers, plus projects intended to test said scaffolding. (Basically, rewrites of some drivers that are already in the kernel but that are not considered to be of the greatest quality.)

It will take quite a while yet before something important is made in Rust, and this early work can be done in parallel with the GCC-Rust work.


There is talk of the Apple Silicon GPU driver being written in Rust, which I would say is non-trivial!


Wow, I hadn't heard that. But of course this doesn't limit platforms, because the hardware is only available on ARM64.


Yup. And if you have to use the rustc compiler that means you cannot use the rustc compiler from your repositories. Rust changes in forwards incompatible ways so fast that rustc is literally out of date and unable to compile new Rust code in less than 3 months. It's not because Rust is inherently bad, it's just that the type of people who write in rust are bleeding edge types (for now) and have no consideration for forwards compatibility.

They just assume you'll curl | rustup.whatever and install a new version of rustc every month from a third party website like every single Rust guide suggests. That's not optimal for the linux kernel. But maybe kernel dev types won't be as bad as the typical bleeding edge rust dev.

And given that rapid rate of change in Rust, I don't see how GCC can ever keep up. Maybe in 10 years when Rust is more stable.


Rust has language versions and you can specify the language version to use in your Cargo.toml file. This is possible today and it shouldn't pose a problem for the kernel at all as long as there are rules about what versions to support.
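For illustration, pinning an edition in Cargo.toml looks roughly like this (a minimal sketch; the crate name is made up):

```toml
[package]
name = "example-driver"  # hypothetical name
version = "0.1.0"
edition = "2018"         # opt into the 2018 edition's syntax rules
```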

You don't need a super modern version of the spec to use Rust.

As for Rust's incompatibilities, as long as you stay clear from methods marked unstable and the nightly compiler versions you'll be fine. Rust's libraries have breaking changes, but the stable language features haven't had any since the 1.0 release as far as I'm aware.

If your problem is forwards compatibility, well, you can't use clang10 to compile the kernel either. Sometimes the version gets bumped, but it doesn't need to happen too often. I think GCC's rust compiler will keep up just fine.


Half of the ecosystem depends on the nightly compiler.


I don't think "half" is accurate, at least not anymore. A lot of things have moved from nightly to stable compatibility with the stabilization of async, for example. Rocket builds on stable now, while for a long time it didn't.

I'd like to see the actual percentage but AFAIK it would take some analysis. Crates.io shows 86417 total crates, searching for "stable" shows 3346 and "nightly" shows 2527... So that's not a good heuristic, probably.


It’s impossible to deep link into a PDF, but the raw survey data from last year is out: https://blog.rust-lang.org/inside-rust/2022/06/21/survey-202...

It shows that the vast majority of people use latest stable. A decent chunk also use nightly, and the reason why “a crate dependency I need requires it” is pretty far down the list of reasons.

Oh and a pet theory: the way you feel about this is correlated with the kind of work you do. If you’re writing an OS, you probably still use nightly, and so your dependencies probably do too. But if you’re higher up the stack, where unsafe is less common and the appropriate language features are more filled out, you probably never use nightly. This makes talking past people unfortunately easy to do.

(n.b. I’m glad they released the raw data, because otherwise we wouldn’t be able to share this, because it wasn’t included in the blog post this year. I brought this up before publishing but my advice was ignored, oh well!)


I believe many browser-embedded PDF readers now support `page` fragments [1]: https://raw.githubusercontent.com/rust-lang/surveys/main/sur...

[1] https://datatracker.ietf.org/doc/html/rfc8118#page-4


Oh fun! Thanks.


Back in the days, before deep links... we'd just say the page number.


Parent is saying the proportion is very high. It's not the exact number that matters. Even if one dependency your project uses needs nightly, then your entire project has to use nightly.


Less than 20% of devs use nightly. About a third of the devs who use nightly do it because of a dependency. 6-7% of Rust devs is a far cry from the “half” claimed by GP.

https://github.com/rust-lang/surveys/raw/main/surveys/2021-a...


> Less than 20% of devs use nightly.

You tried to make that sound like few Rust developers use nightly, but the same survey you linked to states that, while 7449 said they use the latest stable version for local development, 3333 said they use nightly, with 2853 of them using the latest nightly and 481 using a specific version of nightly.

All in all, Rust developers targeting nightly or beta releases represent somewhere between a quarter and a third of the whole developer community.

That's a lot of people in the Rust community delivering code targeting unstable and unmaintainable releases.


> unmaintainable releases

How did you come to this conclusion? Seems like it must be working for them, because most of them are using nightly out of choice.

Plus it’s a service to the community. Their everyday testing helps make the stable releases better. Imagine if no one used nightly - that would be much worse.


This is true, but you'd have to show that those nightly dependencies are relatively central to the Rust dependency graph (i.e., that lots of other things depend on them). To the best of my knowledge, that's not really the case (the really central dependencies in the Rust ecosystem are things like serde, libc, anyhow, etc. that all use the stable compiler).

In my experience writing Rust professionally (including making heavy use of dependencies), I've only needed the nightly compiler once. And that was for our own code, not an external dependency.


That used to be true a couple years ago, but I don't think I've used a nightly compiler version at any point in the last 12 months.

That said, I think for this first phase the kernel is planning on a specific stable version of the compiler with nightly features enabled, because they have a couple specific needs that haven't made it through the stabilization process yet.


I've not experienced this. I know it used to be very significant, but on my latest (and biggest) project that I've been working on for ~a year, I haven't run into any nightly-only packages that I couldn't easily do without.


Almost none of that ecosystem is relevant to kernel.


> Almost none of that ecosystem is relevant to kernel.

What exactly leads you to believe that?

From the Rust 2021 survey, 2420 said they use nightly to access one or more language features they need, 1028 claimed a Crate dependency requires it, and 876 claimed a tool they use requires it.

This, from a sample pool where around 3k users out of around 10k claim they use a nightly version.


Again: almost none of these can run in kernel space, and those that are required will be duplicated in-tree.

The survey results are not relevant. The Rust code must not use the standard library (the name escapes me), which uses an incompatible memory allocation mechanism.

The code used to build will likely need to be checked into the kernel git repo and not downloaded live online, I doubt that the mature development process of the kernel wants the problems that other package managers have.

I will fight to ensure that this is not the case in the kernels that I work with, because fetching dependencies is for people who can tolerate other people and networks being faulty.

I got shit to do.


Because the kernel code isn’t pulling dependency code from the internet OR a compiler that changes every day. Kernel code will be checked into the repo, dependencies if any, will be vendored. Nor will much of crates.io be relevant - the kernel will only use no_std with fallible allocations.

As for the compiler, it will use a pinned, stable compiler, not nightly.

Why are you spreading FUD about nightly + Linux on this thread? At no point has this combination ever been considered.


Would the kernel pull in any dependencies, or just reimplement everything like it currently does?


It doesn't use cargo but it does use the core and alloc crates. Some parts such as Mutex are reimplemented for interoperability with C code.


> it's just that the type of people who write in rust are bleeding edge types (for now) and have no consideration for forwards compatibility.

Feels like a pretty blanket statement. I'd assume that they'd base the minimum version on Rust's editions. An edition is released every 3 years and is meant to be a solid LTS (as far as I'm aware) version of the language. If they use Rust 2021 edition, you can tell the compiler that (like C89 vs C11) and it will reject code that's newer. C has plenty of newer stuff too that isn't allowed in kernel code at the moment as well.


Editions aren't that clear-cut: crates targeting old editions can still use most new language features, and all new library features. For example, the NLL borrow checker was originally enabled in the 2018 edition, but it's now available in 2015 as well.

Editions are more designed to manage backwards-incompatible changes, like new keywords or tweaks to corner cases in the type system.


Rust editions are nothing like C versions. They're orthogonal to language version. They're more like C trigraphs or -W flags.

All new Rust features are available from all Rust editions. The 2015 edition got new features 3 weeks ago, and will get more again in 3 weeks.

Old Rust editions are only for backwards compatibility with old source code, e.g. to let it use names that became keywords later.


> Rust changes in forwards incompatible ways so fast that rustc is literally out of date and unable to compile new Rust code in less than 3 months. It's not because Rust is inherently bad, it's just that the type of people who write in rust are bleeding edge types (for now) and have no consideration for forwards compatibility.

Funnily enough, I'm using the rustc compiler from Debian repositories (who of course is not well-known for eagerness to adopt bleeding-edge), and I've not run into any Rust code that wouldn't work with that compiler.


That's because you're not specifically using programs built with brand new rust features and then complaining that you have to update the compiler to use them.

You know, like a sane person.


> Funnily enough, I'm using the rustc compiler from Debian repositories (who of course is not well-known for eagerness to adopt bleeding-edge), and I've not run into any Rust code that wouldn't work with that compiler.

Oh hey, there's two of us ;-)

My shoulders relaxed significantly when I also put Cargo in offline mode, pointed it to the directory where Debian installs its librust-*-dev packages, and stopped downloading rapidly changing stuff off of crates.io.


I installed Debian 11 before it was released officially. Its rustc was less than 3 months old. I personally ran into 2 rust programs I couldn't compile due to compiler forwards incompatibility within weeks. One example was the software defined radio program, plotsweep, if you want to look it up in cargo.


That's an application, not a library though. There's really no reason for an application not to require the latest compiler version if they so choose. There are definitely libraries that take this approach too, but they tend to be higher-level domain-specific ones.

The foundational libraries in the ecosystem tend to take a much more conservative approach to MSRV (minimum supported rust version), supporting older Rust versions for as long as possible, and only bumping the version when new features that significantly improve the library are released (and often only once the versions implementing those features are packaged in stable releases of major distros).

For example, the `rand` crate supports Rust versions down to 1.36 (released in July 2019) [0], the regex crate supports down to 1.41.1 (released Feb 2020) [1], etc.

[0]: https://github.com/rust-random/rand#rust-version-requirement...

[1]: https://github.com/rust-lang/regex#minimum-rust-version-poli...
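Since Rust 1.56, Cargo can even enforce an MSRV declaratively via the `rust-version` field (a sketch with a made-up crate name):

```toml
[package]
name = "example-lib"     # hypothetical
version = "0.1.0"
edition = "2018"
rust-version = "1.41.1"  # cargo refuses to build with an older toolchain
```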


You're saying no true Scotsman will require the latest rust compiler. I'm saying that's not how it is for non-Rust developers trying to use things written in Rust.


I'm saying the really core libraries (the only ones the linux kernel would ever consider using) don't suffer from this problem. If your development environment is permissive enough that you can pick up a wider variety of libraries, then I can't see any reason not to use the latest compiler.


So we agree. Programs written in Rust don't last more than a couple months. It's just the core libraries that are decent.

Btw, I just attempted to install another rust program (legdur), and guess what, my rustc is too out of date to handle it ("failed to parse the `edition` key, this version of Cargo is older than the `2021` edition, and only supports `2015` and `2018` editions"). So that's 3 of 4 attempts to run Rust programs that have failed. Granted, my rustc is now about a year old, but that's still absurdly short.


IMO, this is a nonsense standard. On the one hand, you have an old compiler. On the other hand, you have a new program. Expecting an old compiler to build a new program isn't exactly reasonable, although I grant it depends on taste and the time interval. A better comparison would be to try compiling an older version of the program.

Basically, if you're in environment where you can't or won't update your compiler to something more recent, then why do you expect to be able to use recent programs? Why not use older programs in line with your old compiler?

This is what I don't get about the Debian/Centos folks. They specifically use a Linux distro that gives them old software and then complain when they can't build new programs. Well, if you're using Debian/Centos, then you're committed to old programs. So build old programs, not new programs. Either that, or go install 'rustup' and bring in a newer compiler.


> legdur

First commit 2 months ago, started with edition 2021: https://hg.sr.ht/~cyplo/legdur/browse/Cargo.toml?rev=ca11815...

Have you tried compiling something less than bleeding edge, with a year old compiler, or are you picking projects specifically to "showcase" the supposed failings of the Rust compiler?

Many libraries in the ecosystem have an MSRV (minimum supported Rust version) guarantee, with compile-time shims to enable newer features if a more recent version is detected.

You can pin your dependencies to those versions (and if they don't have an explicit MSRV, just pin it to a version by date or by running https://github.com/foresterre/cargo-msrv on the project to find the effective MSRV).

You can cargo install specific versions of a binary crate, and if they move to the 2021 edition, or use a recently stabilized standard library function or w/e, you can simply choose to install a specific version, that would work with your distro's rustc/cargo.

I'm not even talking about the completely valid, but last resort strategy of many non-bleeding edge distro package maintainers, of simply creating a .patch file and applying it. In legdur's case, --- edition = "2021" +++ edition = "2018" on Cargo.toml would probably do the trick. For libraries/binaries you control, you can use https://doc.rust-lang.org/cargo/reference/overriding-depende... and https://github.com/itmettkeDE/cargo-patch.
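The one-line patch described above would be roughly the following (it only helps if the code doesn't actually use 2021-edition-only syntax):

```diff
--- a/Cargo.toml
+++ b/Cargo.toml
-edition = "2021"
+edition = "2018"
```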

Giving up after the first minor roadblock and crying bloody murder is intellectually lazy.


> Programs written in Rust don't last more than a couple months.

They last just fine much longer than that, you may just have to upgrade your compiler if you compile from source (doing so is a single command, takes about 30 seconds max, and new versions are backwards compatible so code targeting older compiler versions will still work). Importantly, you generally do not need the Rust compiler at all if you only wish to run Rust applications. You can usually download a pre-compiled binary distributed by the program authors. Applications (not libraries) requiring a recent compiler / language toolchain and updating as and when is convenient to them is hardly unique to Rust.


Alternatively the sheer weight of the Linux kernel's influence means that the GCC implementation of Rust becomes the dominant one, altering the way people install and manage Rust.

Rust is good, but I also do not like the trend in every language and/or build system having its own package management with its own third party dependencies and its own custom installers separate from the distro. I understand the pragmatic problems this solves, but I think it has serious ramifications for reproducibility, security, auditing, etc.


The reproducibility story with system package managers is an absolute nightmare.

In general, language package management systems, containerization and fully statically linked distributions (appimage, flatpak, snaps, etc.) are all a direct response to system package managers being woefully inadequate in all sorts of ways.

And unless something like Nix takes over which actually allows reproducible builds, we'll continue to see alternative solutions continue to encroach on system package managers.


IMO distros shouldn't be packaging the world anymore, anyway. Base system from the distro, everything else from a separate, probably cross-distro package manager.

The trouble, of course, is that this is hell for a Linux desktop system, because you can't cleanly put the GUI stack in either "base system" or "everything else", especially with all the shared libs and such.


Absolutely, I understand why these systems exist, and I see the issue. The problem is their rapid propagation. There are too many of them, they live outside the purview of distribution management, and they often (... NPM, etc.) have poor auditing and moderation.

I see why they happen. I don't blame the people to make them. I just think it's a potentially awkward situation when it comes to something as core as the kernel; which has traditionally only required GCC.


Rapid propagation? Fully statically linked binaries have been a thing for decades. The practice only got broken under most Linux distros when glibc decided to make it impossible to be linked against statically.


> I just think it's a potentially awkward situation when it comes to something as core as the kernel; which has traditionally only required GCC.

Clang is starting to leapfrog GCC here. IIRC, some features are better supported and/or more advanced on Clang, other features (e.g. LTO) are exclusive to it. Not to mention it's almost a certainty for Clang to land kernel PGO support long before GCC, in fact Google already uses it in production Android kernels.

Anyway, rust or not, linux is being decoupled from GCC.


LTO is not exclusive to clang. "thin" LTO is. It just happens to be more commonly used, since it's faster and uses less memory.


> LTO is not exclusive to clang.

This is in the context of the Linux kernel, which only mainlined Clang LTO support in 5.12. AFAIK, GCC LTO has yet to be accepted.


Rust actually works fine with distros. See for example https://github.com/kornelski/cargo-deb and https://wiki.archlinux.org/title/Rust_package_guidelines

I use Arch Linux, and most Rust programs I use are installed from the Arch repositories or the AUR. Rust packages are very well integrated with the distro: they depend on distro packages and have other packages depend on them. As far as the user is concerned, the Rust build system is just developer-only stuff like CMake or autotools or ninja or whatever.

Anyway, I would like to point out that C++ also does something similar to what Rust libraries typically do, which is to use header-only libraries that don't appear as separate distro packages. It's as if every Rust library meant to be used by Rust programs (as opposed to libraries that expose a C API callable from other languages) were a header-only library. And this is actually great, because Rust (like C++) monomorphizes generics: if you call a generic function defined in another crate, the compiler generates a new function with the type parameters you supplied, and there's no way the library can know upfront which generic instantiations will happen across all programs that use it.

On the reproducibility front, I think it would be great if C programs actually did what Rust does and pinned the exact damn versions of all libraries they use (like Cargo.lock does)


Let's say for the sake of argument that I whipped up some software over the weekend. It's 9pm Sunday night and I want to publish my work so other people can play around with it. My friends being true nerds all have different opinions on which distro is the best.

Let's consider my alternatives:

1. I can (and will) publish the source and give instructions on how to build it. This works but is a bit inconvenient for the user: they have to manually install, manually check for updates, and manually install updates.

2. I can publish on a language package manager. This is a simple and easy process that can be figured out in less than 30 minutes. The user puts the name of my project into a config file and that's it.

3. I can publish on the top-5 system package managers. This will take days or maybe even weeks of labor.

So basically, it's just not viable for hobbyists. It can only work for companies that pay people a salary to deal with it.


The problem with package managers is that there are too many of them, across too many OSes (and the situation is most varied on Linux).

Having one for Rust reduces a lot of complexity (the same operation works across all OSes), and it is the best place to make decisions (Rust developers know Rust's needs; Debian's, who knows?).


Sympathetic, but isn't Linux the poster child for demanding that vendors and other maintainers keep drivers and the like up to date rather than just leaving them working for months to years? Rust is just more bleeding edge than Linux, then (which might be a substantial difference, I don't know).


Shouldn't we wait for Rust to be more stable then? I still fear that its restrictions are a bit of a fad. If it is included at one point you will hardly ever get rid of it again.


>You shouldn't need non-GCC compilers to compile Linux

Huh? What relationship do the Linux kernel and GCC have?

They're independent projects. It might be handy/nice/convenient to not need non-GCC compilers to compile Linux, but it's not like it's some license requirement or project obligation on behalf of Linux


GCC supports quite a few more architectures than LLVM. Kernels for those archs would not be able to use Rust in its current state. So this will limit Rust to platform-specific drivers until either LLVM reaches parity with GCC on the architectures Linux supports, or GCC's Rust frontend is fully capable of compiling any Rust code in the kernel.

In addition to the large maintenance effort for LLVM to support so many additional backends, there is a non-negligible number of folks who would be very opposed to being forced to use a non-GPL compiler to build a GPL kernel.

https://en.wikipedia.org/wiki/GNU_Compiler_Collection#Archit...

https://en.wikipedia.org/wiki/LLVM#Back_ends

https://en.wikipedia.org/wiki/List_of_Linux-supported_comput...


It's a weird phrasing, but I agree on the compiler topic, maybe for other reasons: an important project like Linux should not rely on a single compiler. I think it is good to have the ability to use either GCC or LLVM to build the kernel. That ensures you don't accidentally rely on a compiler bug, avoids lock-in, and so on.

For C and C++ code you want to be able to build with different compilers.

I think that also makes sense for Rust, the language and the compiler should be two separate things and there should be multiple compiler suppliers.

Are there arguments why this would be less relevant for Rust than C?


Clang has been a viable compiler for only about a decade--a third of Linux's existence--and could itself build Linux only a few years ago. Prior to clang's existence, Linus is on record as saying that he didn't care about compatibility with other compilers, which at the time would have been something like Sun's C compiler.

So not being tied to a single compiler is very much against the historical policy of the Linux kernel, even if it is a welcome change in policy.


Moving Linux and FOSS away from GCC runs the risk of enabling the mass adoption in industry of LLVM/Clang and the drying up of open source commits to either project once Clang is in a place where it does not generally require community contributions.


It should be rephrased to "Rust support in Linux shouldn't require two separate compiler toolchains", e.g. either the Rust toolchain can also compile the C parts, or the C toolchain can also compile the Rust parts (can Linux actually be built with a different C compiler than GCC yet? I thought it relies on specific GCC behaviour?)


This is not an equivalent rephrasing. Although I make some pragmatic compromises every now and then, I think the GNU mission is important, and historically Linux has been an important part of the GNU system. GCC's copyleft license is a big ideological advantage for me, and, although I like the option of using LLVM, I think being forced to use LLVM is a step backwards.


It can be built with llvm too; there was a lot of work put into llvm to support gccisms.


Linux has long been written to be compiled specifically with gcc, including the use of GNU extensions.

If other compilers implement compatible features such that they can compile the kernel, fine. But gcc has been the supported compiler.


I don't see why Rust usage in the kernel should be blocked by GCC of all things. Of course the kernel has traditionally been compiled with GCC, but there was never a ban on additional or alternative tools. There are already a number of dependencies on things like make, bison and so on.


You would care if you were on one of the dozen or so CPU architectures that would suddenly become unsupported.


Probably the Rust code will be drivers for devices that only exist on Rust-supported architectures anyway.


If the drivers are for PCI or USB devices, then they would have probably worked for those unsupported architectures "for free" if they were written in C.


By the time these kind of drivers are being written in Rust, the gcc support will be there anyway.


Also true, but there's precedent for Rust developers shipping their changes without regard for what's being deprecated. [1]

[1]: like when PyPI cryptography first switched to Rust and broke ansible/openwrt/etc.


The Linux kernel takes backwards compatibility much more seriously than your typical Python package.


The Linux kernel community and Rust community still aren't on the same page on things yet, as you'll learn by reading lwn regularly.

We're just not there yet.


Seems like Linus and Greg and a bunch of people who oversee technical decisions on the Linux kernel are on board though?


And they can back-out/revert if they decide to as well


ansible... btw, ansible breaks basically tons of stuff every release, so it hardly matters that PyPI cryptography broke ansible. ansible is already broken. Heck, they renamed their package in the most stupid way ever...


I thought we are talking about Rust? Full official Tier 1 support for aarch64 and x86, why would you even want more? That is two, and let me say it again TWO fully supported CPU architectures, that is twice as many as your average computer needs. Nobody needs dozens, certainly not someone who wants to be a Rockstar Rust dev. . /s


There are lots of cases where you want more. For example Google's secure boot for Pixel uses the Titan M security chip, which is a RISC-V architecture.

Still, looking at https://llvm.org/docs/CompilerWriterInfo.html, it is hard to find an architecture that Clang won't support that is relevant to many people. And if someone does find one, well, Clang is open source and open to adding more architectures.


Secure boot for smartphones is one of the reasons why smartphones are so restrictive and oriented against the user instead of for him. They should drop the support.


The majority of microprocessors deployed in the world are using neither of these platforms. Some of their developers like having a featureful compiler for them.


* The kernel doesn't need all those new bells and whistles that Rust wants to offer in the future, at least not at the pace they are being shipped.

* The kernel requires stability. Building a kernel w/ rustc is like building a house on a flying bullet.

* GCC is insanely good at porting to other architectures, something vendors have been relying on for a long long time.

* Consistency in applying compile flags and optimizations.

Idk, these are all I can think of right now.


> The kernel doesn't need all those new bells and whistles that Rust wants to offer

That's a bewildering statement considering that one of the bells and whistles that Rust offers is memory safety.


I mean the ones in the unstable branch. Stable Rust is just good enough, as it can perfectly reproduce the C-like coding experience with added type safety.


> at least not as fast as how they are being shipped.

Could you give some examples of big features from the last 12 months? I can’t think of any big ones off hand. IIRC, just a large number of small improvements.

> The kernel requires stability.

I can’t recall a new Rust release ever breaking existing code. Is this what you’re talking about?

> GCC

Rust will have a GCC backend and/or frontend soon.

> Consistency in applying compile flags and optimizations.

Can’t say much about this but I’m fairly sure the kernel devs won’t let anything be merged until they’re satisfied. We can trust them on this.


> I can’t recall a new Rust release ever breaking existing code. Is this what you’re talking about?

I have one case of code from late 2015 that stopped working in, IIRC, 2016, due to unsoundness being found in the borrow checker and it being fixed.

  error[E0713]: borrow may still be in use when destructor runs
     --> src/writer.rs:121:21
      |
  47  | impl<'a, F: Write + Read + Seek + 'a> Writer<'a, F> {
      |      -- lifetime `'a` defined here
  ...
  121 |         Reader::new(self.file)
      |         ------------^^^^^^^^^- returning this value requires that `*self.file` is borrowed for `'a`
  122 |     }
      |     - here, drop of `self` needs exclusive access to `*self.file`, because the type `Writer<'_, F>` implements the `Drop` trait
I ended up solving it by wrapping `self.file` in an "unnecessary" `Option` and using `Option::take` on `self.file` here. I guess I could have mucked around with `MaybeUninit` and risk UB instead, or removed the convenience `Drop` impl finalizer or made the impl non-generic.

I haven't had a case of code no longer compiling since then.


You're not doing engineering here, bro.

> Could you give some examples of big features from the last 12 months?

> ...

> I can’t recall a new Rust release ever breaking existing code. Is this what you’re talking about?

Let me show you two links:

1. Rust version history: https://github.com/rust-lang/rust/blob/master/RELEASES.md

2. GCC changes: https://gcc.gnu.org/gcc-12/changes.html#c-family

Compared to Rust, GCC, the C compiler, is absolutely boring. The C language has been stable for decades, so there's hardly anything to be done on the language itself. GCC has been used in all kinds of settings and has compiled all types of applications and system software. We know it works in every situation. There's hardly anything left to do. This is what real stability looks like - boring as shit.

Now, what, 12 months? That number simply isn't anywhere close.


Or maybe this then gives people the motivation to work on GCC; the chicken-and-egg problem has to be solved somehow.


This should really motivate work on the Rust frontend for GCC, though


> You shouldn't need non-GCC compilers to compile Linux

Why?


Do you really need to fix ALL the GCC Rust issues? The Linux kernel could disallow Rust features that are known not to work in GCC.


Okay but think of it this way... maybe this is creating more demand for a GCC Rust compiler. Maybe it finally gives some people the excuse or funding to do what they need to do.


Is there anything in the Linux kernel that can't be compiled with clang? There is plenty of experience with building BSD variants using it.


Historically, the answer to that was "yeah totally." But folks have put a lot of work into getting clang to build the kernel, and IIRC Google has been building their Android kernels with clang for a few years now.

~~It's not~~ (see below) It wasn't supported directly by upstream, and in fact historically was pretty much the opposite: the intention was to not care about portability between compilers at all.


I think it is fully supported now: https://docs.kernel.org/kbuild/llvm.html


Thank you! That's more than I knew of.

(I do think that what "support" means is kind of amusing, I would also say that this counts as support, but above you have people arguing that only Rust's Tier 1 support is called "support"...)


It's one thing to say developers have to add a new compiler front end if they want to compile some drivers, and that some developers will just have to add it regardless.

It is wholly another to say that they need to adopt a whole different code-generation regime. It adds risk, and it adds to what they need to understand about what they are building. A pro needs to understand where problems come from, and be equipped to dig down to them.

So, OK for hobbyists, but a burden for pros. It is the difference between "neat project, might succeed" and "mature". Rust stands a good chance of becoming mature in a few years, more than can be said about almost everything else.


We've known for some time now that the future would be written in Rust. If you do not have a Rust compiler in addition to a C compiler on your system, now is the time to consider installing one.


Nope, everybody knows that the future will be coded in Jakt.

Jokes aside, I bet Jakt (a completely new language) has introduced fewer language and compiler changes in the last 3 months than Rust (5-10 years old now?).


Jakt has introduced far more language and compiler changes in the last 3 months, don't be silly. Like the entire thing has been written in Rust first in the last 3 months, and then was rewritten in Jakt. Those are massive changes!


But how many language spec changes have there been during this time?


Probably all of them? Jakt repo isn't even 3 months old.


Jakt still relies on Rust to compile the current version of the language, and the language spec is frozen until the Jakt compiler becomes self-hosting. Even so, the spec is far from finished and the language will probably change weekly for weeks to come once the compiler has been finished.

Rust's compatibility problems are quite irrelevant in my opinion. You can pin a language version and future compilers will still run your code as long as you don't use any explicitly unstable features (so no nightly compilers and no manually enabling unstable features). Yes, Rust moves faster compared to C's glacially slow language development, but almost every language does these days. Even C comes out with new versions every few years and C compiler versions get deprecated all the time.


> You can pin a language version and future compilers will still run your code as long as you don't use any explicitly unstable features

Rust's editions are not language versions. They set aside syntax: so far there have been the 2015 edition (the original Rust 1.0 syntax), the 2018 edition (a few tweaks, introducing raw identifiers, which let you name symbols that conflict with keywords), and most recently the 2021 edition (which adds a hack to the older editions so that arrays don't seem to be IntoIterator in the 2015 or 2018 editions, even though they actually are now).

C++ ships language versions on its three-year cadence. C++14, C++17 and C++20 are three very similar yet distinct languages, while Rust's 2015, 2018 and 2021 editions are simply different syntaxes for the same language, Rust. You can write brand new code, today, in the 2015 edition, and use features that didn't exist six months ago, no problem. You just can't use keywords (like async) that hadn't been invented in 2015, and thus you can't directly use features gated on those keywords.

MSRV (Minimum Supported Rust Version) is a different idea, it says this software needs some features which weren't in Rust until this version. For example if you write 2018 edition, that doesn't magically mean you used async, but on the other hand, if you did require async then no stable Rust versions from 2018 had working async, the keyword was reserved but not useful. So your MSRV might be much newer even though your program syntax is 2018 edition. Documenting MSRV is not a mandatory practice, although it's good to do it for important stuff or where the minimum version is likely to be a surprise.
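For what it's worth, both knobs live in Cargo.toml. A minimal sketch (crate name and versions are made up; `rust-version` is the field Cargo uses for declaring MSRV):

```toml
[package]
name = "example"       # hypothetical crate name
version = "0.1.0"
edition = "2018"       # syntax edition: how the code is parsed
rust-version = "1.56"  # MSRV: oldest toolchain known to build this crate
```

Note how the two fields are independent: an `edition = "2018"` crate can still require a much newer compiler via `rust-version`.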



