So we're gonna be stuck writing a precambrian prototype language till the end of time because there's so much legacy code already written in it? That never seemed to stop people moving on from Pascal, or Perl, or any of the other languages that are now obsolete.
I really hate how for microcontrollers the only two choices are either C++ or MicroPython. How about some fucking middle ground instead of two polar opposites? At least eventually everything will be rewritten in Rust, I guess.
> I really hate how for microcontrollers the only two choices are either C++ or MicroPython
Why wouldn't you just use C for programming a microcontroller? Sure, it's not a great language for web backends, but microcontrollers are where it shines. You're probably not deploying 100,000 lines to a microcontroller for a personal project, so the lack of certain abstractions isn't going to be that painful. On the other hand, C lets you make the latency and memory usage 100% predictable, which can be a great asset.
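To make the predictability point concrete, here's a minimal sketch (the UART ring buffer is hypothetical): everything is statically allocated, so worst-case RAM usage is visible in the linker map, and the ISR path is constant-time with no allocator to fail at runtime.

    #include <stdint.h>

    #define RX_BUF_SIZE 64

    /* Statically allocated: total RAM usage is known at link time. */
    static uint8_t rx_buf[RX_BUF_SIZE];
    static volatile uint8_t rx_head;
    static volatile uint8_t rx_tail;

    /* Called from a (hypothetical) UART ISR: constant time, no heap. */
    void rx_push(uint8_t byte)
    {
        uint8_t next = (uint8_t)((rx_head + 1u) % RX_BUF_SIZE);
        if (next != rx_tail) {      /* drop the byte if the buffer is full */
            rx_buf[rx_head] = byte;
            rx_head = next;
        }
    }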
Why wouldn't you use assembly for programming a microcontroller? Sure, it's not a great language for web backends, but microcontrollers are where it shines. /s
Because as the OP states, it's an objectively (pun intended) terribly abstracted language. There is nothing 100% predictable about C except that you'll eventually get screwed because you didn't account for some random obscure thing that should never even have been possible in the first place. Any language that allows static variables can have predictable memory consumption. There is nothing inherent to C that makes it better than a language that works at the same level but is built to modern standards, except the piles upon piles of legacy code you can use.
Enable the max warning level, use a static analyzer, and run ASAN, UBSAN and TSAN (in order of importance), and most of the problems you listed just disappear. Most importantly though: don't use MSVC if you have the choice.
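A sketch of what the sanitizers buy you (the flags are the usual GCC/Clang spellings; note TSAN needs its own build, since it can't be combined with ASAN):

    /* demo.c -- build with something like:
     *   cc -Wall -Wextra -g -fsanitize=address,undefined demo.c
     */
    #include <limits.h>
    #include <stdio.h>

    int main(void)
    {
        int x = INT_MAX;
        x += 1;          /* signed overflow: UB, but UBSAN reports it at runtime */
        printf("%d\n", x);
        return 0;
    }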
Yeah, if you want to kill yourself from frustration, maybe. I'm not writing microcontroller code for the fucking space shuttle, and I would suspect most people aren't.
C did a ton of things right, but it also did a ton of things wrong. Learning from that and moving on would be the sensible thing to do after 50 years.
> Yeah, if you want to kill yourself from frustration, maybe. I'm not writing microcontroller code for the fucking space shuttle, and I would suspect most people aren't.
You're really exaggerating the problems. Does your negative opinion of C come from experience, or did you listen to the Rust evangelists who have an incentive to make the difficulty appear bigger than it is? Because it hasn't been my experience that C is this huge minefield of bugs that are impossible to explain or debug. You prevent a lot of bugs by actually understanding the language instead of coding by trial-and-error, the remaining bugs usually get caught quickly if you use an advanced compiler like GCC or Clang with the right flags (warnings and sanitizers), and for the occasional bug that slips through, the debugger tends to be helpful.
It's true that C has a bunch of historical footguns like gets and strcpy that you need to avoid. It's a very bad language to learn by trying random things and seeing what works. However, it's possible for a "mere mortal" to write good code. You just need to do more up-front learning than you could get away with in e.g. Python. If you pick a good book and listen to experienced programmers, they will tell you what to do and what to avoid.
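For instance, the standard advice for the string functions is to use the length-checked variants; a quick sketch:

    #include <stdio.h>

    int main(void)
    {
        char line[128];
        char copy[128];

        /* gets() had no bounds check and was removed in C11;
         * fgets() takes the buffer size and can't overflow. */
        if (fgets(line, sizeof line, stdin) == NULL)
            return 1;

        /* strcpy() trusts that the destination is big enough;
         * snprintf() truncates instead of overflowing. */
        snprintf(copy, sizeof copy, "%s", line);

        printf("%s", copy);
        return 0;
    }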
And regarding abstraction—you can go very far with just structs and pointers, but you have to do things the C way rather than trying to write Java in C. If it's enough for Linux devs and their millions of lines of code, it will be enough for your personal microcontroller projects.
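To illustrate "the C way" (a made-up example): an opaque struct plus a few functions gives you encapsulation without any OO machinery, and it's the pattern most C libraries use.

    /* counter.h: callers only ever see an opaque handle. */
    typedef struct counter counter;

    counter *counter_new(int start);
    void     counter_inc(counter *c);
    int      counter_get(const counter *c);
    void     counter_free(counter *c);

    /* counter.c: the struct layout stays private to this file. */
    #include <stdlib.h>

    struct counter { int value; };

    counter *counter_new(int start)
    {
        counter *c = malloc(sizeof *c);
        if (c) c->value = start;
        return c;
    }

    void counter_inc(counter *c)       { c->value++; }
    int  counter_get(const counter *c) { return c->value; }
    void counter_free(counter *c)      { free(c); }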
There is a very promising contender in the low-level space that aims to fix some of C's problems: a new language called Zig. However, it's at a pretty early stage; even if it catches on, that's still many years away. Right now, if you want to do low-level work, you'll benefit from becoming good at C.
Tell me an alternative that ticks all the checkboxes and I'll switch immediately. C++ isn't it because the committee has completely lost focus since ca. C++11, Rust isn't it because they completely forgot about ergonomics, simplicity and elegance on their quest to fix memory safety (and both C++ and Rust suffer from "design by committee").
Zig looks perfect so far, but it's too early to switch over yet.
Or D itself, if you don't need a language as minimal as C. D is basically C++ redesigned, and now that GCC includes D support by default, I wonder whether it'll gain popularity.
Definitely an option, and D is actually one of the languages I haven't seriously looked into yet (or rather, I saw it as a C++ alternative in its heyday ca. 2005 and that image stuck in my head - and at that time I hadn't been looking for a C++ alternative).
PS: my main use of C is currently to write platform abstraction libraries with minimal size and runtime overhead, so I need to talk directly to operating system APIs, plus WASM is a very important target. The libraries must be usable from other languages via automatic bindings generation (quite simple with a C API). Also, for performance-oriented stuff: direct control over memory layout and lifetimes, please.
Also personal opinion from 20 years of C++ experience: high level abstractions never pay off in the long run. Simple imperative code always wins when it comes to "malleability".
Interesting choice, but Ada is probably even less popular than Zig.
Even just requiring users to integrate my hypothetical Ada library source distribution into their project's build system files would most likely drown me in support tickets ;)
> There is nothing predictable about C except that you’ll eventually get screwed …
This has been the exact opposite of my experience. I’ve been writing C for 10 years and have yet to find a piece of code where I was surprised at what it did. That’s one thing I love about C: it is entirely predictable. If it isn’t, my code is wrong. The language is rigorously specified. It is not hard to avoid undefined behavior.
Contrast that with languages like C++ or Python, which hide gotchas all over the place. In Python, one cannot even rely on a variable being a certain type, and if it isn’t, the program explodes. C++ allows plus to not be the inverse of minus, and allows for hidden custom memory allocators (overloading the new operator). Template metaprogramming is borderline sorcery past the simplest of use cases. C++’s interoperability with C is an accident waiting to happen, with all the reallocations that can occur without the user being aware.
C lays out in front of the programmer all the unpredictable behavior that many other languages implement behind the programmer’s back. Sometimes that’s not desirable, and sometimes it is.
I agree with your point about Python, which is why I'm glad type hints are seeing adoption but dismayed that they're essentially fancy comments that don't enforce the actual runtime types.
The thing is, I'm not convinced avoiding UB is easy. E.g. what's the behavior of the following code?
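Something along these lines (the function name and values are just illustrative, picked so the sum exceeds INT16_MAX):

    #include <stdint.h>

    int16_t add(int16_t a, int16_t b)
    {
        return a + b;   /* e.g. a = b = 20000 */
    }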
Agreed on the dismay regarding type annotations. My opinion is that potentially misleading code that gives a sense of safety when none exists is worse than dangerous code. It lowers the programmer’s guard, which can lead to more bugs.
Integer overflow will result, I’m pretty sure. The largest value a signed 16-bit integer (so, 15 value bits) can hold is 32767, IIRC.
I can see where that’s unexpected for people whose brains aren’t wired in powers of 2. This is one area where I think Rust improves upon C, with its availability of overflow detection in arithmetic. It’s unfortunately verbose, but it enables greater safety.
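For what it’s worth, the C compilers grew something comparable: GCC and Clang have checked-arithmetic builtins (and C23 standardizes the idea as ckd_add in <stdckdint.h>). A sketch:

    #include <stdio.h>

    int main(void)
    {
        int a = 2000000000, b = 2000000000, sum;

        /* Returns true on overflow and never invokes UB itself. */
        if (__builtin_add_overflow(a, b, &sum))
            puts("overflow detected");
        else
            printf("%d\n", sum);
        return 0;
    }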
Not quite what I was getting at: on an implementation with 32-bit ints, the code is valid – the values get promoted to 32 bits, added, and then truncated to 16 bits. Yet on a platform with 16-bit ints (and microchips & unusual platforms are a frequently stated reason for using C), the addition overflows and results in UB.
Luckily, most other languages haven't decided to copy C's implicit promotion rules & target-dependent integer sizes.
Given that all arithmetic auto-promotes to int if the operands are smaller than int, there's no undefined behavior in this code if int is 32 bits (which is true on most systems).
Embedded Rust has been a viable option for at least 4 years now, and especially so for the past 2 years. I really dislike having to learn the quirks of building, configuring and navigating typical embedded C projects. They always seem to have an excessive number of tiny files (in various languages) all over the place, with obscure heuristics only the original authors know about. IMO, to build anything new your only reasonable option is to blindly copy and paste an example project and hack away. I’ve never been able to “start from scratch”.
An embedded Rust project is the same as a normal Rust project, except that you mark it as not linking the standard library with #![no_std] and you define a main entry point and panic behaviour (there are helper crates for this).
You can still use the core and alloc crates, which give you pretty much everything you need in an embedded system, like strings and vectors. You also get to use modern tooling like VS Code and rust-analyzer instead of a different antiquated version of Eclipse for each hardware vendor.
I don’t think that Rust should only be used for big projects. You can use it for small projects and you really don’t need to get complicated with generics for application code. You need to put in the effort to get a fundamental understanding about what the borrow checker is trying to achieve and the rest may be easier than you think.
While it seems Rust supports ARM devices like the Cortex-M0 and M4, and of course more powerful chips like those capable of running Linux, there are huge swathes of chips that it doesn't support, like the 8051, PIC, etc.
"Seems" is an outside perspective. There are loads of hardware features that it just doesn't support on various boards, and lots of extra hardware (like sensors) that it has no libraries for. It's not just the MCU/CPU that matters here.
> So we're gonna be stuck writing a precambrian prototype language till the end of time because there's so much legacy code already written in it?
Yes. Unless somebody steps up and rewrites everything in Rust or Lisp or whatever, that's exactly what's going to happen. Lack of backwards compatibility with existing software will condemn programming languages to irrelevance on day one.