Not trying to be snarky. Honest question: Why use C11 over C99? Or even why use C99 over C89? What significant advantages do the new standards provide that cannot be done in plain old C89?
C11 gives you noreturn and alignas. Alignas can be pretty useful for low-level development in particular. Just hope you don't need variable-length arrays because those got changed to optional.
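For anyone who hasn't used them, here's a minimal sketch of both; the `die` helper and the `dma_buf` name are made up for illustration, and the `noreturn` / `alignas` convenience macros come from `<stdnoreturn.h>` and `<stdalign.h>`:

```c
#include <stdalign.h>    /* the alignas convenience macro */
#include <stdnoreturn.h> /* the noreturn convenience macro */
#include <stdio.h>
#include <stdlib.h>

/* The compiler now knows control never comes back from here. */
static noreturn void die(const char *msg)
{
    fprintf(stderr, "fatal: %s\n", msg);
    exit(EXIT_FAILURE);
}

/* Made-up DMA buffer that must start on a 64-byte boundary. */
static alignas(64) unsigned char dma_buf[4096];

int main(void)
{
    if (sizeof dma_buf != 4096)
        die("unexpected buffer size");
    printf("dma_buf lives at %p\n", (void *)dma_buf);
    return 0;
}
```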
> Or even why use C99 over C89?
Several very big things: Native bool, stdint.h (fixed-width int types with known sizes ahead of time), long long, snprintf, not having to declare all variables at the top of the block (and now you can do for (size_t i = 0; i < sizeof(strbuf); ++i) because of it).
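A quick sketch showing most of those in one place (the variable names are arbitrary, of course):

```c
#include <inttypes.h>  /* PRIu32 and friends */
#include <stdbool.h>   /* bool, true, false */
#include <stdint.h>    /* uint32_t and the other fixed-width types */
#include <stdio.h>

int main(void)
{
    char strbuf[32];
    uint32_t checksum = 0;            /* known-width type from stdint.h */
    bool ok = true;                   /* native bool via stdbool.h */
    long long big = 1234567890123LL;  /* long long is standard in C99 */

    snprintf(strbuf, sizeof strbuf, "big=%lld", big);  /* bounded sprintf */

    /* loop counter declared right in the for statement */
    for (size_t i = 0; i < sizeof(strbuf) && strbuf[i] != '\0'; ++i)
        checksum += (unsigned char)strbuf[i];

    printf("%s ok=%d checksum=%" PRIu32 "\n", strbuf, (int)ok, checksum);
    return ok ? 0 : 1;
}
```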
You don't need any "significant" advantage. Even a very small advantage ("an anonymous struct would be handy here") is enough: why would you _not_ use it when it's free? For the fun of the constraint? I'm not a C expert, but I don't think there's any downside to using the C11 standard compared to C99.
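For what it's worth, the anonymous-struct case really is that small and that handy; a hedged sketch, with the `vec2` union just an invented example:

```c
#include <stdio.h>

/* C11 anonymous struct: the members of the unnamed struct are
 * accessed directly as v.x / v.y rather than v.pt.x / v.pt.y. */
union vec2 {
    struct {
        float x, y;
    };              /* no member name needed in C11 */
    float raw[2];
};

int main(void)
{
    union vec2 v = { .raw = { 1.0f, 2.0f } };
    printf("x=%g y=%g\n", v.x, v.y);
    return 0;
}
```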
The downside is portability: the number of platforms with a C11 compiler is lower than the number of platforms with a C89 compiler. A lot of popular projects known for their high portability are C89 for this reason.
There's also the fact that there are far more compilers for C89, and it is easier to write one than for the newer standards. This becomes important if you are interested in avoiding Ken Thompson attacks.
Personally, I still stick to C89. The only newer feature that I've found to be useful is mixed declarations and statements, but it's no big loss: going without it avoids the "variable proliferation" that some codebases seem to be afflicted with, and blocks can be used to start a new inner scope if you really need a new set of declarations anyway.
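For anyone unfamiliar with the inner-scope trick, a small sketch of what I mean (the `count_newlines` function is just an invented example), all valid C89:

```c
#include <stdio.h>
#include <string.h>

/* Plain C89: declarations sit at the top of each block, and an
 * inner block opens a fresh scope when new locals are needed. */
static unsigned long count_newlines(const char *buf)
{
    unsigned long i;
    unsigned long count;

    count = 0;
    for (i = 0; i < strlen(buf); i++) {
        if (buf[i] == '\n')
            count++;
    }

    {
        /* new inner scope, new declarations, still valid C89 */
        const char *last = strrchr(buf, '\n');

        if (last != NULL)
            printf("last newline at offset %ld\n", (long)(last - buf));
    }

    return count;
}

int main(void)
{
    printf("%lu\n", count_newlines("one\ntwo\nthree"));
    return 0;
}
```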
I'm so glad not to be the only weirdo out there just sticking to plain old C89. I concur with all your reasoning. C89 is simple, readable, and gets the job done.
My only pain point is indeed stdint.h. Though it's usually available even where it's not standard per se.
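The usual workaround is a sketch along these lines; `HAVE_STDINT_H` is a hypothetical configure-style macro, and the fallback typedefs obviously assume particular type sizes on the target:

```c
/* Sketch of a common C89 workaround: use <stdint.h> when the
 * toolchain ships it, otherwise fall back to typedefs that match
 * the sizes you know your targets have. HAVE_STDINT_H is a
 * hypothetical configure-style macro, not anything standard. */
#ifdef HAVE_STDINT_H
#include <stdint.h>
#else
typedef unsigned char  uint8_t;
typedef unsigned short uint16_t;
typedef unsigned int   uint32_t;  /* assumes a 32-bit int on this target */
typedef signed   char  int8_t;
typedef short          int16_t;
typedef int            int32_t;   /* same assumption */
#endif
```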
They're actually called trusting trust attacks (the original paper on the topic is "Reflections on Trusting Trust" if you want a guaranteed search term); I'm not sure why userbinator used an eponym instead.
I'm also not sure why they would be relevant for a general project, since the source language being easy to write an alternate compiler for only matters for the compiler itself: once you have a non-infected compiler, you can bootstrap gcc or whatever and compile everything else at whatever C standard you like.
When you start using new features you break compatibility with older compilers. For C11 that means distros as recent as Ubuntu 10.04 (which I still use as my main desktop) and the like are going to have problems compiling it (the GCC it ships with only supports parts of it, as C1X). This will also apply to older embedded systems where a tiny client would be useful.
In the past a compiler and its ecosystem would last a decade before it couldn't compile something. These days changes come out, and get used, every three years. It's future shock, and a major cause of container usage on the desktop and in academia. Sticking with a well-established older standard means everyone can avoid the massive increase in complexity and problems that containers bring.
> That must have horrible security implications, surely?
Let's just say it's a matter of taste. I keep my attack surfaces to a minimum and backport what I can, "patch and statically compiled deps for userspace"-wise. On the other hand, I browse the web with JavaScript disabled, so my old box probably has fewer "horrible security implications" than a completely up-to-date distro with the user blindly executing all code they're sent. Security is behavior more than software.
Older standards typically have a larger pool of people who can contribute because the standard has been around longer.
A programmer might have more experience with an older standard due to the length of time it has been out, or because the toolchain they use elsewhere (personal projects, embedded comes to mind, or work) hasn't updated to the new standard.
Coming up to speed with the new standard is not free. The tooling may be free for the most common targets (embedded usually lags), but taking the time to learn isn't free.
It certainly is free. The standards are generally backwards compatible and the changes are simple. You do not even need to be aware of the differences between C89 and C11 to contribute to a C11 project.
> or because the toolchain they use elsewhere (personal projects, embedded comes to mind, or work) hasn't updated to the new standard.
I'm currently porting some "C99"-ish code, and I had a few instances where I would have loved to just use C11's `_Generic` to replace a macro hell with statement expressions and accompanying `typeof()`s all over the place. Fun fact: `typeof` is a GNU extension, and the target compiler doesn't have it.
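For context, this is roughly the shape of what `_Generic` would have bought me; `my_abs` and the `abs_*` helpers are invented for illustration, not the actual code I'm porting:

```c
#include <stdio.h>

/* Made-up illustration: a type-dispatching "abs" with no GNU
 * statement expressions and no typeof(), just C11 _Generic. */
static int    abs_i(int x)    { return x < 0 ? -x : x; }
static long   abs_l(long x)   { return x < 0 ? -x : x; }
static double abs_d(double x) { return x < 0.0 ? -x : x; }

#define my_abs(x) _Generic((x), \
        int:    abs_i,          \
        long:   abs_l,          \
        double: abs_d)(x)

int main(void)
{
    printf("%d %ld %g\n", my_abs(-3), my_abs(-4L), my_abs(-2.5));
    return 0;
}
```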
C99 over C89: designated initializers and compound literals are the biggies, plus all the small accumulated improvements that had been added to C during the '90s (e.g. variable declaration anywhere, for (int ...), // line comments...)
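A quick sketch of the two biggies, in case anyone hasn't run into them (`print_point` and `lut` are made-up names):

```c
#include <stdio.h>

struct point { int x, y; };

/* made-up helper that takes a struct by value */
static void print_point(struct point p)
{
    printf("(%d,%d)\n", p.x, p.y);
}

int main(void)
{
    /* designated initializer: name the members, order doesn't matter */
    struct point origin = { .y = 0, .x = 0 };

    /* compound literal: an unnamed struct built right at the call site */
    print_point((struct point){ .x = 3, .y = 4 });

    /* designators work for arrays too */
    int lut[8] = { [0] = 1, [7] = 128 };

    print_point(origin);
    printf("lut[7]=%d\n", lut[7]);
    return 0;
}
```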