
I am not angry, just very sad, but I guarantee you that most people who love to program in C, or do it for a living, are not "standards enthusiasts".



I am very happy about #embed, auto, constexpr, static_assert, unreachable, and ditching of K&R function parameters. Aren't you?


#embed is a great feature and should have been added long ago.

Knowing how much memory your numbers take up is important for many applications, so I find things like "auto i = 5" to be questionable.

Fancy compound literals seem like a solution in search of a problem.

Removing ancient unused misfeatures is good.

I don't have strong feelings about the rest. But I think people are reacting to the process more than the specific features. There's always a good reason for new features -- that's how feature creep works. Over time, adding a few features here and a few features there is how you go from a nice, simple language like C to a gargantuan beast like C++. C has around 35 or so common keywords, almost all of which have a single meaning. C++ has many more keywords (80+?), many of which have overloaded meanings or subtle variations in behavior, or that express very fine distinctions -- const vs. mutable or static/dynamic/const/reinterpret_cast, for instance. All of this makes C++ a very large language that is difficult to learn.

In a world of constant feature additions and breaking changes in programming languages, C is largely the same language it was in 1989. For some applications, that's a good thing, and there isn't another language that fills that niche.


> Knowing how much memory your numbers take up is important for many applications, so I find things like "auto i = 5" to be questionable.

Automatic variables[0] don't take up any defined amount of memory. They certainly don't take up exactly sizeof(i) memory.

Struct members and global variables are more likely to do what you say; in that case it will either not be allowed or will be sizeof(int). Conversely, `long i = 5` involves two different sizes (long vs. int), which could be a latent bug.

[0] aka local variables, not `auto` variables


auto, constexpr: Kill them with fire.

unreachable: no. The optimizing compiler can f*#k up my code if you combine it with unintended undefined behaviour. I just use assert(0) in debug builds. Not kidding.


unreachable() is quite an important optimization hint (note how the 'blub()' function drops a range check because of the unreachable() in the default branch):

https://www.godbolt.org/z/Ph8PY1drc


I know, but if you have a bug that reaches the unreachable path, the compiler can't tell you anything. That is why I use assert(0) in debug builds.


And you can easily do a macro check and define a custom thing that's either assert(0) or unreachable() depending on the build type. But you still need unreachable() to exist to do that. (and under -fsanitize=undefined you get an error for unreachable() too)


And you can effectively do both:

    #include <assert.h>

    #ifdef NDEBUG
    #  define UNREACHABLE()   unreachable()
    #else
    #  define UNREACHABLE()   assert(0)
    #endif
That's what I have been doing for years, except with __builtin_unreachable()... and __assume(0) if I bothered to make it work under MSVC.


And why is this not the default behaviour?

I am pretty sure many users are going to think it is a correctness check and not an optimization attribute.


I'd rather not have a basic feature put behind an arbitrary NDEBUG define; having to build your debugging setup around NDEBUG would not fit some projects -- e.g. in my case I'd end up always defining NDEBUG and continuing with my own wrapper around things. With <assert.h> you have the option of writing your own thing if you don't like its NDEBUG check, which is what I do; with unreachable(), if you literally cannot get its functionality other than via NDEBUG, you're stuck with NDEBUG.


Because C is a "do exactly as I say" language.


It can, just turn on UBSan.


unreachable() is just the standardized form of __builtin_unreachable() (gcc/clang) and __assume(0) (MSVC).

I often have a macro called UNREACHABLE() that evaluates to assert(0) or __builtin_unreachable() depending on NDEBUG.

It improves the generated code a bit.

One trick one can use is to define ASSERT() as a wrapper around assert() or something like

    #define ASSERT(x)  do { if (!(x)) unreachable(); } while (0)
This is a really nice way to tell the compiler about invariants -- and generate better code (and better warnings!).

There are no fuck ups involved. None.

constexpr is great because it reduces the need for macros. There are three features that make almost all macro use unnecessary. They are enums, inline, and constexpr. Good enough inline support has only really been available for a few years -- by "good enough", I mean "doesn't slow down CPU emulator code".

Things are really looking up for C, in my view.


What's wrong with constexpr?


It's a crippled form of code generation in an era where compile-time execution is supported by many modern systems languages.

But I don't like any of them. I prefer to write my own code generators.


C doesn't have that version of constexpr. In C2x, constexpr is just a way to define constants, like

  constexpr unsigned long long kMyBigNum = 0x1234123412341234ull;
Previously, you had to #define. Using an enum causes problems when the value doesn't fit in an int. And const doesn't mean the right thing:

  const int kArraySize = 5;

  void MyFunction(void) {
    int array[kArraySize]; // NO!
  }
The above function will work if you have VLAs enabled, or if your compiler specifically allows for it. It's nice to have a standardized version that works everywhere (VLAs don't work everywhere).


constexpr in C is essentially what const should have been; it's not as "powerful" as the C++ version.



