Completely wild speculation...

I think there are a number of disenfranchised "close to the metal" programmers. C has stagnated and isn't likely to move much, C++(++++...) has reached a bonkers level of complexity, D is... D who, and Rust is highly opinionated, which is a double-edged sword.


D has garbage collection (it can be avoided, though perhaps not everyone is aware of that, or maybe avoiding it loses too many of the language's benefits), which I think some game devs want to avoid.

Rust is complicated by its very strict memory safety. While that's nice, I don't think memory safety is game developers' #1 priority. A crash is annoying but acceptable.

Zig also doesn't assume a global allocator anywhere (malloc/free), and I think game developers often like to use custom allocators for performance reasons.
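
For instance, a bump/arena allocator of the kind game code often rolls by hand - just a minimal sketch of the general idea, not Zig's actual allocator API:

    #include <stdio.h>
    #include <stdlib.h>

    /* One big block up front; allocation is a pointer bump, and
       everything is "freed" at once, e.g. at the end of a frame. */
    typedef struct {
        unsigned char *base;
        size_t cap, used;
    } Arena;

    static void *arena_alloc(Arena *a, size_t size) {
        size = (size + 15) & ~(size_t)15;          /* keep 16-byte alignment */
        if (a->used + size > a->cap) return NULL;  /* out of space */
        void *p = a->base + a->used;
        a->used += size;
        return p;
    }

    int main(void) {
        Arena frame = { malloc(1 << 20), 1 << 20, 0 };
        float *verts = arena_alloc(&frame, 1024 * sizeof *verts);
        if (verts) printf("%zu bytes used this frame\n", frame.used);
        frame.used = 0;    /* reset: frees everything in O(1) */
        free(frame.base);
        return 0;
    }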


Nitpick: With memory (un)safety, a crash is the optimal outcome when something goes wrong. The real problem is when it doesn't crash and the program continues on thinking everything is fine...


I'm quite amazed how resilient games are to this. Like how SMB1 for the NES is capable of things like the minus worlds, where you're more or less playing levels that don't exist because the game doesn't expect you to be reading memory past a certain point, but as long as the values are valid, it can still use them to some degree.

There are more extreme situations, like Super Mario Land 2 for the Game Boy, where it's possible for the game to present its own memory on the screen, and if you understand how it works you can do things like ask the game to execute the ending sequence [0]. At its most extreme, a series of TASes was presented in 2017 wherein multiple different hardware systems were linked together via RCEs in different games to make a VoIP streaming setup out of a bunch of consoles that don't actually have internet connections [1].

I'm sure there's all kinds of reasons this kind of UB is undesirable, and could be hazardous in some environments, but people have also done some really cool stuff with it!

[0]: https://www.youtube.com/watch?v=faiZgO35YaQ&t=0s [1]: https://www.youtube.com/watch?v=7CgXvIuZR40


The resilience is because (1) those titles ran on systems with no MMU, so a wild read just returns whatever happens to be at that address instead of faulting, and (2) there was no heap and no malloc implementation, therefore no way to get a use-after-free. Neither of these points applies to modern systems, including consoles.


>"C++(++++...) has reached bonkers level of complexity"

It has and it has not. It can offer a level of complexity matching that of the programmer. One does not have to start coding with concepts as the introductory course to programming. Take some numbers, stuff them into an array, sort, and print - that's a simple piece of cake anyone can grasp.


C99, C11, C17, with C23 on the horizon.

The only thing stagnant about WG14 is their attitude towards providing safer string and vector types, and enumerations.


Would you mind mentioning any noticeable improvements in those versions? Because C seems to have remained exactly the same for decades now. I only know of the _Generic thingy, which is hardly ever used, and maybe variable-length arrays?
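
For reference, the _Generic thingy dispatches on the static type of an expression at compile time - a minimal sketch:

    #include <stdio.h>
    #include <stdlib.h>
    #include <math.h>

    /* C11 _Generic: pick the right abs() flavor based on the argument type. */
    #define my_abs(x) _Generic((x), \
        int: abs,                   \
        long: labs,                 \
        float: fabsf,               \
        double: fabs)(x)

    int main(void) {
        printf("%d %f\n", my_abs(-3), my_abs(-3.5));
        return 0;
    }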


C adding only a few new features once every decade or so is actually a feature, not a problem (IMHO of course). C99 was the last big release, while C11 and C17 were mostly minor course corrections and spec cleanup (VLAs have been downgraded to 'optional' for instance, which makes a lot of sense, because they shouldn't have gone into the standard in the first place). So far, C23 seems like a bigger release again though.
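
For the curious, a quick sketch of the VLA feature in question - the array length is a runtime value and the storage lives on the stack, which is exactly what makes it easy to misuse:

    #include <stdio.h>

    static double average(size_t n, const double vals[n]) {
        double sum = 0.0;
        for (size_t i = 0; i < n; i++) sum += vals[i];
        return sum / (double)n;
    }

    int main(void) {
        size_t n = 4;            /* not a compile-time constant */
        double v[n];             /* the VLA: stack-allocated, length n */
        for (size_t i = 0; i < n; i++) v[i] = (double)i;
        printf("%f\n", average(n, v));
        return 0;
    }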


With no ill intent, could you please tell me why anyone would start a new program in C? I am seriously interested in that.

It of course has to remain with us due to the insane amount of code already written in it and some tooling (and e.g. for verification, certain subsets of C), but it is not any closer to the metal than other systems programming languages, the "standard lib" is full of foot guns, it has a very weak type system, and it has no way of enforcing basically anything. Also, due to its lack of expressivity it relies on text-based macros (and those I hate with a passion, as there is hardly anything worse than text-replacing source code with no knowledge of syntactic elements).


Languages are taken way too seriously, and there are way too many bad languages out there that try to dictate the shape of your solutions. A good language shouldn't do that, in my opinion, except for boring and "solved" problems. The problem with languages (apart from LISP, maybe) is that you can't abstract over their syntax - so while languages should give you useful building blocks to implement a solution, they should otherwise stay out of your way as much as possible.

To me, C is the easiest way to interface with the platforms that I develop for. It's also totally sufficient to implement the ideas that I'm working on - figuring out the data layout, the flow of code and data... It offers a concise syntax for the frequent operations - load/store, arithmetic, dereferences, and function calls. Most languages are already disqualified by not offering, or actively discouraging, nested data types a.k.a. value types, making it way too complicated to simply copy data "anonymously" using memcpy() or similar.

Think about that - the language that people can't stop bitching about because it doesn't offer "generics" is actually the one that allows you to read and write data generically without a tremendous amount of complexity and/or slowness.

That's why my personal route is to learn ways to not need the type-based polymorphism offered by more complicated languages, and to get along with just straight code that can push _any_ data payload (type) to the next endpoint - without requiring the multiplication of junk in the type system and/or in the compiled binary.
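
Roughly what I mean, as a minimal sketch (the names and the queue are made up for illustration):

    #include <stdio.h>
    #include <string.h>

    /* Nested value types: an Entity contains its Vec3s by value. */
    typedef struct { float x, y, z; } Vec3;
    typedef struct { Vec3 pos, vel; int hp; } Entity;

    /* A byte queue that takes _any_ payload type - no generics needed. */
    typedef struct { unsigned char buf[1024]; size_t len; } Queue;

    static int push(Queue *q, const void *data, size_t size) {
        if (q->len + size > sizeof(q->buf)) return 0;
        memcpy(q->buf + q->len, data, size);   /* the "anonymous" copy */
        q->len += size;
        return 1;
    }

    int main(void) {
        Queue q = {0};
        Entity e = { .pos = {1, 2, 3}, .vel = {0, 0, 0}, .hp = 100 };
        push(&q, &e, sizeof e);   /* works unchanged for any value type */
        printf("%zu bytes queued\n", q.len);
        return 0;
    }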

Macros aren't used that much, but they would be missed a lot if they weren't there to help in those cases where we need to hack an abstraction over (lexical) syntax instead of data. I said above that you can't abstract over a language's syntax, and C macros don't really do that either... They're a hack, and sometimes tremendously useful.
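
The classic example is the X-macro, where one list drives several expansions - an abstraction over lexical syntax that no "proper" language feature quite replaces (my sketch):

    #include <stdio.h>

    /* One list of names... */
    #define COLOR_LIST \
        X(RED)         \
        X(GREEN)       \
        X(BLUE)

    /* ...expanded once into enum constants... */
    #define X(name) COLOR_##name,
    enum Color { COLOR_LIST COLOR_COUNT };
    #undef X

    /* ...and once more into matching strings. */
    #define X(name) #name,
    static const char *color_names[] = { COLOR_LIST };
    #undef X

    int main(void) {
        for (int i = 0; i < COLOR_COUNT; i++)
            printf("%d = %s\n", i, color_names[i]);
        return 0;
    }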


Good question, I wrote about my own reasons for moving back to C from C++ a while ago:

https://floooh.github.io/2018/06/02/one-year-of-c.html

...and "modern C" is often misunderstood by C++ people who only know the fork of C known as the "common C/C++ subset" (which is basically an outdated, non-standard dialect of C that's stuck in the mid-90s):

https://floooh.github.io/2019/09/27/modern-c-for-cpp-peeps.h...

However, I'm hoping to eventually abandon C in favour of Zig within the next 10 years or so :)


I've written a few CLI utilities in C over the past couple of years. Here's why I keep choosing it for tools that I want to distribute.

C is the lowest common denominator. It has good compilers that just work, with no drama or surprises. It's portable across CPUs and operating systems. It will continue to work for all Unix-like platforms long after trendier alternatives have turned to dust.

It has good code formatters and analyzers. It's an easy language to completely understand, and it's easy to read if I don't try to get too clever with macros or type tricks.

I don't have to remember whether ${language_feature} is Considered Harmful yet, as in C++ or Rust, where the current best practices constantly become Bad Code in favor of the next trendy feature.

The C ABI is the standard for libraries on Unix-likes, and I usually want to use a couple of special-purpose libraries. Calling libraries requires no wrappers, FFIs, or anything else.

Yes, C has limitations. Lots of them. But it also has an enduring momentum that's not going away any time soon.


C is a very simple language. I think Zig will get close to replacing it, but it also happens to be more complex, so I'm sure some people who really prefer simplicity will still use Zig.


will still use C*.


Whoops, thanks for catching that, though the edit window's passed.


>"why would anyone start a new program in C"

I recently wrote firmware for a low-power microcontroller. No memory allocations in the code. The language is simple and easy to read unless one is purposely trying to be fancy. It is very performant - both the end result and the compile times. And there are megatons of C libs for microcontrollers. I would not dream of using anything else for this particular task.
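
To give a flavor of the style (a made-up sketch, not my actual firmware): everything sits in static storage, sized at compile time, so there's nothing to allocate or free:

    #include <stdint.h>
    #include <stdio.h>

    #define RX_BUF_SIZE 64

    /* A static ring buffer - the whole memory budget is known at link time. */
    static uint8_t rx_buf[RX_BUF_SIZE];
    static uint8_t rx_head, rx_tail;

    static int rx_push(uint8_t b) {        /* would be called from a UART ISR */
        uint8_t next = (uint8_t)((rx_head + 1) % RX_BUF_SIZE);
        if (next == rx_tail) return 0;     /* buffer full: drop the byte */
        rx_buf[rx_head] = b;
        rx_head = next;
        return 1;
    }

    static int rx_pop(void) {              /* returns -1 when empty */
        if (rx_head == rx_tail) return -1;
        uint8_t b = rx_buf[rx_tail];
        rx_tail = (uint8_t)((rx_tail + 1) % RX_BUF_SIZE);
        return b;
    }

    int main(void) {                       /* host-side demo */
        rx_push('h'); rx_push('i');
        for (int c; (c = rx_pop()) != -1; ) putchar(c);
        putchar('\n');
        return 0;
    }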


Which is why it would be good if WG14 was a bit more security conscious.


I am a practical man. The last thing I worry about is security issues with my code on that specific device. Or should I say non-issues, since it works like a charm and I just can't see how C "insecurity" can bring any problems to my particular situation.

Sure, let the WG invest their efforts where needed, but for my own case I do not care.


Until liability and lawsuits due to security exploits become a common thing.



