For those of you who enjoy a heaping meal of architectural commentary sprinkled with a generous portion of sarcasm, check out the author's blog [1]. I literally LOL every time I get a chance to read it.
I kind of hate that blog. They make fun of things for basically being slightly tacky. A lot of the criticism feels like it's just 'Stop Having Fun' Guy. (The heating bills for mcmansions are another story, but that's unrelated to the criticism.)
No, it's not worth it. Modern IDEs make so many things easier than vi / vim / emacs ever did.
vi / vim / emacs can be useful in certain situations (Unix / Cygwin command lines, low level embedded systems that don't have a desktop, etc), but for most code, the modern IDE GUIs blow them away. Code navigation / inspection / contextual browsing is much, much better, easier, and far more productive using Eclipse, CLion, Visual Studio, etc.
The goto has gotten a bad rap over the years because of Dijkstra's paper. And that paper has unduly influenced a lot of incorrect thinking. There are valid use cases for goto, and this is certainly one of them.
I use it all the time, like in the example above. Particularly because it makes my life so much easier when developing and debugging embedded C code across various toolchains, some of which have less functionality than others.
Just curious, not a C developer by any means, but why wouldn't you use a function here instead of a goto? I'm confused how goto would reduce error/improve readability in that example.
Simply put: a goto never returns, while a function call returns to where it was called from.
So specifically in the example above, if you called failure-handling functions instead of using gotos, then when each function returned, execution would continue on the next line after the call, which is clearly not what you want.
Now you could add some else branches after the function calls to prevent execution from continuing, i.e. to get to the appropriate step in the free_* sequence at the bottom, but that starts to look messy. So I have to admit (not being a goto-lover), the above example reads very nicely.
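Since the example being discussed isn't quoted in this thread, here's a minimal sketch of the cleanup idiom in question (the resource names and sizes are made up), roughly the style used throughout the Linux kernel:

```c
#include <stdlib.h>

/* Hypothetical sketch of the goto-cleanup idiom under discussion.
 * Each failure jumps forward to the point in the unwind sequence
 * that frees exactly what has been acquired so far. */
int do_work(void)
{
    int rc = -1;
    char *buf1, *buf2, *buf3;

    buf1 = malloc(64);
    if (!buf1)
        goto out;            /* nothing to free yet */
    buf2 = malloc(64);
    if (!buf2)
        goto free_buf1;
    buf3 = malloc(64);
    if (!buf3)
        goto free_buf2;

    /* ... do the actual work with buf1, buf2, buf3 ... */
    rc = 0;

    free(buf3);
free_buf2:
    free(buf2);
free_buf1:
    free(buf1);
out:
    return rc;
}
```

Note that every error path runs through the same unwind sequence at the bottom, and each goto only jumps forward and downward.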
It conforms to the "gotos might be okay if they only jump forward" rule of thumb I've heard.
Unfortunately, C does get a lot of hate on HN. I suspect it has to do with this site's demographics. Many (not all) of the HN clan seem to be oriented towards / mostly familiar with web based technologies. I suspect that for many who have tried, going from a web dev environment to a C oriented dev environment feels like a robust shock to the system.
I'd also be willing to bet that there's an age bias at play here; C has been around, like, forever. It is certainly not the new hotness. Most (not all) people that I know who enjoy it and are proficient at it, are 40 or older. Much of the web-based dev crowd that hang around HN seem to be in their 20s, and as it is a time-honored tradition to poo-poo the ideas / methods / tech of the older generation(s), it's not surprising that C doesn't get a lot of love.
Yes, I realize I'm painting with broad strokes here. It'd be interesting to see a survey or three that correlates age ranges and tech used on a day-to-day basis to see if these assumptions are legit. (Anyone got any survey data up their sleeve they'd be willing to share?)
Me personally - I love it all. C, C++, Java, Python, Javascript, Rust, Haskell, Scheme, etc. Making computers do things for you, and for other people, by writing detailed instructions is quite possibly one of the funnest things in the world. Double bonus for getting paid to do it!
It's not just that HN does a lot of webdev. It's that even in its element as a "systems language" it's virtually impossible to write 100% safe C/C++ code and guarantee that it will remain safe into the future, even for experts who are making every effort to do it right. There are just too many gotchas with "undefined behavior" and too many clever compilers out there waiting for you to make a mistake.
One only needs to look at something like the OpenSSL library to see the problem. You really need to hammer the hell out of C code with something like AFL to get at a reasonable majority of the bugs - and you could hammer out every last bug one day and then the next day a compiler starts optimizing away your safety checks. This isn't a theoretical problem; it actually happens. Code rot is a very real problem in C++, to a far more massive extent than in any other language.
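To make the "optimized-away safety check" problem concrete, here's a hedged sketch (not taken from OpenSSL or any real project) of the classic signed-overflow guard that a compiler is allowed to delete:

```c
#include <limits.h>

/* Hypothetical example: the programmer intends this as an overflow
 * guard, but signed overflow is undefined behavior in C, so an
 * optimizer may assume `x + 1` never wraps, conclude the condition
 * is always false, and remove the branch entirely. */
int increment_saturating(int x)
{
    if (x + 1 < x)        /* "safety check" the compiler may drop */
        return INT_MAX;   /* intended: saturate instead of wrapping */
    return x + 1;
}
```

A check written as `if (x == INT_MAX)` avoids the undefined operation and survives optimization; the point is that the difference between the two is easy to miss in review.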
Personal opinion here, but with few exceptions C/C++ are inappropriate languages for starting new development at this point. I realize the tooling is not there yet but I would rather see something like Rust used in almost all performance-sensitive applications where C/C++ are currently used. Unless you can guarantee that you are operating in a trusted environment and will only ever operate on trusted data, C/C++ is just not the right language for the job.
Yes, it's fast, but at what cost? I would gladly give up a massive fraction of my performance for better security and portability - and that's why I program in Java. Not that Java is perfect either, but at least I can be certain that the sands aren't shifting out from underneath my programs.
I would actually say that porting the Linux kernel to Rust would be very high on my wish-list at this point. I am well aware of just how enormous that task would be and I might as well wish for a pony too, but it gives me heartburn to think of just how much C code is sitting there operating in the most untrusted of environments on the most untrusted of data. I have every faith in the kernel guys to do it right, but the reality is there is a lot of attack surface there and it's really easy to make a mistake in C/C++. It may not even be a mistake today, only when the compiler gets a little more clever.
> Yes, it's fast, but at what cost? I would gladly give up a massive fraction of my performance for better security and portability - and that's why I program in Java.
While I agree with the sentiment, a problem with Java is that you're dependent on a runtime environment with a fairly consistent history of vulnerabilities, right? [0][1]
> Personal opinion here, but with few exceptions C/C++ are inappropriate languages for starting new development at this point.
Maybe, but now there's SaferCPlusPlus [2]. At least it may be a practical option for improving memory safety in many existing code bases.
I think the bottom line is that it simply takes too long to actually become fluent in 'C'. This makes it a horror for open source, where you have to draw on volunteers.
You simply can't just write 'C' without making sure all the details that are necessary to run safely are in scope at all times.
While I agree that the OpenSSL cases certainly show the weaknesses of the language, there's just no way I'm gonna hang all that on 'C'. Writing protocols and protocol drivers is a fairly tedious sort of skill to attain.
We inevitably descend into a counterfactual ... "fantasy" (sorry, I don't mean anything insulting by that; besides, I do it too, it's just the nature of counterfactuals) in which 'C' ends up the villain, when there was a much richer set of failures in play.
I don't think anyone can demonstrate that it is virtually impossible to write 100% safe C code. Sure, you can always find people who don't know how to write a proper safety check. That doesn't mean nobody knows. You can always find people who ignore or don't know about best practices, but that doesn't mean everyone's like them. And you can find people who write goto fail; and ignore the warnings about unreachable code posted by any half-decent compiler or static analyzer, yet there are people who will pay attention to that kind of stuff. People scream UB, UB, C is evil because of UB, but goto fail is essentially a logic bug, something you could have implemented in any language. It doesn't need UB to happen.
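For reference, the goto fail bug looked roughly like this (paraphrased from memory, with hypothetical helper names, not the actual Apple source):

```c
/* Stand-ins for the real verification steps (hypothetical). */
static int check_hash(void)      { return 0; }
static int check_signature(void) { return 1; }
static void cleanup(void)        { }

int verify(void)
{
    int err = 0;

    if ((err = check_hash()) != 0)
        goto fail;
        goto fail;                        /* duplicated line: always taken */
    if ((err = check_signature()) != 0)   /* now unreachable */
        goto fail;

fail:
    cleanup();
    return err;   /* still 0: reports success without the final check */
}
```

No undefined behavior is involved; it's a plain logic bug, and the unreachable check is exactly the kind of thing a static analyzer or an unreachable-code warning would have flagged.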
Yep. Have a look at the code coming from the OpenBSD crowd. Those folks really know how to wield C. It involves, first and foremost, writing readable and straightforward code, in an attempt to make any bugs obvious. The OpenBSD folks also insist on code review, which also helps.
And wrt tooling: C has some of the best tooling around of any language. GCC, Clang, and Visual C++ can all do some pretty decent static analysis, and then there are tools like lint, Frama-C, and Valgrind. Coverity also offers free static analysis for open-source projects. Make use of all the tools available to you. Testing is also important. Shoot for 100% code coverage (see SQLite3, for example, which has a massive test suite).
As you say, one of the requirements is to pay attention to warnings and fix them. In compiler parlance, "error" means "I can't compile this code" while "warning" means "I can compile it, but it may well misbehave at runtime".
And here's something about undefined behavior: it's possible to know which behavior is undefined and to avoid it! Not every C program is riddled with undefined behavior.
I think that you have the formulation backwards. You claim that people can just write better, and should attain perfection.
> I don't think anyone can demonstrate that it is virtually impossible to write 100% safe C code.
I think most people come at it the other way. Most people are aware that they are fallible and want tools to help with that. Most people strive for perfection and none will ever actually attain it.
> I don't think anyone can demonstrate that it is virtually impossible to discover errors safely in C code.
There is a huge difference simply moving from C to C++ with exceptions. The type system in C++ can detect several classes of errors at compile time and prevent them from making it into the resulting program.
Then for runtime problems, if an underlying function throws, it cannot simply be ignored. Any programmer can miss a single statement, or worse, refactor a function with a void return into one that returns an error code (which then results in every caller ignoring the return value). However, it takes a special kind of malice to carelessly use something like catch(...) in C++ to disregard exceptions so that runtime errors are swallowed. C++ with exceptions has saner defaults because it fails fast, and the failure path doesn't need tests until it starts doing something meaningful.
Now imagine the advances in error detection moving to languages that catch additional classes of errors.
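As a concrete illustration of the ignored-return-code failure mode described above, here's a hedged C sketch (the function and path names are made up): a routine that used to return void now returns an error code, and an existing caller silently drops it. Plain C compiles this without complaint; with exceptions, the failure would propagate instead of vanishing.

```c
#include <stdio.h>

/* Was: void save_record(const char *path); now it reports failure. */
int save_record(const char *path)
{
    FILE *f = fopen(path, "w");
    if (!f)
        return -1;            /* new error path */
    fputs("data\n", f);
    fclose(f);
    return 0;
}

void handler(void)
{
    save_record("/tmp/record.txt");   /* return value ignored, error lost */
}
```

Annotating the function with something like warn_unused_result helps, but nothing in the language itself forces the caller to look at the result.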
Lint was created for C in 1979, as the language's authors saw how easy it was to make errors; yet static analysis is still largely ignored by the majority of C developers today.
In projects with centralized build scripts, like most projects, hopefully they have -Werror or its equivalent on by default. I was speaking about the case where a group has systematically ignored warnings and they are already beyond fixing. This is a depressingly common state for many shops. The best fix I have seen is to enable as many warnings as possible and treat them as errors as early in the project lifecycle as possible. For whatever reason, C++ shops are much more likely to do this than C shops, in my experience.
If the compiler isn't the "language" enough for you, then please explain how to write a buffer overflow in Javascript.
So I see this argument as "should the tools catch these things?". I suppose that would make some people feel better. But the fact is, when you're in the seat, it's up to you to make sure you Do No Harm.
But please be aware - generalizing all failures and integrating them into the tool suite is a pretty daunting task. Perhaps the economics of it make sense. But if you're stuck writing 'C', especially on legacy code bases with legacy tools, you're stuck, and there's only the one thing to do...
That did sum up my argument well, if taken to the same extreme that you are taking.
You don't need the compiler or exceptions to cover all your errors. If you know something would be too costly to integrate into these mechanisms, then you are free to disregard it. I have written throwaway code that did gross things with pointers, memory, and system-specific resources. But if I want code to last and be maintainable, I do my best to get the compiler to watch my back.
This also works well when interfacing with legacy C. If the new code can be written in composable and unit-testable classes, then you can prove (only to the extent of the quality of your automated tests) whether problems are in your code or in the legacy code as they arise. Then when you find problems in legacy code, try to break a piece out and replace it with another class, even a big ugly one, just so you can get some unit tests in there. Then you can break the big ugly class into smaller, cleaner, composable, and well-tested units.
This again. :) I think there's an angle your side of the discussion is missing on this. You might, with enough experience or team talent, be able to consistently write good code in C without defects. You might be able to do that up to millions of lines of code if your project becomes a Linux. However, the vast majority of projects will involve something along these lines:
1. The team is average or below, since they're affordable or the work kind of sucks. This often happens in practice even with smart coders, because the deadlines force them to move too fast with too little QA. The product might still have high impact, though, especially if it's a widely-used product or service. The language itself preventing common problems is helpful here.
2. It's a FOSS project made by people who want to get stuff done without learning tons of rules for working around C's issues or stopping at every common operation to prevent the language itself from killing their project. I'd say the vast majority of projects don't need whatever absolute advantages, like max performance, that C has over safer languages. Again, the language could be helpful.
3. Either of the above, given the effects of time, where new contributions come in that work against a codebase that fewer and fewer people understand due to organic growth. The language itself can be helpful with a combo of type safety, programming-in-the-large support, modules, etc. Better support for safer modifications of stuff you barely understand. This is rarely a problem for the Ada and Eiffel people if the team was even half-competent, because the compiler simply demands it.
There are embedded people who can do one-off or steady projects however they like, with enough time and tooling to get it right. ArkyBeagle appears to be in a category like that, if my broken memory isn't fooling me. Then there's the vast majority of programmers, either in the corporate crunch, scratching an itch while barely caring, or fixing something they barely understand. Human nature will push defects in from all these directions. The tooling, if designed with human nature in mind, can prevent a lot of them automatically and aid efforts to catch the rest.
Hence my opposing the C language in favor of safer-by-default systems languages, especially those that avoid the tedium of constantly watching out for the dangers of the most common operations. Gotta work with human nature rather than against it. A hard lesson I learned after years of failed evangelism of high-assurance INFOSEC. Now I exclusively look for ways to embed it seamlessly into stuff with the other benefits listed. Much better responses on that. :)
> I suspect that for many who have tried, going from a web dev environment to a C oriented dev environment feels like a robust shock to the system.
> I'd also be willing to bet that there's an age bias at play here; C has been around, like, forever. It is certainly not the new hotness. Most (not all) people that I know who enjoy it and are proficient at it, are 40 or older.
As someone who went the "other direction" (Java -> Ruby -> Javascript) I can say that a lot of it has to do with the accessibility of the ecosystem rather than the language itself. This could absolutely just be my filter bubble, but I've noticed that the communities surrounding Ruby, Python, and Javascript seem to go above and beyond the call of duty when it comes to making libraries easy to use, documenting those libraries, building and refining the tools, and so on.
I know there are good tools out there for C development. I know there are good learning materials. I know there are communities out there dedicated to writing good C code (Shout-out to /r/c_programming on Reddit. Love those folks.) But I can't sort out the signal from the noise, because there isn't a lot of discussion about C programming happening in the online spaces I'm familiar with. As a counterexample, there was a _fantastic_ article on here the other day about "writing your own syscall" in Linux. Yes, it contains a lot of hand-holding and overexplanation, but that's useful for me because I haven't built up the mental model to parse a more terse explanation.
In fact, I think this is how having "the new hotness" change every couple years has been helpful _in some respects_- there's an incentive for lots of people to write blog posts, tutorials, and articles about how to properly use the latest and greatest tech, there's active development going on as people forward-port functionality (and therefore plenty of opportunity for devs to make meaningful contributions and have meaningful discussion about "how to write code using this language/library/framework"). For a short period, both the "old hands" and the newbies are in the same boat, and this is unbelievably useful for training up the next generation of developers.
> Me personally - I love it all. C, C++, Java, Python, Javascript, Rust, Haskell, Scheme, etc. Making computers do things for you, and for other people, by writing detailed instructions is quite possibly one of the funnest things in the world. Double bonus for getting paid to do it!
Same here, friend. :) For what it's worth, I wish there were more of this attitude floating around the Internet.
It gets a lot of hate because the majority of developers are not embedded developers, kernel developers, or doing anything involving hardware. The other reason, IMO, is that to do anything that's actually kinda cool or fun in C you have to get pretty adept, so it's probably just written off as an old, boring language.
Personally I'm in my mid-20s and quite enjoy working in C. And for things like bit manipulation it's much easier than in higher level languages. I suspect at some point even the smallest MCUs will be able to run Rust or Go, but until that happens there is still a place for C/C++. Haters can hate but that won't change the fact that C is still the most widely supported language for embedded platforms (and Linux, the other elephant in the room).
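For instance, the kind of register twiddling that reads naturally in C (the register and bit names below are invented for illustration):

```c
#include <stdint.h>

/* Hypothetical status register and bit definitions. */
#define STATUS_READY  (1u << 0)
#define STATUS_ERROR  (1u << 3)

static uint32_t status_reg;

void mark_ready(void)  { status_reg |= STATUS_READY;  }   /* set a bit   */
void clear_error(void) { status_reg &= ~STATUS_ERROR; }   /* clear a bit */
int  has_error(void)   { return (status_reg & STATUS_ERROR) != 0; }  /* test a bit */
```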
It turns out that some DNS servers cached the old nameservers and are now failing to resolve those. It's unfortunate, but all we can do is wait for the cache to expire :(
So bummed to read this. Even though I wasn't going to directly benefit from Google Fiber, it was sure nice to have a non-entrenched player tackle this market.
From the big G's standpoint, it makes good biz sense to exit this market. I sure hope the subtext in the article comes to fruition, i.e. that Google / Alphabet has figured out a better way to get high-speed internet to homes in the US sans fiber.
In an ideal world, this fast fiber internet ought to be a municipally managed utility, with my tax dollars paying for the fiber in the ground, and my take-home dollars paying for whatever competing service(s) I choose to light up said fiber and bring me access to the net.
Wow - that. looks. exhausting. And like an overall pretty ineffective use of an engineer's time. At least for the kind of SW development that I generally do.
I can imagine certain limited scenarios where this kind of problem solving approach might be useful, but on a day-to-day basis? Nope. No way.
[1] http://www.mcmansionhell.com/