Why aren't there C conferences? (2018) (nullprogram.com)
153 points by optimalsolver on Aug 26, 2022 | hide | past | favorite | 189 comments



Can someone start one? I'd love to go.

I've used C++, Go, Python, and JavaScript over the years, and even tried Rust briefly. It turns out I'm most productive in C: I can get stuff done fastest, and one month later I can still understand the code.

POSIX C with all its libraries can work wonders, many of the traps and pitfalls are well known, fuzz testing can further harden the code, and it's just unbeatably fast.

If your code base is large, you can add a little practical OOD to C, even with ctors/dtors/RAII (a bit tricky, but manageable). I call this my C+ style.


> just unbeatably fast

Counterpoints:

1. 0-terminated C strings are slow. There's the constant need to scan the string to determine its length. Taking substrings requires making a copy. Yes, you can manually do length delineated strings in C, but there's no support for it in the language, it's error-prone, and doesn't interface with anybody else's C code.
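The cost difference can be sketched in a few lines. The `str_slice` type and helper names below are illustrative, not standard C:

```c
#include <assert.h>
#include <string.h>

/* A hypothetical length-carrying string view for comparison. */
typedef struct {
    const char *ptr;
    size_t len;
} str_slice;

/* O(1): the length travels with the pointer. */
static size_t slice_len(str_slice s) { return s.len; }

/* Substring without copying: just adjust the pointer and length. */
static str_slice slice_sub(str_slice s, size_t start, size_t n) {
    str_slice r = { s.ptr + start, n };
    return r;
}

/* With 0-terminated strings the length is an O(n) scan, and a
 * substring that doesn't end at the terminator forces a copy. */
static size_t cstr_len(const char *s) { return strlen(s); }
```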

2. One thing I've noticed over decades of using C is that it's very brittle. What that means is, once an algorithm is selected, it is never changed because it's too hard to refactor the code. (For example, changing a reference type to a value type means changing all the -> operators to .) This means C programs tend to get stuck in a local optimum of inefficient data structures and algorithms.

3. No array bounds checking, leading to a lot of time lost debugging the #1 programming bug in shipped C programs.

If you're doing RAII in C, you're more than ready to move to a more powerful language.


All good points, which are part of the reason I used other languages, but I'm back to C after weighing those pros and cons.

There is no perfect language, C just seems the best for me to get job done.


As long as one understands the tradeoffs, no problem.


There is nothing stopping you from storing the size of the string. See: UNICODE_STRING
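For reference, the Windows UNICODE_STRING mentioned here is roughly this shape; the sketch below is a portable approximation (the real definition lives in the Windows SDK headers, and the field names here mirror it):

```c
#include <assert.h>
#include <stddef.h>
#include <wchar.h>

/* Portable sketch of Windows's UNICODE_STRING: a counted string.
 * Length and MaximumLength are in bytes, not characters. */
typedef struct {
    unsigned short Length;        /* current length in bytes */
    unsigned short MaximumLength; /* buffer capacity in bytes */
    wchar_t *Buffer;              /* not necessarily 0-terminated */
} counted_wstr;

static void counted_init(counted_wstr *s, wchar_t *buf,
                         unsigned short cap_bytes) {
    s->Buffer = buf;
    s->MaximumLength = cap_bytes;
    s->Length = 0;
}
```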

This is a silly argument.


Null-terminated strings were never meant to be used for large buffers. For small, local strings they serve their purpose nicely. C's designers realized there was no standard way to deal with large, performance-critical buffers, so they chose not to impose one. Which is exactly why C is so powerful: not necessarily fast, but it never gets in your way.

For array bounds checking, if you want it, just write a structure with a buffer length plus a getter and setter, and it's done. People happily write zillions of getters and setters in other languages but choose to complain when it comes to C.
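A minimal sketch of that struct-plus-getter/setter approach (the names are illustrative): the buffer carries its length, and every access is checked against it.

```c
#include <assert.h>
#include <stdbool.h>
#include <stddef.h>

/* A buffer that knows its own length. */
typedef struct {
    char *data;
    size_t len;
} checked_buf;

/* Returns false instead of reading out of bounds. */
static bool buf_get(const checked_buf *b, size_t i, char *out) {
    if (i >= b->len) return false;
    *out = b->data[i];
    return true;
}

/* Returns false instead of writing out of bounds. */
static bool buf_set(checked_buf *b, size_t i, char v) {
    if (i >= b->len) return false;
    b->data[i] = v;
    return true;
}
```

Of course, nothing in the language forces callers to go through `buf_get`/`buf_set` rather than indexing `data` directly, which is the crux of the disagreement in this thread.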


> For array bounds checking, if you want it, just write a structure with a buffer length plus a getter and setter, and it's done.

It's so easy, yet buffer overflows remain the #1 problem in shipped C code.

> For small, local strings they serve their purpose nicely.

Not really. Whenever I review other peoples' C code, I look at their use of strlen/strncpy/strxxx functions. They're a rich source of bugs, and I'll usually find one in it (usually an off-by-one error). They don't have to be large strings, either, to be slow.
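One classic example of the kind of bug being described: `strncpy` does not write a terminator when the source fills the buffer. The helper names below are illustrative; the buggy/fixed pair is a sketch of the pattern, not code from any particular review.

```c
#include <assert.h>
#include <string.h>

/* The classic strncpy pitfall: if src has 8 or more characters,
 * strncpy copies exactly 8 bytes and never writes the terminator. */
static void copy_buggy(char dst[8], const char *src) {
    strncpy(dst, src, 8);      /* may leave dst unterminated */
}

/* One common fix: copy one byte less and terminate explicitly. */
static void copy_fixed(char dst[8], const char *src) {
    strncpy(dst, src, 7);
    dst[7] = '\0';
}
```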


It just means that not many people give a damn about buffer overflows anyway. And blaming that situation on the language just doesn't make sense.

You're saying C strings are slow; I'm telling you short local strings are not. Please tell me why they're slow in that case.


OK, you've done a bunch of real-world projects. Here's what I never understood: why didn't some pseudo-standard for strings develop, with a string-like type that remained null-terminated but also included a big int at the beginning with a length value?
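Something close to that does exist in the wild (antirez's sds library takes this approach, for instance). A minimal sketch of the idea, with illustrative names: a length header followed by a buffer that stays 0-terminated, so ordinary C APIs can still consume it.

```c
#include <assert.h>
#include <stdlib.h>
#include <string.h>

/* Length header in front of a 0-terminated buffer: plain C APIs
 * see a normal char*, but the length is one field access away. */
typedef struct {
    size_t len;
    char buf[];            /* flexible array member (C99) */
} lstr;

static lstr *lstr_new(const char *s) {
    size_t n = strlen(s);
    lstr *p = malloc(sizeof *p + n + 1);
    if (!p) return NULL;
    p->len = n;
    memcpy(p->buf, s, n + 1);   /* copy including the terminator */
    return p;
}
```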


Bigint? How often are we dealing with strings over 4GiB?


It would sure open up possibilities. Besides, I like to plan ahead ;)


I was under the impression that C strings were inherently very fast because of how incredibly cache-friendly they are. You should not ever have to count all the bytes (memoize that!) but if you do, this is the best format there is for it.


What makes you say that? Besides C strings (pointer to characters, null-terminated) we have Pascal strings (pointer to length, followed by the characters), and then there is the variant that I'm not sure what it's called, which consists of a pair of (pointer, length). The last variant is my favorite, and is much nicer to use: you can take substrings without making a copy, you never have to iterate a string to find out its length, and your strings can contain the NUL character (which happens to be a perfectly normal Unicode character).
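The (pointer, length) variant can be sketched in a few lines; note that it happily stores an embedded NUL, which a 0-terminated string cannot. The `view` type and function names here are illustrative:

```c
#include <assert.h>
#include <string.h>

/* (pointer, length) string view: the third variant described above. */
typedef struct {
    const char *ptr;
    size_t len;
} view;

static view view_make(const char *p, size_t n) {
    view v = { p, n };
    return v;
}

/* Equality compares all len bytes, so embedded NULs are just data. */
static int view_eq(view a, view b) {
    return a.len == b.len && memcmp(a.ptr, b.ptr, a.len) == 0;
}
```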


VAX/VMS called them "descriptors", which had IIRC 16 bits of flags, a 16-bit length, and a 32-bit pointer -- descriptors could be used for different object types, not just strings. The 64k length limit and 32-bit address were not a big issue back in 1982 when I first used them.

I used (Wirth's standard) Pascal in our compiler class in college. String handling was actually pleasant in Fortran-77 on VMS, so I figured pure Pascal had the worst string handling of any language -- until I started programming in C. I finally made peace with C strings, but a couple of decades later, Forth said to me, "Here, hold my beer." I wrote some Forth-word equivalents of some of the C Library string functions to make my life a little easier! :)


> I'm not sure what it's called

In D we call them dynamic arrays. They could be called length-delineated strings.


It seems that Wikipedia refers to them as "Strings as records"[1], which fits quite well. I suppose that C strings could be considered "the most" cache-friendly since they have only 1 byte of overhead instead of however many bytes would be required for the length (probably 4 or 8 bytes on most modern systems). That said, I would be extremely surprised for this difference of a few bytes per string to yield a measurable performance improvement in anything but an unrealistic microbenchmark involving many tiny strings.

1: https://en.wikipedia.org/wiki/String_(computer_science)#Stri...


0-terminated strings make sense for extremely tight memory cases, like 64 KB computers and CPUs that don't cache memory. But they're a clear loser on 32-bit machines.


> 0-terminated C strings are slow. There's the constant need to scan the string to determine its length.

Just record it once


Record it where? Herein lie the bugs, as C doesn't offer a reliable way to associate the two.

I made a proposal to fix this in C, but it went nowhere:

https://www.digitalmars.com/articles/C-biggest-mistake.html


Not trying to be a troll, but is this benefit anything more than having to carry two variables? As in, is this about getting the length without having to call strlen even once?


If it's part of the type, then the compiler can automatically insert bounds checks. If not, then you'll have to insert the checks manually, which we both know won't happen.

Another trouble with two variables is there's no obvious connection between them. One can be modified without the other, etc.


Obviously, a struct is one option. The problem is that you then need to make a whole string library. Thankfully, many exist, so this is a non issue, unless there's a requirement that you write all the code yourself.


A struct won't be recognized by the language in order to automatically insert array bounds checks.

> Thankfully, many exist, so this is a non issue

Many string libraries that are incompatible with each other. This is a huge issue. (I myself made many C string libraries. It's not so easy. Try it.)

The language extension I proposed for C is the same one D uses. D has had it for 20+ years, and it has proven very, very satisfactory.


> A struct won't be recognized by the language in order to automatically insert array bounds checks.

If you want automatic bounds checks, then C is not the language for you.

> This is a huge issue.

Could you give an example where it's a huge problem? I'm probably limited by my experience. All of the codebases I've worked on used a single string library. When passing externally/to other libraries, boring C interfaces were used, then those libraries do what they wish from there. The string libraries I've used were mostly, deep down, just structs with a length member, a char pointer member, and encoding stuff. Passing to the other library almost always ended up being those member values passed as arguments to a function, which were copied to the nearly identical structs of the other string library.


Sure. The C Standard library relies on 0-terminated strings. The Linux API relies on 0-terminated strings. Every C library I've ever used relied on 0-terminated strings for its interface.

So you use a translation layer. Sorry, I just don't like them, but if you're fine putting these on all your interfaces with other libraries, well, what can I say? :-)
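Such a translation layer is typically just a small copy at the boundary. A sketch, assuming a counted-string `view` type like the ones discussed upthread (the names are illustrative, and the destination capacity is assumed to be at least 1):

```c
#include <assert.h>
#include <string.h>

/* A counted-string view, as discussed upthread. */
typedef struct {
    const char *ptr;
    size_t len;
} view;

/* Boundary shim: make a 0-terminated copy so a counted string can
 * be handed to any C API expecting a char*. Truncates if needed and
 * returns the number of bytes copied. Assumes cap >= 1. */
static size_t to_cstr(view v, char *dst, size_t cap) {
    size_t n = v.len < cap - 1 ? v.len : cap - 1;
    memcpy(dst, v.ptr, n);
    dst[n] = '\0';
    return n;
}
```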

> If you want automatic bounds checks, then C is not the language for you.

I would reframe that as: "if you're ok with buffer overflow malware injection, then C is the language for you!" Nobody has yet figured out how to stop that.

The sad thing is it's so fixable with just a minor, compatible change to C.


> I would reframe that as: "if you're ok with buffer overflow malware injection, then C is the language for you!" Nobody has yet figured out how to stop that.

Write manual bound checks and good code in general? Granted you won't be able to catch every vulnerability, but at some point other vectors are so much easier to exploit that you won't have to worry about these anymore.


Oh, of course! Just write good code! Well kiss my grits. I can't believe the solution has been sitting here staring me in the face all this time.

I promise I'll write good code from now on. Scout's honor.


Even John Carmack thinks "writing good code" is impossible: https://youtu.be/I845O57ZSy4?t=1351 ; p


> Write manual bound checks and good code in general?

40 years of C buffer overflows argues that doesn't work.


C is designed with a computing model in mind (the way memory works, the way arithmetic operations work, the way addressing works etc.). And its language idioms are designed to fall strictly in line with that computing model.

There's very little difference in C between a character string, an array of bytes, or even a struct of appropriate size, other than the types and other user-friendly (relative to assembly) features that C adds. This is deliberate; yes, it makes working with text harder (and maybe slower) than necessary, but C doesn't assume that text is something you ever need to represent in your programs by default. So if you need to work with a lot of text, you should find a library for it, or write one.
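That near-equivalence is visible in how any C object can legally be inspected or rebuilt as raw bytes via memcpy. A minimal sketch:

```c
#include <assert.h>
#include <string.h>

/* A struct is just memory with a layout; memcpy through an
 * unsigned char buffer is the well-defined way to treat it so. */
struct point { int x; int y; };

static void to_bytes(const struct point *p, unsigned char *out) {
    memcpy(out, p, sizeof *p);      /* struct -> raw bytes */
}

static struct point from_bytes(const unsigned char *in) {
    struct point p;
    memcpy(&p, in, sizeof p);       /* raw bytes -> struct */
    return p;
}
```

(The byte layout itself is implementation-defined, which is exactly the kind of precision-with-tradeoffs the parent comment describes.)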

C isn't designed to be fast (though it often is). It's not designed to be safe (though it also often is). It's designed to be extremely precise, like a hardware description, or a good maths paper.

Some people like that precision. More often, people need that level of precision (see Linus' comments about C over C++ in the Linux kernel).


> Linus' comments about C over C++ in the Linux kernel

For someone of his computing calibre, I found his take on C++ surprisingly... immature. He never truly justifies exactly why he feels C++ is bad in his rants[0][1]; furthermore, he resorts to rather poor logic, which essentially boils down to: C++ is bad because the programmers using it are bad.

Perhaps when he did attempt to use it, C++ was equally as immature, and the tooling and compilers were sub-optimal, and there was no RAII, nor std::algorithm, nor any of the niceties available with C++11 (it got better still with C++20's concepts, ranges, std::format, modules, etc).

I daresay that C++ is equally as precise (honestly, I'm not quite sure how to parse this in context) and perhaps even fills in the gaps where the programmer's intention and their code diverge.

[0]: https://lwn.net/Articles/249460/

[1]: https://developers.slashdot.org/story/21/04/17/009241/linus-...

For the record, I attempted Rust. While it looks nice, I am personally more partial to the by-default freedoms that C++ gives, together with judicious use of sanitisers. If not a systems language, then I default to .NET.


RAII was in C++ around 1985 or so.


> There's very little difference in C between a character string, an array of bytes, or even a struct of appropriate size.

This is really bad!

> It's designed to be extremely precise, like a hardware description.

This is absolute nonsense. There's far too much undefined behavior in C for it to be described as precise with a straight face.

More generally, C's lack of memory safety has been directly responsible for innumerable vulnerabilities and trillions of dollars in costs to society. It is unacceptable and irresponsible to start a new project that's meant for public use in C in 2022.


Hardware description languages also have tons of undefined behaviour. Electronics, in general, has physical conditions it can be in that will not resolve to a finite value in a finite amount of time.


Linux, OpenBSD, systemd, etc. are all in C. C looks OK to me today.


Indeed, security researchers appreciate C keeping food on their table; see the CVE database.


All of them have had massive security vulnerabilities, and almost certainly (at least Linux and systemd) have many publicly undiscovered ones that are currently being exploited by state actors today.


The C Standard library bakes in the use of 0-termination for strings. There's nothing about 0 terminated strings that is the way CPUs work.


C defines an abstract computing model. It's not that CPUs work with 0-terminated strings; it's that the compiler will always ensure that 0-terminated strings make sense on any arbitrary architecture.

If you're a hardware designer building a microcontroller for a gas heater, you can just target the 'C' model, and you can have a high degree of confidence that anything defined by C will work.

This isn't the case for, say, Python, since lots of behaviour is defined abstractly in terms of APIs and program behaviour (what we want the programmer to worry about), not in terms of memory allocation, register widths, and so on that a hardware designer can design to.


An additional aspect of C that I really appreciate is that if something can be conceptually realized, it can be implemented in C. This is not the case with many other languages, where you are often constrained by them, either accidentally or intentionally, for safety or other reasons. In most cases you will end up having to write a C shim to do the deed and expose it as a library to the other language. C to the rescue, again!


D and its runtime are 100% implemented in D. D could implement the C Standard library as well, but there's no point to that, since it exists and is what it is.


Can you explain what you mean with an example? Sorry, I'm a bot-tier SWE since I mostly code in Node and Go.


Been a while, but an example would be the ability to work with raw sockets. You can't use these in most higher-level-than-C languages, as this was, perhaps wisely, not considered useful for most people, and hence no mechanism is provided.


I feel the same. I'm tired of languages, can't even muster the energy anymore to give Rust a serious try - although I'm considering I'll have to do it in the future anyway, given that the hype hasn't ebbed away.

To me, C is almost entirely a non-language. It's a culture where you focus on what you want the computer to do (and optimize that), instead of optimizing what you write (and ending up with something convoluted that will be hard to read in 1 month).

I refuse to do things that many don't even give a second thought - such as, do I really need to implement the Into<T> trait and use x.into() to do a conversion? Isn't it better if I just write the function that I'm using explicitly? That way it will be easier to see what function is used, and the likelihood that I'll have to change that code later I deem comparatively small.

C has its flaws but I know them quite well by now and have learned to walk around them. And it has some "flaws" that are misunderstood strengths to a degree.

More and more "humble" languages have been started in recent years, but there is little incentive for me to try and switch. And even of those, almost all add some clever things that I'm nervous might be _too_ clever.


I've worked on porting a program written in C to a memory-safe language.

On the very first run of the port, I immediately found a bug due to C's weak typing. After a few days, I found a few buffer overruns.

My take is that there are two types of programmers: those who admit they can't avoid making memory safety mistakes, and real programmers, who don't make such mistakes, and happily keep programming in C/++.

EDIT: Just for fun, I randomly picked another C program, written in C17/C18. Immediately found a bunch of problems (not sure if any are impactful), including inconsistent function prototypes.


Memory safety issues are not always a problem. What I mean is that memory safety is not everything: a memory-safe system can still contain bugs, and vice versa, a system with memory-related issues can function perfectly for years, because those issues are corner cases that are difficult to hit.

Of course there are cases where memory safety issues cause security problems, but not all cases. If, for example, by a very uncommon sequence of keypresses on my washing machine I cause a buffer overflow, worst-case scenario the microcontroller hangs and I have to unplug it and plug it back in (but hopefully there is a watchdog that resets the processor automatically). A lot of C programs don't even have a user interface, because they are embedded in devices that have no external input (for example, a microcontroller that manages the operation of a power supply).


Even ignoring memory safety, C allows a lot of footgunning. I've found another source of grief to be weak typing and operator precedence. I was very surprised to find bugs in the Linux kernel examples.


> My take is that there are two types of programmers: those who admit they can't avoid making memory safety mistakes, and real programmers, who don't make such mistakes, and happily keep programming in C/++.

I absolutely agree. There's a class of programmers that would rarely if ever make such mistakes (e.g., Torvalds), and to them, C is freeing.

For the rest of us lesser programmers, the handrails of a borrow checker are necessary.



I'm pretty sure that was sarcasm.


Mind sharing what is the project that you picked?


>do I really need to implement the Into<T> trait and use x.into() to do a conversion? Isn't it better if I just write the function that I'm using explicitly?

Firstly, you would implement From rather than Into. Into is blanket-derived for From. So if B implements From<A>, then A also implements Into<B>.

And no, you don't have to do this. If the function is never going to be generic, there's no sense in writing it that way. Solve the problem at hand, you can always go back and add abstraction later.

I really like Rust. I hear people say that it's hard, but I don't think it's harder than the problems it solves.


> if B implements From<A>, then A also implements Into<B>

I think I’m misunderstanding, because I don’t see how this is possible given that you could be losing information in the conversion. Is this only for isomorphic types?


A => B is the same as B <= A. Notice the change in direction.


Oh duh, thanks :)


> can't even muster the energy anymore to give Rust a serious try - although I'm considering I'll have to do it in the future anyway, given that the hype hasn't ebbed away.

I'm in the same boat. I've been in this industry for a long time and generally good at spotting hype. I try to avoid the endless wheel reinvention that goes on.

That being said... I hate to be the Rust evangelism strike force, but give it a try. The hype is not ebbing away because there's substance there.

IMHO it's the first real alternative to C and C++ that brings a beneficial paradigm shift without sacrificing performance or the ability to code close to the metal. You can write systems code that is provably safe in terms of catastrophic memory errors and is orders of magnitude less likely to have threading bugs. (It's still possible to leak memory or have a deadlock, but it's harder to do and easier to diagnose. More importantly neither of these errors are likely to lead to catastrophic security vulnerabilities.)

As with C there is definitely a learning curve. New C programmers get crashes all over the place until they get it. New Rust programmers get beat up by the borrow checker and other type system stuff until they get it. Luckily they've put a massive amount of work into making the compiler's errors comprehensible.

After getting good with Rust I am now more productive in it than C and C++. It's the first attempt at a C replacement I can say that about, and I've tried a few. The only other languages that are more productive than C are higher level languages with fat runtimes not "close to the metal" languages.

Edit: the part of Rust that garners the most complaints is async, and I'm still a bit on the fence about it. It's usable but needs work in the standard library to solve the "async runtime lock-in" and dependency hell problems. Also needs some better libraries in general. Most of the issues with async are in the libraries (or lack thereof) not the language. It was a mistake not to have the async runtime in "std."

But honestly the fact that you can do async this way safely in a bare metal language is impressive, and the only way to do that much better is fibers (a.k.a. coroutines, go-routines) and that generally requires a fat runtime. Go just compiles a fat runtime into all your binaries to get goroutines.


> I feel the same. I'm tired of languages, can't even muster the energy anymore to give Rust a serious try - although I'm considering I'll have to do it in the future anyway, given that the hype hasn't ebbed away.

At this point, it probably won't die out. The time to die was at the beginning, when they set out to do overly ambitious things that had a high chance of not working. People have been wanting what it offers for quite a long time.


Most of what it offers was already available in the Ada and Modula languages, but apparently we needed some fresh air to gather the attention of newer generations.


I want to work on boring languages that let me focus on domains, not tools, and even so I work with Rust and have done in my last 3 or 4 roles. If you do systems programming in a startup, Rust will be there.


Out of curiosity: if Rust isn't boring for you currently, what would it take for Rust to become boring?


I'd rather lose an hour to dealing with the borrow checker and type safe design than spend weeks debugging an intermittent segfault.


Me too (of course), but the last time I had a segfault that took me longer to fix than at most a couple minutes must have been years ago. The last time I got a segfault at all is probably months ago.

It's a matter of the more global design - factoring out the nitty-gritty things in some central places removes many bugs without you having to think about it. A lot of "usage" code then looks very simple, almost python-like. Variables and function calls. A few ifs and elses, a little bit of arithmetic.

Look at the Linux kernel for example. It is superficially not pretty, but you have to walk around quite a bit for example to see explicit locks and unlocks. It's a matter of factoring out the hard stuff in central places. This is a skill that is totally unrelated to the language you're using.

Yes, bugs happen to everyone, and more so in C than some other languages. Yes, there are security problems that come from lack of memory safety. But simply in terms of productivity, I feel that C is a very good tradeoff for me.


I always, always programmed defensively and this happened to me very seldom during my decades as a C programmer.


Language fatigue and settling on C seems to be a thing, as I've encountered several people with the same opinion. I wonder what the psychology is behind it.

For me, C is the lingua franca. With C I can read kernel source, I can read systemd source, I can read firmware, boot loader, etc. I want to specialize in the language that gets me the most bang for my buck. Rust is a compelling future, but it's still the future and not the present, and the present doesn't seem to be going anywhere fast.


I tried Rust very briefly, a couple times building some random projects. Each would download and build around 500 (!) dependencies. A few of them failed to build on my machine.

Another time I decided to try and get OpenGL running on Win32 with Rust in an evening. I failed to find a satisfying way (free of boilerplate and magic incantations) to do it. Probably interfacing with the system isn't that easy and/or you're supposed to use specific wrapper crates. Don't remember the details anymore, but it's definitely true that existing infrastructure has a lot of inertia. What Zig is doing in that space is a smart move - it has a C compiler built in, and if I understand correctly it lets you interface with C system headers pretty seamlessly.


> I refuse to do things that many don't even give a second thought - such as, do I really need to implement the Into<T> trait and use x.into() to do a conversion? Isn't it better if I just write the function that I'm using explicitly?

You crazy bastard. I can see you're not going to get promoted by power-hungry bosses anytime soon! Seriously, I love C. As a webapps guy I picked up Go for my latest projects and have loved it. Feels like the C compilers of yore, that ran fast and had straightforward semantics, even in the tooling.


Even compared to Golang where you can download a bunch of modules without writing it yourself?


So far, finding well-established C 'modules' (libraries, e.g. glibc, musl, OpenSSL, curl, etc.) has not been a problem for me. There are probably more good-quality C 'modules' than for other languages; they're just not tied together with a package manager like Go and Rust provide.

Conan (or even the lightweight tool called clib) could help, but I have not tried them, as I don't feel the need yet.


C forces you to either reinvent the wheel or use third-party libraries. While that's fun and all, I wouldn't say it makes you more productive. How is that any better than using C++ with its STL?


The best libraries win out, whereas with C++ you have a huge stdlib but are stuck with it.


How does one handle strings in a cross-platform way in C?


I'd argue that though they are not directly C conferences, there _are_ definitely conferences where C developers come together.

* https://lpc.events/

* https://fosdem.org/

* https://all-systems-go.io/


There’s plenty of conferences for embedded developers as well. I would argue that having a conference around a job function makes a lot more sense than having a conference around a tool, which is what a programming language is.


That’s what the article says too.


> The closest thing we have to a C conference every year is CppCon. A lot of CppCon isn’t really just about C++, and the subjects of many of the talks are easily applied to C, since C++ builds so much upon C. In a sense, a subset of CppCon could be considered a C conference. That’s what I’m looking for when I watch the CppCon presentations each year on YouTube.

That's right. C as a community got subsumed into C++ in the '80s/'90s/'00s.

same happened with The C Users Journal: https://en.wikipedia.org/wiki/C/C%2B%2B_Users_Journal


They call them security conferences nowadays.


There is no money in a C/C++ Conference. Conferences are held so that the organizers can make a profit from tickets and booth fees.


There are plenty of C++ conferences though.


Not necessarily, companies sometimes organize a conference at break-even (or at a loss) with motivation to build a name, attract clients, for hiring, etc.


C++ at least has their flagship CppCon no?



Yes, C++ has CppCon, so that conference is making money for the organizers.


CppCon is organized by the Standard C++ Foundation (and a ton of volunteers), which is a non-profit organization. When I was involved quite a few years ago, I believe they were making a modest "profit" on the conference which I am pretty sure was all wiped out during COVID due to the hotel commitments they enter into.


While the title may say otherwise, the article is worth a read just for the talk recommendations.


> Second, C has a tendency to be conservative, changing and growing very slowly. This is a feature.

That's it. I can pick up K&R and still be writing useful programs in a very short time.

Of course, in a lot of shops there are libraries that you have to use other than the standard one. I can imagine a conference about those. A boring conference.


Excited programmers who want to go to conferences don't care about C anymore, IMO


I'd love to go to a C conference, not gonna lie. But I've been a graybeard ever since I was a child.


Born with a manual in one hand and a PDP-11 in the other.


You kids with your shiny PDP-11s.

Me, I had to start out with a machine that didn't even have load or store instructions (CDC 6600).


Out of curiosity, I checked Wikipedia, which says "In the 6600, loading the value from memory would require one instruction, and adding it would require a second one", so I'm curious what the distinction is if that's not a load instruction.


The address registers were paired up with the data registers, so modifying A1 would cause an implicit load into X1, for example. A6 and A7 modifications caused implicit stores from X6/X7. So with careful register assignment and some offset initialization of address registers beforehand, array processing loops could be encoded in remarkably few instructions.


Couldn't get to sleep without hugging my K&R.


I don't know, PDP11's are pretty heavy. That'd be one flat hand.


There's always the LSI11 for those of us with .. compressible appendages.


Or the converse: People who care about C don't want to go chit chat about it every year.


Some of us fall under that category. But I guess we're not that many.


That's a very measured response; I'm sure you're right and that's not just a really biased view /s


Related:

Why Aren't There C Conferences? - https://news.ycombinator.com/item?id=18504879 - Nov 2018 (372 comments)


No interest in paying for pointers.


TBH, I always wonder why we need conferences at all. Anyone who wants to share something can always put up a YouTube video. If a committee wants to discuss something, they can always pull up a Zoom meeting. I'm not sure why they exist at all except for people who want to travel and sell things/ideas.

But since I have never attended any conference, I must be missing something important here.


The valuable thing about conferences is the “hallway” chats and networking opportunities. If you just want to listen to talks then you can pretty much get that online, but running into other like minded people and grabbing a beer together is a fun experience.


I've been to plenty of conferences, and I agree with you. A talk seems a poor way to consume information compared to a blog or a video. Combine that with seeing multiple talks on the same day, you basically retain nothing.

The best you can do is note which talks seemed interesting and then do more research on the topic later on.

People say they're good for networking, but I find it has the same problem: you end up meeting a firehose of people for moments at a time, and it becomes near impossible to remember which to follow up.


Networking and selling yourself as consultant, book writer, library author,....


All these talks about C:

* https://startupstash.com/c-cpp-conferences/

Always felt like the ACCU was the closest to a C conference, something that, as many mentioned here, does not really exist.


I don't think I saw a single C talk in the few years I've been to the ACCU conference.

I think there's just rarely anything new to say about C since it's so old and stable.


"Modern C and What We Can Learn From It - Luca Sas [ ACCU 2021 ]" - https://youtu.be/QpAhX-gsHMs

"Linux User/Kernel ABI: the realities of how C and C++ programs really talk to the OS - Greg Law" - https://youtu.be/4CdmGxc5BpU


There are already ISO/IEC JTC 1/SC 22/WG 14 meetings, as well as those of national mirror committees like ANSI X3J11; there isn't much point in having a conference outside this standardization process.


Lots of interesting links to talks related to C at the end of the article!


Even if I think the C syntax is already too rich, in the end it is absurdly simple compared to other abysmally complex syntaxes, like C++'s... yeah, probably not worth a conference.


If you look at the talk schedule for something like PyCon[0], not much of it is about updates to the language syntax and things of that nature. Most of the time is discussing applications of the language, packages, et cetera.

[0] https://us.pycon.org/2022/schedule/talks/


People using C don't have time for conferences because they have work to do.


I can't decide if this is a statement or a joke. I agree with both.


Undefined behaviour detected.


Since it is UB, I'm going to rewrite the parent comment to say "I like cabbage!".


Compiler says, "Approved!"


> Second, C has a tendency to be conservative, changing and growing very slowly. This is a feature, and one that is often undervalued by developers.

I really appreciate languages that do change slowly. One of the languages I really like is Kotlin, but it has the problem that there are new features every few months. This is a distraction most of the time and it leads to inconsistent code bases.


C programmers are anti-social. :)


Because it would take years to prepare for but would be run and completed in seconds?


The Embedded Systems Conference used to be a C conference to some extent.


For the same reason there aren't any oxygen conferences.


Give it a while.


Probably the closest there is:

https://accu.org/conf-main/main/


I'm not the one to give it, but I'd like to see an answer based on pure economics, because I think that's what drives most conferences.


I haven't been to a conference in 10 years now. Online talks seem much more convenient. Are in person conferences back after covid?


Coming out of the third work meeting since 7AM I am starting to think: Yes, we need more of those! The more the merrier!


> Why aren't there C conferences?

seems a question that could give birth to Chuck Norris style jokes.


There are conferences for products written in C.


C has competitions instead. IOCCC rocks!


There is no low-hanging fruit to talk about in such a mature and well-made language.


I think C puts the C in Cthulhu.

The C spec is a half generation behind the Common Lisp spec, which set the standard for how you specify languages like Java and Python. The K&R book is poorly organized, and the language contains various mistakes, such as the parser needing access to the symbol table, which still deforms the C++ language today.

It was minimal, it was viable, and it was in the right place at the right time so it was available on old microcomputers, 8-bit micros, MS DOS, 32/64-bit, web assembly. It competes and wins against assembly code on the AVR-8 (where it boggles my mind how many cycles C wastes following calling conventions in my simple Arduino programs) because I can compile a C program for a much better performing board.

So it is with us more than FORTRAN, PASCAL, COBOL, assembler, etc.


In case you're serious: A discussion about strncpy/strlcpy.. on LWN: https://lwn.net/SubscriberLink/905777/a6dba1b2ed54f04f/


Aside from the preprocessor and the C std library ... but they're the obvious less well made bits of fruit.


What a marvel of a well made language it is, with all its well made security holes and its exquisite null corruptions, truly a thing of beauty.


imperfection is the key to perfection


For the same reason that carpenters do not have conferences about saws and hammers. It's an old tool with well-known deficiencies and it doesn't attract an idealistic, evangelical crowd that can write manifestos and narratives about it.


We absolutely do have those conferences! I've been to dozens.

Lots of idealists and evangelicals in the woodworking world as well, probably more so than any other world I've been in. Makes technology look fairly tame.


You've been to dozens of conferences about hammers?


They have hand held precision cnc machines that use computer vision to register to the work piece and compensate for human error and execute the file.

I can't wait for people to start asking stable diffusion or dall-e for ideas. "unusual danish modern canopy bed 85mm"

I'm sure they won't run out of advances to talk about. And people keep reviving old techniques that got ignored too, or ones from other cultures.


A conference about hammers is closer to a conference on arrays. But a conference on carpentry is closer to a conference on novel algorithms and memory structures.


But we're talking about neither. We're talking about a conference on C.


Seems like this particular branch is talking about the suitability of using woodworking for the analogy.


And this is why I love HN so much haha


They're about to release Hammer 2, it's gonna be a subscription.


Did you mean "an AI powered cloud based subscription in Rust?"

OK, maybe Rust isn't such a good idea here.


We have so many handles. Wood, metal, plastic. A variety of heads! Sand-filled (dead-blow), brass, plastic, hardened metal, soft metal.

A subscription to deal with the wear-and-tear is probably only needed for the larger shops.


Never saw hand-tool confs, but the mechanical woodworking confs surely have a surprisingly large number of new tools to show.

For hand tools I mostly saw Japanese woodworker competitions.


Can you share more details? This is a world I know nothing about.


Sure, here's one of the larger ones: https://handworks.co/

There are many which are less modernly advertised (usually paper/newsgroup/email) that will draw 100+ people easily.

Including planned events with participants in the dozens, I'd say you could count at least 100 a year. I'd consider that comparable, since that's the population of smaller tech conferences.


> Sure, here's one of the larger ones: https://handworks.co/

Not to rain on anyone's parade, but that event is about hammers just as much as embedded programming is about C.


> I've been to dozens.

Something like 'The 393rd International Conference on Hammers'? With people presenting about the latest developments in hammers and how they're using hammers in new ways? You've been to dozens of those?

Not sure I believe you.


The OP said saws and hammers, and there are conferences on hand tools which largely focus on saws, hammers and hand planes.

If we want to be literal then I don't know of anything on _just_ saws and hammers, but in the quite adjacent space (one or two hand carpentry tools also a focus) - yes.


Well yeah we do have general conferences on programming languages. But the point is we don't have a specific conference on C, like you wouldn't have a specific conference on a simple hammer.


Comparing programming languages to a specific tool is not the right analogy. A programmer can specialize in C, no carpenter specializes in hammering.


But that's the insight people are bringing up in this thread - nobody specialises in C.

Lots of people learn and use C as an every day tool. I'm a Java and Ruby programmer - but I have to work with C as an ABI and an extension language. Python programmers have to work with C as an extension language. There are DB developers who use C.

I don't know anyone who describes themselves as a 'C programmer' like you would 'Ruby programmer'.


John Henry specialized in hammering... He was a steel driving man!


But tech also has conferences on specific languages. Seems like a JS, C#, C++, Java, python, etc. conference could be analogous to hand tools for carpentry.


Exactly - and there isn’t one for C. That’s the whole point of the thread.


There's definitely manifestos and narratives about it, but it's lumped under manifestos / narratives about "simplicity" and "the unix way". As for evangelicalism, you're probably right about its absence.


https://www.ulanetwork.com/calendar/association-of-wall-ceil...

Carpenters do have conferences. Even groups of carpenters as small as those in NY apparently!


Conferences about carpentry in general != conferences solely about hammers.

There are plenty of systems programming conferences where the majority of work being presented is written in C. That is very different from a conference about C itself.


I'm SO enjoying all the comments about "yes, there are saw and hammer conferences." I'll have to ask my HVAC technician neighbor if he's ever been to one.


I doubt it is a good idea to work on an HVAC with a hammer.


Another pedantipoint for you. Save 'em, collect 'em, trade 'em.


I'd assume the average practitioner who attends systems conferences uses C as one of their primary languages. It seems like it would be redundant to have a language-specific conference when it would overlap so much in form and audience with the systems-oriented ones.


Even old well-known tools can be optimized and reimagined; just take the Plumbus and its recently upgraded Plumbus X!

https://youtu.be/JGaBU5cKluU


Saws and hammers probably aren't the best example. We complain about our trusty old tools far more than I'd imagine most carpenters complain about an old chisel for example.


Well, they both have vices, and offer plenty of opportunities to leak memory or blood, so I guess there's that.


Wow... the first comment from the last time this was posted is similar to this one. He even mentioned a hammer.

https://news.ycombinator.com/item?id=18505081

It seems like even HN comments could be generated by an AI.


I remember back in 2012 or so someone made a really clever karma-farming bot on Reddit. Here’s how I (foggily) remember it working:

- It subscribed to subreddits like /r/pics and /r/funny that largely consisted of posting links (not text posts)

- When it calculated that a post was rocketing toward the front page, it would look at all the past times the URL had been submitted.

- If none of those had made it to the front page, it would find the most upvoted top-level comment, copy the text verbatim, and make the same comment on the post that would end up on the front page.

For the longest time, everyone just thought that this account was some super-interesting, super-funny person who always had the perfect joke or perfect comment for any given situation. Sure it was a bit odd that they never replied to anyone, but that also just felt like part of their mystique.

Then someone got the receipts and outed the account as a bot and the show was all over. I wouldn’t be shocked to see someone on HN do the same thing, but I also think HN isn’t big enough for a grift like that to pass by unnoticed.


There was a blog post that reached the front page here on HN a few months ago. IIRC, only one commenter suggested that "the post is so nonsensical as if it was written by GPT-3, no solid arguments just mishmash of phrases"... surprise, it was written by GPT-3.

(BTW, I'm not suggesting anyone here is a bot, obviously)


I used GPT-3 to generate responses to someone on an old account who argued a post was GPT-3 generated. It was pretty funny: it fooled them, though smaller texts are usually harder to detect as bot-generated.


Not just HN comments.

We are all sets of the same memes, recurring over and over.

(A cool and weird experiment: try playing several shows all at the same time. There'll be eerie moments when the audio syncs up, or one show surreally reacts to another.)


That just seems like the birthday paradox rather than anything truly eerie: as you add more shows, the odds that two won’t have something going on at the same second decreases.

(Also, ad breaks represent somewhat standard points at which shows would break up programming.)


I'd be very surprised if there wasn't at least one GPT bot practicing in the comments through a series of accounts being trained on upvotes.


The HN NPC.


Been noticing a trend where conferences have gone highly woke. The last one I attended had a young woman berating all the men and demanding we apologize for things others have done. Complaining about it got one person banned.

DHH being banned by a Ruby conference is another example.

I went to be part of cool tech, not radical politics. I haven't done another conference since.


What I remember hearing about Ruby conferences is that they were mainly attended by young, socially deprived people, so e.g. there were complaints that everyone wanted to play Werewolf instead of going to talks.

On the other hand, DHH seems to have a bad temper, so banning him might work out.


Conferences aren't worth it.

You can learn anything for free on the Internet.

If you're going there for networking with others then stop. You can't network with blue haired women with daddy issues or men who never graduated from kindergarten.


If this is how you think academic conferences are, I can't help you. (Go ahead, go "learn" the forefront of research in a field of say mathematics "on the Internet".)


I was referring to tech conferences like PyCon and not academic ones.


To spare others a search, 'DHH' appears to be the person credited with creating Ruby on Rails


And being "banned" appears to mean "he wasn't invited to do the keynote presentation this year".

> Hi David,

> Hope you’ve been well.

> With you having been mostly offline the last year, the program committee has decided it would be valuable for the community to start sharing the opening keynote stage with other contributors. We have a few in mind but if you have any suggestions of people who have been impactful this year, please share them.

> If you have any questions, please let me know.

> - Evan


> appears to be the person credited with creating ruby on rails

That is a very strange way to say "is the person who created Ruby-on-Rails"


No shade intended, phrasing reflects the minimal diligence I put into decoding the acronym.


I wonder if anyone can provide an example where collective guilt was meaningful or productive in any way.


The guilt of original sin driving individuals towards redemption via organized religion thereby allowing increasingly large, stable civil structures in Europe.


Augustine and a clear exposition of original sin came in the late 4th to early 5th century, about four hundred years after the Roman Empire had already effectively integrated quite a bit of Europe and Asia into a diverse and sophisticated civilization that wouldn't be seen again for another millennium.

Around six hundred years after Augustine, Europe had William the Conqueror and the Holy Roman Empire, I guess? Five hundred years after that, Europe had the Renaissance, based on a reclamation of the heritage of the ancient world.

I don't think original sin did a very good job of achieving the integration of anyone.


https://en.m.wikipedia.org/wiki/Original_sin says "The belief began to emerge in the 3rd century..." which is early enough to establish mild causality for Emperor Constantine. Remember, he didn't have to believe it only to suspect that it would be useful.

The whole thing was meant to be tongue in cheek. :)


I hadn't thought in terms of collective guilt including everyone



