I'm always a fan of 80% solutions implemented in C. I think the library is pretty nice, and as someone who's experimented with trying to add quality-of-life features to C99 with macros, I think the implementation is actually pretty clean and readable as far as these kinds of implementations go.
That being said, I think the ergonomics of the approach taken by Sean Barrett in stb_ds is much nicer than the #define ALG_TYPE, #define ALG_PREFIX approach, although I'm sure it wouldn't be a big stretch to make something similar that was compatible with it.
My feeling is that you could get pretty far with something like Sean Barrett's stb_ds + Salvatore Sanfilippo's sds, and then a generic algorithm library like this one.
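Roughly, the contrast between the two styles looks like this (arrput/arrlen/arrfree are real stb_ds macros; the parametrizing macros follow the convention discussed in this thread, while the header name and the generated function mentioned in the comment are just placeholders, not the project's confirmed API):

```c
/* stb_ds style: one generic dynamic-array API for any element type,
   with no per-type setup (arrput/arrlen/arrfree come from stb_ds.h). */
#include "stb_ds.h"

void stb_ds_style(void) {
    int *v = NULL;
    arrput(v, 1);
    arrput(v, 2);
    for (int i = 0; i < arrlen(v); ++i) { /* use v[i] */ }
    arrfree(v);
}

/* Parametrized-include style: each instantiation stamps out a set of
   functions specialized for one type. (Header and generated function
   names below are illustrative only.) */
#define ARRAY_ALG_TYPE int
#define ARRAY_ALG_PREFIX intv_
#include "array_alg.h"
/* ...after which something like intv_unique(int *first, int *last)
   would be available. */
```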
I think the difference is that stb_ds is a data structures library and this is for algorithms. This algorithm library will also look like a completely regular C library if you just use separate .h and .c files to instantiate the declarations and the implementation; I think it's very clean.
It would be interesting, but I prefer the approach in this post, because the templating macros are isolated into a "module" and don't get spread across the user code.
I think our projects are completely different and don't have the same goals.
1. You don't implement the most important algorithms. For example `stable_sort` for arrays. So right there, it's not even a substitute.
One of my motivations from the start was simply to have `stable_sort` in C (to implement database queries); there's a small sketch of why that matters at the end of this comment.
2. Yours is a clever implementation of C++ features using macros. But it's not idiomatic C. If someone (or their team) wants to use your system, they must learn your macro language.
My file can be included by someone who only knows C arrays, and it will not intrude on any aspect of their C programming style.
3. Mine is intentionally not a data structure library, and avoids making allocations.
There are countless data structure libraries for C. Very few catch on. That's because C programmers tend to manage the life cycle of data themselves. Furthermore, data structures are interwoven with their data, as opposed to being separate containers.
Besides that point, I see no advantage to yours over other data structure libraries. For example, the best C hash map implementations are written for uint64 keys. Fixing to a concrete type allows for tremendous optimization.
I applaud the efforts in your project and wish it success. Our projects happen to draw on similar source material, but that's about it.
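To illustrate point 1, here is a minimal, self-contained sketch of why stability matters for query-style ordering. (Insertion sort is used only because it is short and stable; it is not this library's implementation.)

```c
#include <stdio.h>

typedef struct { const char *name; int age; } row;

/* A stable sort keeps rows with equal ages in their original relative
   order, so sorting rows that are already in name order by age yields
   ORDER BY age, name. qsort() makes no such promise. */
static void stable_sort_by_age(row *first, row *last) {
    for (row *i = first + 1; i < last; ++i) {
        row key = *i;
        row *j = i;
        while (j > first && (j - 1)->age > key.age) {
            *j = *(j - 1);
            --j;
        }
        *j = key;
    }
}

int main(void) {
    row rows[] = {{"alice", 30}, {"bob", 25}, {"carol", 30}, {"dave", 25}};
    stable_sort_by_age(rows, rows + 4);
    for (int i = 0; i < 4; ++i)
        printf("%s %d\n", rows[i].name, rows[i].age);
    /* prints: bob 25, dave 25, alice 30, carol 30 */
    return 0;
}
```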
Well, you do support only vector containers, by having the limitation of first++ as the only iterator increment.
Your name "array" clashes with the C++ array, which is a compile-time sized vector. And your array is unsafe sized, as last is the only size provision, even unsafer than C++ with its unsafe iterators. With C++ you can it least still bounds check against the vector size.
Your variant being static is indeed nice; the STL and CTL cannot do that, only MTL exists for that, and it supports all containers.
E.g. non-vector types allow for much faster inserts.
You can only replace in situ, which is the opposite of functional. No backtracking. Manual copying is needed. The spirit of the STL algorithms is to allocate, not overwrite.
Re hash map: this is not even finished. For now it's just a pointer-stable unordered_map with optimizations for all types, primitives or strings. An optimized Swisstable (string) and Stanford hashes (u64) are in the works (no time for that), and are indeed multiple times faster.
> And your array is unsafe sized, as last is the only size provision, even unsafer than C++ with its unsafe iterators.
Can you explain more? C++ <algorithm> is intended to be used with C pointers as well. The iterator concept is an abstraction of a pointer.
> You can only replace insitu
I'm not sure what you mean. The replace/remove/unique work just like the C++ versions. The STL recognizes there are cases when you want to work in-place, and others when you want to use a separate buffer. That's why it provides multiple versions.
> the spirit of the STL algorithms is to allocate, not overwrite.
I don't believe this is true. It's designed to separate memory allocation concerns from the algorithm. That's why a lot of them require you to provide a buffer satisfying some requirements.
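As a concrete illustration of that distinction, here are the two shapes in plain C over a [first, last) range (a sketch, not this library's or the STL's actual code):

```c
/* In the spirit of std::remove: compacts the kept elements to the front
   of [first, last) in place and returns the new logical end. */
int *remove_value(int *first, int *last, int value) {
    int *out = first;
    for (int *p = first; p != last; ++p)
        if (*p != value)
            *out++ = *p;
    return out;
}

/* In the spirit of std::remove_copy: leaves the input untouched and
   writes the kept elements into a caller-provided buffer instead. */
int *remove_copy_value(const int *first, const int *last,
                       int *dest, int value) {
    for (const int *p = first; p != last; ++p)
        if (*p != value)
            *dest++ = *p;
    return dest;
}
```

Neither version allocates; the caller decides whether to reuse the input storage or supply a destination buffer.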
Arguably void* is the more idiomatic C implementation, but as you said in the README, compilers still have trouble optimizing this (mostly due to bad decisions regarding function cloning).
On a more constructive side, I think adding some value by giving the poster some feedback or engaging in a conversation about the topic would be nice; otherwise it just sounds like an attempt to show off.
I'm not a fan of these "header only" projects, nor am I a fan of overly clever code, because even though it's fun and rewarding to write, it's not a great fit for larger or collaborative projects, where consistently adhering to a coding style (hopefully one of the secure ones) and favoring readability over "coolness" are the way to go.
People love to blame C for all of its footguns but not a lot of people like to write "boring" code which kinda sucks...
Please don't take this personally; even though I have to admit that I felt somewhat irritated upon reading your comment, it is by no means my intention to attack you with this rant.
I think it's not polite; if you just want to show something cool you did, you could create your own thread. You don't interrupt a street musician and start playing in front of the crowd just because you think you are a "better" artist or that your version of the song is cooler, right? It's common courtesy.
On a side note, I looked at both codebases, and even though I've already said that I wouldn't use either of them, I have to say that I find rurban's to be somewhat messier and uglier, so I have to disagree with you.
Your attitude may apply to some situations, but in general preliminary judgements kill innovation. Next time, after reading a comment like yours, someone somewhere may remain silent without showing off their art, and the world will be missing the next Copernicus, Gauss, Wozniak, or Stroustrup.
If I have to choose between "polite" and "innovation", I choose "innovation" any time of day. It's the same kind of difference as between "dissing" someone for no real reason and letting someone just be and "blossom" so they can share their enlightenment with everyone, and thus benefit us all.
Well, you know, there is a way to share your work without putting others down or being dismissive.
It's really valuable (and sometimes even enjoyable) for others to read (or even participate in) a conversation between two (or more) knowledgeable individuals who share their perspective and explain why they made the choices they made in their implementations (hint: here's where he could share some of their work!).
In fact, this is the polar opposite of that and it's just sad.
I feel like rurban's comment is more likely to prevent someone from showing off in the future. Like what's the point when someone is just going to drop by and say "Mine is better, <link>" and then peace out with no constructive discussion?
I wonder why it's such a common pattern here that every time someone shows their product, someone in the comments will show their own version in a not-so-implicit patronizing tone.
> I wonder why it's such a common pattern here that every time someone ...
It's only common for normies (OK, I guess that is a lot) to expend too much energy looking for emotional content. Just pay attention to the facts: here's a library/framework, here's another library/framework. See? Now you know about two. Want emotion? OK, one guy seems proud of his, another guy maybe too proud of his? So what; you be the judge, look at the code.
That is very emotional on your part, using borderline insults.
You don't need to spend any energy looking for emotional content when it's right out front: an ego-inflated "mine is better" without any constructive feedback.
> STL algorithms typically operate on half-open ranges bounded by iterators [first, last). This convention is not used as often in C
I don’t think this is true. p != end is a very common check in C?
Regarding composition, I suppose it's kind of true. The string functions in the C standard library, for example, unfortunately often return NULL instead of end. I use functions that return end instead, which makes C code much more elegant.
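For example, a find that reports "not found" by returning end composes cleanly with the usual p != end loop, unlike a strchr-style NULL return (a sketch, not this library's API):

```c
#include <stdio.h>

/* Returns a pointer to the first occurrence of value in [first, last),
   or last if it is not present. */
const int *find_int(const int *first, const int *last, int value) {
    for (const int *p = first; p != last; ++p)
        if (*p == value)
            return p;
    return last;
}

int main(void) {
    int v[] = {3, 1, 4, 1, 5};
    const int *end = v + 5;
    /* The "not found" result is the same sentinel the loop already
       checks against, so calls chain without special-casing NULL. */
    for (const int *p = find_int(v, end, 1); p != end;
         p = find_int(p + 1, end, 1))
        printf("found 1 at index %td\n", p - v);
    return 0;
}
```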
Nice work! Have you considered combining ARRAY_ALG_TYPE and ARRAY_ALG_PREFIX into a single function-like macro? ARRAY_ALG_TYPE_PREFIX(int, intv_), for example, could set up all the other macros. I would also recommend adding an ARRAY_ALG_ prefix to the NAME1, NAME2, NS, and T macros so that you don't redefine any previous macros with these names.
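One wrinkle with that suggestion: a function-like macro can't expand into further #define directives, so the closest equivalent is probably a single pair-holding macro that the header splits back apart internally. A rough sketch of how that could look (not the project's actual code; the helper names here are made up):

```c
/* User side: one macro supplies both the element type and the prefix. */
#define ARRAY_ALG_TYPE_PREFIX int, intv_
#include "array_alg.h"

/* Inside the header, the pair could be split with the usual extra level
   of macro indirection, so ARRAY_ALG_TYPE and ARRAY_ALG_PREFIX keep
   working unchanged: */
#define ARRAY_ALG_FIRST_OF_(a, b)  a
#define ARRAY_ALG_SECOND_OF_(a, b) b
#define ARRAY_ALG_FIRST_(pair)  ARRAY_ALG_FIRST_OF_(pair)
#define ARRAY_ALG_SECOND_(pair) ARRAY_ALG_SECOND_OF_(pair)

#define ARRAY_ALG_TYPE   ARRAY_ALG_FIRST_(ARRAY_ALG_TYPE_PREFIX)
#define ARRAY_ALG_PREFIX ARRAY_ALG_SECOND_(ARRAY_ALG_TYPE_PREFIX)
```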
It's a lot like some of the other decisions by the standards committee; they see demand for a particular feature $FOO, then add in support for something not quite like $FOO, which maybe addresses part of what was wanted but turns out to be useless in practice, so few use it.
They added `const` (which, admittedly, does get a lot of use in practice), but it's subtly different from what already existed in C++ at the time.
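For instance, a small sketch of one such difference (nothing to do with the library under discussion):

```c
const int N = 16;

/* int file_scope_buf[N];    C++: fine, N is a constant expression.
                              C: a compile error, because a const object
                              is not a constant expression.            */

void f(void) {
    int local_buf[N];   /* C++: an ordinary fixed-size array.
                           C99: silently a VLA, sized at run time. */
    (void)local_buf;
}
```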
They added VLAs, then had to walk that back because, like `gets()`, there was no practical way to use them safely.
They added _Generic, which requires the typenames to be known in advance.
They added threads, but without wide enough functionality to use them for anything but the most trivial of programs.
All in all, they've been doing a poor job of steering the language in the last 20-odd years since C99. What users have been clamouring for was "less UB, please? Make it IB where you can", but instead they have been adding UB.
In the words of Linus Torvalds: "Yeah, let's just say that the original C designers were better at their job than a gaggle of standards people who were making bad crap up to make some Fortran-style programs go faster."
Original K&R C had no reference to undefined behaviour. The worst you could say about C at that point in time was that some things were defined by the implementation.
[1] Some might say more than a little bit :-/ You can't use it to write type-safe generic libraries.
> Original K&R C had no reference to undefined behaviour.
You don't have to explicitly refer to undefined behavior to have it. If you just merely omit defining the behavior of certain programs then that's enough.
> You don't have to explicitly refer to undefined behavior to have it. If you just merely omit defining the behavior of certain programs then that's enough.
I respectfully disagree. It can be implementation defined, as in all other languages.
In fact, when I started with C, I did not use a C89 one; if we weren't certain what a certain construct did, we'd try it on a compiler and use the result as a definition for what that construct did on that compiler!
On another compiler it might do something else, but that didn't bother us until it was time to port the code to another compiler.
With undefined behaviour being added, that no longer holds: you try something out, get a particular result, but the next time you compile that construct you could get a completely different result.
> In fact, when I started with C, I did not use a C89 one, if we weren't certain what a certain construct did, we'd try it on a compiler and use the result as a definition for what that construct did on that compiler!
This still doesn't mean that the behavior was implementation defined. Implementation-defined behavior requires the compiler vendor to document it. However, implementers are allowed to define and even document the behavior of operations that the standard leaves undefined. These are called extensions. It's common practice to have some extensions even now, sometimes behind specific compiler flags, or just documented.
I think it's a mistake to infer the compiler's behavior through trying a few examples, even for simple compilers. If you want to have guarantees for the behavior for otherwise undefined constructs then get it in writing from the compiler vendors. Even if you check the source code, the behavior might change in a future version.
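A classic example of why "try it and see" can't pin down undefined behavior is signed overflow (not specific to any one compiler):

```c
#include <limits.h>
#include <stdio.h>

/* Signed integer overflow is undefined behavior. Built without
   optimization this may appear to test wraparound and print 1; an
   optimizer is equally entitled to assume x + 1 > x always holds and
   compile the function to return 0 unconditionally. */
static int next_is_smaller(int x) {
    return x + 1 < x;
}

int main(void) {
    printf("%d\n", next_is_smaller(INT_MAX)); /* 1 on some builds, 0 on others */
    return 0;
}
```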
With _Generic, the original library implementor has to accommodate your type. If it's a "template", the user can insert their type, including a type that's created long after the original library author has finished their work.
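For example, a _Generic-based "generic" function is a closed set; the macro's author has to enumerate every supported type up front (a standard-C sketch, not this library's code):

```c
#include <math.h>
#include <stdlib.h>

/* Every dispatchable type must be listed here by the macro's author.
   A type a user defines later cannot be added from the outside; with a
   template-style include, the user instantiates the code for their own
   type themselves. */
#define generic_abs(x) _Generic((x), \
    int:    abs,                     \
    long:   labs,                    \
    float:  fabsf,                   \
    double: fabs)(x)

/* usage: generic_abs(-3) calls abs; generic_abs(-2.5) calls fabs. */
```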
"header-only" is the closest thing C has to a (good) package manager. The build ecosystem is terribly antiquated, so people use header only as a way to skip the integration step
I assume you are not familiar with modern C, but I think it's a good opportunity to explain.
A single header style allows you to customize the library with the preprocessor before `#include`. I use this ability to implement generics. Otherwise, you're stuck with void* or code generation (want my python script instead?).
However, you don't have to forgo the benefits of separate compilation units. You can include the declarations in a header, and the implementation in a separate C file. No other parts of the code base will be impacted.
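Concretely, that layout might look like the following (the parametrizing macros are the ones discussed in this thread; the header name and the ARRAY_ALG_IMPLEMENTATION switch are placeholders for whatever the library actually uses):

```c
/* int_algs.h -- declarations only; safe to include from any file. */
#define ARRAY_ALG_TYPE   int
#define ARRAY_ALG_PREFIX intv_
#include "array_alg.h"

/* int_algs.c -- the one translation unit that emits the definitions;
   compile this once and link it with the rest of the project. */
#define ARRAY_ALG_IMPLEMENTATION   /* placeholder name for the switch */
#define ARRAY_ALG_TYPE   int
#define ARRAY_ALG_PREFIX intv_
#include "array_alg.h"
```

Every other file just includes int_algs.h and never sees the preprocessor setup.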
That isn't modern C. It's the cool kids backporting a misfeature from C++ because that's what all the game devs do. Header only serves no purpose in C other than to demonstrate that you can't drive your tooling. Plus you've created a new burden to keep track of the magic define that activates the definitions in the code and the one file where it's invoked.
KISS is a feature indeed.
I like to think that I can drive my tooling, and I do appreciate not having my choice of build system being affected by this library.
I understand that some libraries are themselves so big and complex that they need a build system. Each library I want to use that doesn’t is a blessing.
The issue here is people too lazy to tell their compiler where to find a header file so instead they cook up a bespoke system that is specific to every library using this approach.
That is not why this project is header-only. It's so you can include it multiple times to produce multiple specialized types; it has to be header-only so it can produce a series of structs and functions for each specialized type into your compilation unit, with the compiler separately checking each produced specialization. You configure it before including using a series of preprocessor defines. It's a common way to handle generic types in C.
This can't be a single separate compilation unit that you link in; the generic specialization doesn't work that way. You seem to be talking about libraries that are header-only purely for convenience but this is not one of those projects.
With a library like this, you'd want to have one .c file in your project that produces all the implementations for your specialized types, then link that with the rest of your project. From a quick look at the doc, this project supports this.
> one .c file in your project that produces all the implementations for your specialized types
That depends on the compiler, the linker, whether linking is static or dynamic, and resource concerns. If dynamic, the entire specialized/monomorphized library could be loaded for a single function, compared to static linking with "-O2 -flto", which removes unused functions.
If I am right, in C, it is not possible to implement fast type-safe generic algorithms without stuffing everything in headers. libc's qsort can be implemented in a separate .c file but it is a lot slower than std::sort as a result.
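The usual explanation: qsort's comparisons go through a function pointer and its element moves go through a run-time size parameter, neither of which the compiler can specialize away across the library boundary (a small sketch, nothing specific to this project):

```c
#include <stdlib.h>

/* qsort sees only void pointers and an opaque comparator, so every
   comparison is an indirect call and every element move copies
   `size` bytes. */
static int cmp_int(const void *a, const void *b) {
    int x = *(const int *)a, y = *(const int *)b;
    return (x > y) - (x < y);
}

void sort_ints_with_qsort(int *v, size_t n) {
    qsort(v, n, sizeof *v, cmp_int);
}

/* A sort instantiated in a header for `int` sees `a < b` and int-sized
   moves directly, so the comparison can be inlined and the loop
   optimized, which is where std::sort's speed advantage comes from. */
```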
pjmlp is (was) a head of commercial development in a big corp that has a natural tension with rival compiler technologies. I think such a disclaimer might be more ethical, but it's up to your preference.
I feel sick every time I see C++ related stuff; simple & fun programming got so complicated just because some big-name "committee members", who insist on spending 30 years standardizing networking/filesystem/reflection, refuse to simplify their crap. Yes, they haven't got that completed yet.
I guess we can wait another 30 years and there will be a C implementation of the C++ reflection or networking?
I have nothing against progress itself (although I'm not sure why you brought social progress into this), but not everything has to be everything. XML parsers are useful, but surely you wouldn't want one included in libstdc++.
And I'm not saying there isn't room for improvement. There are lots of genuinely useful features C is missing, especially regarding static analysis. I would love to have a proof assistant that can integrate with a C/C++ compiler to prove equivalence of functions, so you could write an obvious version and then transform it step by step to an optimized version, which is guaranteed to have the same observable behaviour.
> Being wary of change is usually the excuse to perpetuate awful systems in both technology and society.
Do you have any evidence to back that "usually" claim? I'm from a country that went through the horrors of malicious forced social change (under Communism), and most people who warn against it just don't want it ever to return.
> I'm from a country that went through the horrors of malicious forced social change (under Communism)
I call communism cancer, let's be crystal clear about that first.
Because of such permanent hate towards communism, I am deeply concerned as more and more countries are developing into de facto communism. Huge amounts of $ were handed out to the public during COVID-19, and in normal days you also see nonsense like public housing and so-called free health care, which are actually covered by other hard-working average taxpayers: they are forced to cover other people's problems. In the end, many people got brainwashed into believing that there is a state that looks after everyone. Such a typical communist way of thinking is horrible at best.
I'd be willing to bet that your country is just one of the ones described above. You should be very concerned. Communism isn't dead, it just got itself a slightly different skin.
What you're describing is socialism. It can be dangerous if it goes too far, but it's nothing like communism.
In practice, the biggest traits of communism were (off the top of my head, I might be missing something):
1. No private property of the wealth-generating kind. The land, the companies - they can only be owned by the state.
2. No families. The whole country is one big commune, and there's no reason for a mother to favor her own child over some stranger in Novosibirsk 5000 miles from her. (This was too radical though; it turned out that even the greatest oppression and terror cannot break the familial ties. The communists abandoned this idea within a dozen or so years of seizing power.)
3. No faith. Churches were demolished or turned into grain stores etc. The clergy was persecuted.
4. Independent thought is forbidden and can easily get you killed, either by a shot in the head somewhere in a basement, e.g. in the NKWD HQ in Moscow, or in one of hundreds of Gulags across the Soviet Union.
5. A corollary to the above - any independent action is highly suspicious. If you and two neighbors start tending to the local garden together, it's already suspicious, because you're starting an "organization" - and all forms of organizing are strictly controlled and must be introduced in a top-down manner. The reason behind this is of course to keep people extremely atomized, so that they're powerless against the state and will never rebel.
6. The country is a police state. The Czeka, later rebranded as the NKWD, and later rebranded as the KGB (the constant rebrandings were due to the atrocities that those people were constantly committing), is the most powerful organization in the state. They have informers everywhere, likely among your family, and most likely among your colleagues at work or school.
7. Traditions are dangerous and must be destroyed. The Soviet Union displaced many of its nations (yes, entire nations) within its borders, to uproot them and weaken them. Once people's ties with the land were gone, it was easier to make them into obedient workers in the state's companies.
8. The state used people as fuel. Especially during the 1920-1945 period, Russia had more people than other resources, so the authorities turned millions of people into slaves, who worked without compensation on the great industrial projects across the Gulag (building river canals in the Arctic regions, building railways through Siberia, mining uranium in -60 degree weather, etc.). To save money and accelerate the country's growth, the slaves were malnourished by design, receiving less than 50% of the calories they needed. This resulted in them slowly burning their own tissues, which resulted in grown men weighing as little as 80 pounds after a couple of years of slave work. This was extremely similar to Auschwitz and other German concentration camps, with the exception that the Soviet authorities were predominantly doing this to their own people.
9. Terror. During periods of intensified terror (which could last years), no one was safe. You could be arrested at any point under the most frivolous pretense (e.g. not clapping long enough after a speech by a political leader), and many millions were. The arrest would most likely lead to you ending up in a Gulag as a slave and fuel for the industrial machine. What's worse, your family, petrified with fear over their own lives, would distance themselves from you; e.g. wives often denounced their arrested husbands and broke off all contact. Even if you survived the Gulags and your "sentence" was carried out and you were finally free, you were often still a pariah. Your biggest hope was for the current leader of the country to die, and the next one to declare him an abuser, which in practice led to an official pardon of people sent to the Gulag under the previous regime.
10. The totalitarian aspect of the state and its ideology. Every aspect of life had to either originate from the state or at least be mediated by it. It's still present in e.g. North Korea, where a grown person is allowed to go to work, go to the store, go to the local party meetup (daily meetups after work are often mandatory), and perhaps go to see their immediate family, and that's it. If you go (as in, physically move on the street) anywhere else, it's suspicious, and the police can stop and question you.
The totalitarian aspect was also prevalent in intellectual life - if you were working on something that the state deemed politically benign and was hence permitted - let's say you were a university professor writing a biography of Michelangelo - you still had to write it through the lens of Marxism, or you'd be in trouble. Marxism had to explain events in Michelangelo's life, as well as the society around him.
BTW, it's a great shame that this is not widely known across the Western world. It would motivate people to avoid the horrors that could be unleashed if we go too far towards leftist utopian (in reality, dystopian) ideas. Unfortunately, the Western intellectual elites have decades of tradition of downplaying that, and of being in favor of radical leftism. Let's hope we won't see Gulags in the US in the 21st century.
If we're going to discuss the term, none of that is actual theoretical communism.
The founders of the USSR were adamant that true communism wasn't achievable without first going through all their bullshit committee rule authoritarian stages.
As for socialism, that's another heavily abused, overly broad term used to sweep all manner of not-actually-good-for-the-people systems under one label.
Many G20 countries have sound implementations of good social policy that in countries such as the USofA would be described as "far left", "socialism", or even "communism".
I'm no fan of the USSR, post USSR, CCP states .. but it feels very odd to call them communist when they're so much at odds with self-governance, local communal control, and so many of the things discussed as communism prior to the October Revolution.
Communism was what people wanted, a boot on the neck was what they got.
People wanted Communism in a great many countries. Easily a couple dozen of them? It ended in terror, or at least tyranny, in every single case. It shows that communism is actually impossible to attain and will morph into tragedy every single time. It makes sense: the introduction of communism requires a violent and radical revolution. People who are twisted enough to be willing to try that, the Robespierres and the Lenins, are actually tyrants at heart, and they will never let go of power once they attain it. This pattern has repeated over and over again through history.
You are conflating 'violent revolution' with 'communism'.
Typically ALL revolutions end badly, ala 'Robespierre'. NOT just communist revolutions.
Also typically, almost everyone in the USA, since the US had one of the very few examples of a successful revolution (geography helped), thinks that revolutions are a great thing, a big party, and that everyone should do it.
After/throughout WW1-WW2 a lot of people were oppressed, poor, and pissed enough to revolt, and during that period there were a lot of ideas we lump into 'communism'. So this ferment of anger, seeded with the common ideas of the time, led to a lot of communist revolutions.
But most revolutions fail; that isn't an indictment of the original ideas.
At the time, even Woodrow Wilson's 14 Points would sound like Communism to today's Americans. In a democracy, everyone gets a vote; in today's America, even that is too communist.
I don't know what you mean by evidence, but "things have always been this way" is too often the excuse to avoid change. Social change is not necessarily always good, but rejecting novel policies just because they are not currently "the norm" is not a good principle. Following that line of thought, the Eastern Bloc of the '90s would have never abandoned Communism because it was "the default" for them.
Keep in mind that when Boost features are eventually adopted into the C++ standard library, it is often only after their implementation has been strictly defined in ways that diverge from the Boost implementations, which are often inefficient. Boost contains a lot of full-featured but inefficient code; for example, some Boost tools allow for arbitrary memory allocation, which is often undesirable, and their standard library versions do not allow for it. There are other design issues as well, such as how to consistently handle exceptions in edge cases, etc. The Boost versions of some of these don't consider those situations as comprehensively.
Of course, but aside from standardisation defined details, it also depends on your compiler's standard library implementation, and those are not all made equal, especially if you've got to build something both for Windows and Linux.
Well, your compiler's standard library implementation should follow the C++ standard, and if your code expects what the standard requires, in most cases you should be fine. The problem is when code is written with certain behavior in mind that is not in the standard because the developer has gotten used to a particular compiler's liberties that extend beyond the standard, and then if they port to another platform, complications can ensue.
There is no reasonable solution to reflection. That we don't have an official solution here is an indictment of the C++ update process. Look to projects like Unreal Engine to see the horror and lengths that real projects have to go through to get around this.
I thought (and still think) Objective-C is quite nice. Kinda hoping my employer wants to port (wrap) its C++-based library to Objective-C (and Swift) in the coming year. I'd be the primary guy doing the work and I'd enjoy it :)
FWIW, I'm a non-Apple developer: never wrote a single line of code for any Apple device for production. Ported some of my libraries over to Mac OS about a decade back, but that was it.