Meson Build System 1.0 (mesonbuild.com)
150 points by TangerineDream on Dec 23, 2022 | 170 comments



Meson still won't have user defined functions, and gives a bad excuse:

> Meson does not currently support user-defined functions or methods. The addition of user-defined functions would make Meson Turing-complete which would make it harder to reason about and more difficult to integrate with tools like IDEs.

No, user-defined functions won't make Meson Turing-complete as long as recursion is not allowed.

User defined functions make it easier to encapsulate complicated build logic and reuse it.


Anecdote:

While backporting a new PipeWire release earlier this year, I discovered that it had added dependencies on new Meson features that didn't exist in Debian Stable. User-defined functions would have allowed me to shim those features on the older Meson version, greatly simplifying my work and minimizing my patches' footprint.

I don't know if that situation is common enough to justify user-defined functions to Meson's author, but it was frustrating, and it seemed ironic that a build system designed to solve platform compatibility problems had in some sense become one.


Yeah, that's a case where functions could be useful, but I think the real problem there is the imperfect backwards compatibility in Meson, which makes it unnecessarily hard to include new versions of it in stable distros.

The main culprit is GNOME packages because they were early adopters of Meson and thus had a few incorrect idioms that were cargo culted to a few dozen packages. For a similar issue in RHEL we decided to update Meson, but we needed to revert some changes that tightened the language. I do plan to include those changes in Meson in the near future, keyed on the minimum version in the "project" line, so that older packages are easier to build with newer Meson.

That said, did you consider backporting the new features to the older Meson in Debian stable?


> did you consider backporting the new features to the older Meson in Debian stable?

No, since modifying an additional project would run counter to the goal of minimizing the scope of the backport's changes. (Especially important when said project is a dependency of other packages, and could potentially break other things on the distro. This was Debian Stable, after all.)


It should be possible to install meson via pip --user. Even though I prefer system-wide installations, I believe this weakens your argument for user defined functions in your situation.


> I believe this weakens your argument for user defined functions in your situation.

It does not. This backport was not for a single user, but for deployment into the package repo used by multiple systems (including build systems). A one-off Meson build would be unsuitable for this case.

In any case, note that I didn't claim it was impossible without user-defined functions. It was simply messier and more painful, much as a special Meson build would be.


You can also run Meson directly from its source tree with no installation steps; that's what we do in some cases.
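
Something like this works (a sketch, assuming the upstream Git repository and its meson.py entry script):

    git clone https://github.com/mesonbuild/meson.git
    meson/meson.py setup builddir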


> No, user-defined functions won't make Meson Turing-complete as long as recursion is not allowed.

Turing-completeness is, itself, often a red herring. What it alludes to is the difficulty of predicting what a program will do because, in the general case, it is impossible to tell what behaviour a program in a Turing-complete language would exhibit for a particular input short of observing what it actually does. But even languages that are not Turing-complete often fall far beyond what is practically predictable. While theoretically decidable, predicting what a program would do in a language with nothing but boolean variables and loops of no more than two iterations, or, alternatively, functions of boolean parameters even without recursion, is PSPACE-complete (as such a language can easily encode TQBF [1]). I.e. predicting how such a program would behave is computationally intractable, since the cost grows exponentially with the size of the program. Programs in such a language would also fall outside the capabilities of heuristic methods, such as SAT solvers, to tackle (SAT is merely NP-complete, compared to the presumably harder PSPACE-complete TQBF).

So if your language has variables and loops (even limited to two iterations), or variables and functions (even without recursion) -- it's intractable, even though it's not Turing-complete. I don't know whether this particular language qualifies or not.

[1]: https://en.wikipedia.org/wiki/True_quantified_Boolean_formul...


The current language has loops and Boolean variables. It is trivial to reduce a TQBF formula into a Meson program.
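
For illustration, here is a hypothetical sketch of such an encoding for the formula ∃x ∀y (x or not y), using nothing but the booleans and two-iteration foreach loops that today's Meson already has:

    # One loop per quantifier; evaluation is exponential in the
    # number of variables, which is exactly the PSPACE argument.
    exists_x = false
    foreach x : [false, true]          # ∃x: some value must work
      forall_y = true
      foreach y : [false, true]        # ∀y: every value must work
        forall_y = forall_y and (x or not y)
      endforeach
      exists_x = exists_x or forall_y
    endforeach
    # exists_x now holds the truth value of the formula.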

The entire issue is a red herring.

They climbed up a tree and can't climb down without losing face.

The correct way to handle the situation is to make the change, wait for everyone to cool down, and then pretend that was their position all along.


If the only reason to use CMake is user-defined functions, I see no real reason to use it. The gains in readability of Meson are far greater than the gains from user-defined functions, in my opinion. I maintain a large open source library and haven't once felt the need for user-defined functions.


The excuse is still wrong, and indicates complete misunderstanding of what Turing-complete means.

I write plugins for a large, well known application.

Building the plugin is a complicated multi stage process with many dependencies.

I've spent a lot of time understanding the process (undocumented, only VS and Xcode project files are provided in the SDK).

I now have a single function call to create each plugin.


What is hard about the following?

    # project/meson.build
    project_plugin_dep = declare_dependency(
        include_directories: plugin_includes,
        compile_args: plugin_c_args
    )

    # plugin/meson.build, repeated for each plugin
    shared_module(
        'plugin',
        'plugin.c',
        dependencies: project_plugin_dep
    )
Without having seen what your user-defined function does, this is the best equivalent that I can come up with. If your plugins exist out of tree, then Meson can consume a pkg-config file. Postgres is slowly moving to Meson and uses out of tree plugins.


I also need to generate a file, pass it through a special compiler, pass the result through the system resource compiler, link it with the shared library.

There are a bunch of other steps. I have a YAML description of the plugin which I use to generate the plugin skeleton, and so on.

Do you really think it makes sense to repeat that again and again?

And then, what if I need to fix a bug in the process?


Don't bother arguing with Meson contributors. The typical response to realistic use cases like this is to gaslight you into thinking you're doing something wrong.


Not necessarily wrong, but you might have to try doing things differently and it might work or not. Meson is certainly less mature than CMake, on the other hand it's much harder to go down a rabbit hole of "Your scientists were so preoccupied with whether they could, they didn't stop to think if they should" and consequent technical debt.

I can see how that can be frustrating. My experience with QEMU's transition to meson is that it was clear from the beginning that Meson would not be able to do everything we needed, but it would let us focus on the few things that are special (e.g. building tests for dozens of targets using dozens of cross compilers) and simplify everything else.

It was not easy: it involved several contributions to meson itself (that everybody now benefits from, for example Postgres), and we do have to live with a couple limitations of the tool, but the outcome was extremely positive:

* 4,000 fewer lines of build system gunk

* A lot fewer requests for help even from experienced developers

* Several new features that required build system hacking could be implemented much more easily

* A transition that did not require any learning curve for existing developers, because trivial changes to the build system remained trivial

Of course there is still complexity, QEMU still has 11000 lines of build system code. But it was totally worth it.


I'm sure it is better than the previous system, otherwise QEMU would not be migrated to Meson.

But you're describing an 11,000-line program without the most basic abstraction mechanism known to man - user-defined functions.

What are the benefits of not having user defined functions? none, zilch, nada.

The Meson team is grasping at straws trying to claim that having user-defined functions would make it Turing-complete. Without recursion, it won't.

Try to imagine how you would structure those 11,000 lines of code if user-defined functions were available. Would you really write it without extensive use of functions?


It's 11000 lines (for 2.5 million lines of code or so) but it's really a long list of things to do. Functions to look for, libraries to use, configuration symbols to associate with a specific file.

The only somewhat repetitive idiom is

    foo = dependency('name', method: 'pkg-config', required: get_option('opt').disable_auto_if(condition))

to find a library, but otherwise there's very very little repetition of the kind that can be fixed with user-defined functions.

That's not surprising though. Because the user input is very constrained, I find it more natural to think of the build as "first prepare all these things, then use them to prepare all these other things" and so on, not as a complex program. As a very simple example, instead of abstracting "find flags for the C/C++ compiler" in a function and calling it twice, I prepare an array of languages and iterate on it.
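
As a hypothetical sketch of that idiom (the warning flags here are just examples):

    # Collect supported flags per language by iterating, instead of
    # wrapping the check in a function and calling it twice.
    flags = {}
    foreach lang : ['c', 'cpp']
      cc = meson.get_compiler(lang)
      flags += {lang: cc.get_supported_arguments('-Wformat=2', '-fno-common')}
    endforeach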

It seems limiting but really it's just different and this kind of data flow mindset makes it very clear what is happening when (e.g. what files or arrays are read and which are written), and it also does not need a lot of abstractions (functions).

What it needs is decent data structures, and Meson's DSL is superior to CMake in that respect. So basically the code you write matches the tools you have.

Now I am not saying that Meson is perfect. But the functions thing is kind of a red herring. My #1 complaint is the lack of include files (apart from going down to a subdirectory), not functions. #2 is messy handling of targets that generate files in subdirectories, and #3 is silly handling of some options.

The important thing is that none of these things, and not even functions for that matter, are baked in the language. There is room for fixing them in the future, because they are missing features or bugs rather than fundamental mismatches.

To be honest I believe that Meson should have used Starlark as the language, but it did not exist 10 years ago and anyway there is no Python implementation of it.


LOL

There is definitely a religious feeling to some of their arguments.


AFAIK this can be done with a mix of custom targets and adding the result to the input of the shared lib in some way.

What is it that you cannot do? I have used capnproto, generated C++ files and passed the result to the input of targets perfectly OK.


Of course I can do that. The point is that I want to do it many times over, in different places in my code.

What I want to do is to spend the time and effort in understanding and debugging the process, and then encapsulating it in a single function call. Then completely forget about the details and use that single function call.

What is the advantage in not having user-defined functions? None, zilch, nada.


Can you use a foreach loop?


I'm sure I can use any number of ugly workarounds.

In many large projects, the build system is necessarily large and complicated. It should be treated like any other code, and would massively benefit from the most basic tool for writing clean, readable code - user-defined functions.

Sadly, as much as Meson is tempting in other respects, without user-defined functions, it is unfit for human consumption.

The excuse Meson uses, that user-defined functions would make it Turing-complete and therefore not statically analyzable, is totally false.

Adding user-defined functions without recursion does not make the language Turing-complete. On the contrary, as long as there is no recursion, you can inline the functions and create an equivalent code without functions.


> it is unfit for human consumption.

Surely this is hyperbole since some of the largest open source projects consider it a viable build system. Systemd, Mesa, GStreamer, GNOME, Postgres.

Would I be correct in assuming that you are coming at this conversation having never tried Meson?


> Would I be correct in assuming that you are coming at this conversation having never tried Meson?

I tried to port the project I've previously mentioned here from CMake to Meson, but could not find a satisfactory way of doing it.

The lack of user-defined functions is a show stopper because I have a complicated multi-stage build process. It is also very different between platforms. Having to repeat parts of this process is disqualifying.

A programming language, and a build language is a programming language, without user-defined functions is unfit for human consumption.


QEMU also has a complicated multi stage build process, and using foreach was not a workaround. Each directory sets up what it wants to build in a dictionary, and then the main meson.build does all the steps.


So it's like having ad hoc, informally-specified, bug-ridden function calls, just without argument checking, type checking, localized error reporting, etc?


No, not really. It's just a different, data flow based approach.

The only thing I would use user-defined functions for is to structure the meson.build file in top-down manner, but each function would be called just once.


Can you use a script written in another programming language to generate your intermediate targets?


Sure, I can write a script to do that.

The script can examine the dependencies, see if any of them changed and then rebuild only if needed.

It's like Greenspun's tenth rule[1]:

> Any sufficiently complicated build script contains an ad hoc, informally-specified, bug-ridden, slow implementation of make.

But then, what do I need Meson for?

[1] https://en.wikipedia.org/wiki/Greenspun%27s_tenth_rule


That's not what I meant. Of course, if you wanted to roll your own build system, you could do that. But hopefully you're aware that's not what I'm suggesting.

Meson provides custom targets, so Meson itself would examine the dependencies and rebuild if needed:

https://mesonbuild.com/Custom-build-targets.html

Obviously this might not do what you need specifically, but it does deal with some of the use cases for user-defined functions.
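
For the generate-compile-resource pipeline described upthread, such a chain might look like this (a hypothetical sketch; 'specialc' and 'rescomp' are placeholders for the unnamed tools):

    specialc = find_program('specialc')   # the "special compiler"
    rescomp = find_program('rescomp')     # the system resource compiler

    # Generate a source file from the YAML description.
    gen_src = custom_target('plugin-gen',
        input: 'plugin.yaml',
        output: 'plugin_gen.c',
        command: [specialc, '@INPUT@', '-o', '@OUTPUT@'])

    # Feed the generated file to the resource compiler.
    gen_res = custom_target('plugin-res',
        input: gen_src,
        output: 'plugin.res',
        command: [rescomp, '@INPUT@', '@OUTPUT@'])

    # Link everything into the shared module; Meson reruns only
    # the stages whose inputs changed.
    shared_module('plugin', 'plugin.c', gen_src, gen_res)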


Of course I have custom build scripts that build a target from its dependencies. The point is that there are several such targets as part of building a single plugin. It is the build system that decides whether to run such a build script, and in what order, to build the final result, depending on which sources and intermediate targets were updated.

The entire point of having functions is to capture a "recipe" for building one type of target, and then be able to reuse it again and again.


Sounds like Meson isn't a very good match for your use case. That's too bad. Good thing there are other tools like CMake which are a better fit for you. It seems like you have also lost some time attempting to port your build over to Meson and failed, which was probably really frustrating. As a result, you really dislike Meson, going so far as to call it "unfit for human consumption".

Many other people like Meson, and many large projects have gone so far as to switch to using Meson. It's great that it worked for them, and it's OK that it didn't work for you.


> Sounds like Meson isn't a very good match for your use case.

That's exactly what I've been saying.

> As a result, you really dislike Meson, going so far as to call it "unfit for human consumption".

Meson got so many things right, but this one point makes it unusable for many projects, and prevents it from being industry standard.

Let me say it again, a programming language for writing large scale programs (11,000 lines is a large scale program) without functions/subroutines is unfit for human consumption.

It is made worse by giving a blatantly wrong rationale for not having user-defined function.

The only reason I even bothered spending so much time in responding is the hope that they will reverse this idiotic position.


I guess it wouldn’t help for me to recapitulate all of the arguments made by bonzini, who actually was involved in the 11,000 lines (down from 15,000 thanks to switching to Meson). He made his arguments very cogently and you apparently either don’t understand them or don’t care to listen.

Clearly there are people who think the position isn’t idiotic, and there are large projects which decided that Meson was a great choice and chose to migrate to it. This means that Meson is fit for human consumption.


Honestly, and I'm saying this with kindness, you are super hung up on Meson not having user-defined functions and their use of "Turing complete". Others have pointed out there are perfectly reasonable workarounds (external scripts and foreach loops aren't really that ugly), but you're just so miffed you refuse to consider them.

A lot of build systems won't support what you're asking here. Most don't do multi-stage builds, most don't have the concept of functions, many rely on external tools (autotools, pkg-config, etc). They're not "unfit for human consumption", which is pretty offensive to the people who chose Meson for their projects as well as Meson developers.

I get that you're offended by the situation. I might even agree with you in principle. But you're all over a Meson release thread, their 1.0 release thread even, shitting all over them, on Christmas Eve! Lighten up!


Does CMake have to generate the file and pass it through the special compiler? Can't it be done separately, before CMake is run?

One thing that I find make CMake configurations completely unreadable is to try to have it do everything, coffee included. Where often a script could be run separately first (e.g. to generate those files), and then CMake could run normally afterwards.


No, in most build systems it often can't be expressed precisely. It can be done in a way that will mostly work or be over approximated, though. So having some amount of abstraction capability is really important. Especially when you have a build pipeline involving something like "compile program G -> run G -> compile output of G", the output of G can often have many dependencies that are not statically known, or you have to repeat the logic of the generator program G inside the build system to declare them correctly, which is often error prone. So you normally just either have some incorrect logic where the build will fail -- because G started outputting code that needed "foobar.h" and someone forgot to update the dependency list to include "foobar.h", so now dependencies aren't tracked -- or some wildly conservative dependency graph that rebuilds things way more often than necessary, "the output of G depends on every header file possible." Now imagine that there are basically infinite variations on this theme and the cycle can continue infinitely, and it gets to be a problem.

Most people either work on programs small enough where having a conservative approximation of the dependency graph is "fine" and excessive rebuilds are OK, or they are just compiling a simple static DAG of known source files, and if they encounter these issues, they just structure the project with a little repetition between the build-time dependencies and the code. Or they just let the module system in their language of choice figure it out. These are all completely OK. But it's very expensive to do conservative recompilation and very error prone to do manual repetition in larger projects, and it gets very complex in multi-language setups.

You can sometimes structure programs to avoid these sorts of problems with enough foresight, but when you can't, or you need these things, you really need them, and the lack of it will really really hurt.
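
In Meson, for instance, the usual escape hatch for this is a depfile: the generator reports the dependencies it actually used, and the build system picks them up on the next run (a hypothetical sketch; gen_prog and its flags are placeholders):

    gen_prog = find_program('gen')    # the program 'G' from the example
    custom_target('g-output',
        input: 'main.template',
        output: 'generated.c',
        # 'G' writes a Make-style list of everything it read here, so
        # dependencies discovered at run time are still tracked.
        depfile: 'generated.c.d',
        command: [gen_prog, '@INPUT@', '-o', '@OUTPUT@',
                  '--depfile', '@DEPFILE@'])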


> […] and excessive rebuilds are OK.

Where does the presupposition that just some build tools, but not others (if I am reading it correctly), result in excessive rebuilds come from?

Apart from building a dependency DAG for a project, the next fundamental responsibility of a build tool is to watch for / detect changed graph nodes and apply the relevant build rules (expressed as graph edges) to rebuild only the necessary parts of the graph by traversing the path upward (if the apex node is up at the top). Naturally, if it is a dependency at the very bottom of a very leafy graph that has changed and upstream nodes have a dependency on it, it will trigger a large (or a very large) rebuild of the entire dependency path. But that is true of all build systems, not just select ones, and that is what we need build tools for.

The problem is that many people either can't or don't express dependency rules correctly, or dependencies generate build byproducts that are to be dynamically injected into the DAG and are not properly accounted for in the build rules, which results in an incorrectly built DAG at build time and excessive rebuilds.

In the ideal world, dependency management is relegated to the module system of a programming language. But since long-serving incumbents cough cough C, C++ and Java (not until Java 9 anyway) do not have a module system, people have created a kludge (make) and later multiple kludges (autoconf/automake, cmake, maven/ivy etc.) to band-aid the problem.

However, a generic build tool can't account for every permutation of all possible scenarios, all possible programming languages, linkers, pre- and post-processors and whatever else needs to be done to produce a build artefact, and excel at it in every possible use case. Some do it better than others in very specific scenarios but do poorly in other or fringe use cases. Then there are personal preferences. Therefore, the discourse over which build system is better than the other(s) will never end.


Modules don't expose file-level dependencies, which is what you want as a build system.


Yes, they do, and I can't think of an operating system (other than OS/400) where a module is not a single file or a collection of files. Could you please reference an implementation where it is done otherwise?

The module system is tasked with figuring out dependencies (i.e. files comprising a module), which is true for all programming languages starting from Standard ML through to Rust and anything in between.


I did not mean to sound like CMake is perfect for all projects. In some specific situations (typically huge projects), I can totally understand that it becomes a pain. That's why big companies like Google tend to have their own tool.

Now it really feels like most people complaining about CMake and using Meson instead are not in this situation. And if they were, it would seem more pragmatic to me to say "CMake is fine for small projects, but at some size we needed something else" than "Meson is obviously, definitely orders of magnitude better because it took me less time learning how to compile my 20-file project".


Okay, except your whole reply to OP started by replying to someone who is in this situation and does have this exact problem. Then you go and ask why their solution can't be simple, and when it's explained why, you say, "well, most people don't need this"? You're talking to people who do need this!

And this isn't something that ultra-massive billion line projects singularly suffer from. You can easily suffer from all these issues and more in small and big software projects alike. I've suffered from these problems plenty in FOSS and commercial software that were small and large. "Computer program that generates another computer program" is actually a pretty normal and fine way to build software. But when compilation is expensive, these things matter, and you can't keep making excuses about it.

A build system should accurately track dependencies and rebuild things. It's that simple. You can say that running two commands is OK or whatever, but don't try to imply that people who want those criteria -- accurate deps and rebuilding -- aren't trying to solve problems they actually have, and then ditch the example when it becomes inconvenient to you that they are.


Well my issue is that I have used CMake quite a lot, with generators, in non-trivial projects (both open source and proprietary), and I have always been fine with autotools, cmake and meson.

I don't try to have cmake do too much, e.g. I don't run the generation (or python scripts or custom functions) from cmake, because it quickly becomes a mess and running the steps manually / in a script is usually easy.

I haven't seen a project yet where the cmake config is not a mess (because let's be honest, most CMakeLists out there are pretty bad) and where this is clearly a limitation of cmake (and not of meson).

Happy to get an actual open source example where that's clearly the case, though.


The build system should encapsulate everything needed to build the software, ideally with a single standard command. If your build system requires you to manually run build steps before or after it, your build system is broken (cmake unfortunately also often falls down in this regard, especially for cross-compiled projects).


So it's fine for you to have to manually run a sequence of steps (start your computer, open a terminal, navigate to the project, run the build, run the program), but it's unbearable if the build is split in a couple commands?

I don't get it. Also shell scripts are pretty good at running a couple commands, if really that's important.


If you start using shell scripts, you are throwing away the whole advantage of build systems, which is only rebuilding what is necessary. Using multiple manual build phases (even if you have a shell script calling them all) also increases contributor friction and is terrible for FLOSS.


Well, I run the scripts only when necessary. If I change an IDL file, I run the generation script. If I change a cpp file, I run the cmake build only.

I actually have maintained FLOSS projects, and in my experience, throwing everything in one cmake project makes it harder for contributors to understand. Because they only ever use one command, and the day something fails they have to dig into it to understand what that custom command does. On the contrary, if they need to run the generation script first, then they know that there is a generation step involved, and suddenly the whole thing feels less like magic.

So that is my opinion: if your FLOSS project uses a big custom build system (because you customized your cmake to do all sorts of custom things), then probably I will not contribute. Because I have other things to do than learning your custom integration of standards steps I actually would already know.


Why do you feel it's better to do the build system's job manually? I've seen beginners interact with such a system and it's a recipe for even more confusion.


> Why do you feel it's better to do the build system's job manually?

What's better is what is easier to understand and maintain. If your whole project is made of one file and builds with one simple call to a compiler, IMO it's fine to just have this line in the README, no need for a custom_target in cmake.

Now the thing is that often, I see that people put a lot of importance into the "it should be easy to get started for beginners who don't know the tools", and everything is glued together using some custom functions. The goal being that a beginner can essentially run "make" (yep, I've seen Makefiles wrapping CMakeLists for that) and the project will build (sometimes it will even install dependencies on the system, that's crazy). But then if the beginner wants to understand what's happening under the hood, well... it's a big pile of custom glue.

Now if, instead of that, you document the tools you are using in your README, and use the tools in a standard way, then people who know the tools will get started quickly ("oh, it's protoc then cmake, I know exactly how to use that"). And yeah, beginners will need to learn how to run those tools. But that's fine IMO, because if you use the tools the standard way (without custom functions), then the beginner can just look for the official docs.

Standard is better than custom.


I don't want to have to find out about and remember a bunch of different commands, and I especially don't want the risk that I get into an inconsistent state by forgetting a step when something changes.


> Does CMake have to generate the file and pass it through the special compiler? Can't it be done separately, before CMake is run?

No, it cannot be done before CMake runs. It is a target like any other build target. It depends on other files or targets, and should be rerun if any of them change, and if it is rebuilt, there are other targets that depend on it and should be rebuilt, and so on.

The entire point of the build system is to concisely and accurately specify the build graph, and only rebuild what is necessary.


Ok, let's take a concrete example. Say you have proto files (seems likely in a big system). Whenever you change a proto file, you know that you should run protoc, and then run CMake. If you add a proto file, not only do you need to run protoc on this new file, but you also need to tell CMake about the new generated file(s).

For the big majority of projects out there, it seems to me that it is perfectly fine to handle that manually, because projects are not that big. Do we agree on that?

Then of course, if you are a GAFAM with a huge monorepo, probably you need more advanced tools (they don't build their own tools for nothing). My feeling, though, is that most people complaining that CMake is crap actually work on small projects and can't be bothered to run protoc manually "because their philosophy is that it should all work with one button".


No, I don't agree that having to manually run protoc after modifying a proto file is even slightly reasonable. That is exactly the sort of thing which the build system is there for, and the idea of not spending the few minutes it takes to automate that is baffling to me.
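
In Meson, for instance, that automation is a few lines (a hypothetical sketch; the file names are placeholders, and linking against protobuf is omitted for brevity):

    protoc = find_program('protoc')
    proto_gen = generator(protoc,
        output: ['@BASENAME@.pb.cc', '@BASENAME@.pb.h'],
        arguments: ['--proto_path=@CURRENT_SOURCE_DIR@',
                    '--cpp_out=@BUILD_DIR@', '@INPUT@'])
    # protoc reruns only when the .proto input changes.
    executable('app', 'main.cpp', proto_gen.process('msg.proto'))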


I automate it by putting it in a script that I call manually when needed, instead of spending days trying to have cmake run it.

Now it's not a pain for me, and apparently you're on the side of those who find cmake painful. Not sure how unreasonable my way is :)


> For the big majority of projects out there, it seems to me that it is perfectly fine to handle that manually, because projects are not that big. Do we agree on that?

Let's also agree that every time you change a .cpp file you compile it, and every time you change a .h file you compile all the affected .cpp files.

No we don't agree on that. That is the entire point of having a build system.

Even the proto files you gave as an example may import other proto files. When you change one of the imported files, you have to recompile all proto files that import that file, transitively.

I have, unfortunately, worked with build systems that do not capture all dependencies and systems that capture way too many dependencies. The former resulted in many hours spent debugging non existent problems, and the latter resulted in many hours spent waiting for compilation to finish.


> One thing that I find make CMake configurations completely unreadable is to try to have it do everything, coffee included.

and to me, build procedures that aren't completely implemented in cmake but require installing who knows what shell / platform / language / etc. in addition are the bane of my life, as they make everything harder and more brittle and less cross-platform.


I don't think it's a bad excuse at all. Limiting computational power as much as possible is a good thing as demonstrated by https://langsec.org/.


User-defined functions without recursion do not increase computational power.

And they definitely don't make a language Turing-complete.

They just make large code bases tractable for humans.


If you need custom build functions like that, neither Meson nor CMake are a good choice. If you have complicated build logic, you'd be better off with something more advanced, like scons, waf, bazel, ant, buck, etc.


Why? I do that in CMake.

God awful ugly abomination of a language, but it does the work, and is almost universally supported.


Same story here: complex build scripts in CMake, horrible language but at least it has some form(s) of user-defined abstraction. Tried to look into Meson, bumped into this issue, and decided against it for the same reason.

I kinda get the rationale that, with no abstraction primitives, common patterns get driven into the standard Meson distribution instead (an approach which has problems too, but at least it makes sense to me).

However, seeing function definitions rejected on the basis of Turing-completeness raised serious doubts about the project at the time, and things haven't changed much on this front, apparently [1].

[1]: https://mesonbuild.com/Syntax.html#userdefined-functions-and...


I am not sure what stops you from using a wrapper script (in Python, portable) and invoking it as a custom target. It works and it is better than much of the CMake juggling.

I think there are a lot of exaggerations around here. I have been a Meson user for long moving from CMake. CMake is quite more frustrating.


> I am not sure what stops you from using a wrapper script (in Python, portable) and invoking it as a custom target

Nothing, when applicable this works and we're doing it already, thank you.

But it doesn't as soon as one needs to abstract over CMake commands themselves (like add this target, set that target property, add this test and so on). Of course we could try to generate CMake files in wrapper scripts, but this is another level of CMake juggling entirely.

> CMake is quite more frustrating.

Could not agree more.

But even more frustrating than CMake programming is Meson checking almost all the boxes yet refusing to provide that one last bit of functionality that would make me switch to it (and giving a wrong justification for it).


You just made a much better argument for why to use Meson than the Meson guys themselves made.

At the end of the day, I don't want to have to know any build system let alone CMake. Period. Full stop.

I want my build system to be declarative. That way, when it breaks, and it will break, I can fix the dumb thing. (Here's an example of the kind of breakage that CMake has: I can't compile with "zig cc" and friends, because there's a space in that compile command and CMake has baked-in parsing. Whoops. Yeah, CMake is slowly fixing that, but it's going to take forever to roll out a fix for something that shouldn't be in scope of the build system.)

The fact that your project's build system needs complex features should be treated as a gigantic red klaxon to me, as a user, to stay far, far away from it. Needing to use complex features for build means that you need to reorganize your project and/or your "build system" needs to be broken apart into phases that are containing the complexity.

I am picking on CMake here, but this is a failure of any build system that tries to do "too much". Cargo, from Rust, has similar problems.


The ugly syntax is mostly a maintenance issue, but the real problem is that CMake doesn’t expose all of the internal build system “machinery” that you need to build something complex or performant.

Take waf for example: it’s effectively a Python framework for creating a custom build system, with every single internal feature accessible (and documented) with good design instead of abstractions. In waf, you can do things like:

* define custom build Tasks

* have those Tasks spawn other Tasks dynamically (e.g. by parsing the stdout of a compiler to find output files)

* Extend/modify the behavior of existing Task classes (e.g. the C/C++ linker)

* Modify the task scheduler to control execution order

* Implement commands that analyze your build scripts (e.g. to generate a compile_commands.json file, or anything else)

* cache any serializable Python types between build invocations

* create reusable project-agnostic build scripts

* wrap everything in a nice API so that build scripts are manageable

* rich debugging features

Not to mention the expressive syntax of Python and the full power of the standard library.

And all you need is a Python interpreter (2.7 and 3+ both work), as Waf has zero dependencies.

The learning curve is high, but it’s worth it for projects that have a lot of build complexity. In my case, I use CMake until a project needs custom functions, or needs to support multiple languages, then I switch to waf.


Just curious, could you give a concrete example of what waf provides that CMake doesn't?


Say your program requires some static data stored as XML files that link to each other (e.g. XLink), you need to convert them into a different in-memory representation, and then compile them into an object file for embedding into the binary, and maybe generate some header files to access the data. Maybe you even want to compress the data at build time, and link with zstd to decompress at runtime if the user passes a `--compress-data` flag when building.

In waf, it's possible to implement everything you need for that in a highly efficient way: the 'scanner' method that parses XML files to find linked resources to set up dependencies, the task that generates the header file with optional decompression API, etc. Because the task scheduler is flexible and programmable, you can implement it so that nothing runs unless it absolutely has to. Even the dependency scanner can cache data between builds so that it doesn't need to re-scan for dependencies unless any of the files it found in a previous invocation changed (hash, timestamp, etc).

For example: MyData.xml links to CoolStuff.xml; user modifies the whitespace of CoolStuff.xml, so MyData.xml needs to be rebuilt; the Task that converts it into a new representation runs, but since whitespace didn't affect the output of that task, the output file's hash is identical to what it was before, so the other tasks don't detect a change, and the build ends immediately.

In CMake, it would be impossible to implement something comparable. With custom functions/external scripts you can technically do anything, but there's a lot of inefficiency both in runtime and development effort. Since CMake doesn't actually perform builds itself, it's limited in what you can directly do, and the DSL isn't powerful enough to take advantage of more flexible CMake internals even if they existed.

But if you only need to compile C and C++ files with some light scripting, CMake is far superior for a lot of reasons.


Is it really a bad excuse, or is it a polite way to say "Why the hell do you need user-defined functions?". At least that's how I would say it myself.


The same reason you need functions anywhere else: to encapsulate complicated build logic and reuse it.


Sure, I just question what kind of complicated logic you want to put in there.


Big projects have complicated builds. Here is just one file out of dozens (or hundreds?) that make up Qt's build: https://code.qt.io/cgit/qt/qtbase.git/tree/cmake/QtModuleHel....


Sure. My point is that most projects are not big projects. And most people complaining about CMake are probably not working on big project either. Just look around in this thread: people don't say "I use Meson because I work in a GAFAM-sized project", but rather "Meson is soooo infinitely better because the syntax feels more natural to me".



2019 was the last time I looked into the CMake vs. Meson debate, and while Meson looked 100% saner, cleaner, etc., the network effects of most of the popular C++ libraries using CMake made me end up picking CMake. I'd be curious how the landscape is today, but I now have Stockholm syndrome with CMake, especially using vcpkg.


Meson integrates fairly well with CMake. The CMake support is maintained by a student, however, and said student has been busy for a bit, so some bug reports and feature requests have sat dormant. My needs are 100% met, and a ton of popular C++ libraries exist within Meson's WrapDB.


My experience with trying to use Meson's CMake integration is that it works just well enough to trick you into thinking that it's a good idea to use, but not well enough to make it actually a good idea.


Yeah I don't see the value proposition of Meson to be all that compelling. It's basically just CMake, but with a different DSL. I've never had a problem with modern CMake, and whatever minor productivity benefits there might be from Meson are far outweighed by the popularity of CMake.

Even if Meson is more "powerful", there are other build systems out there that are better options than either of these.


Perhaps you can shed some light on what the better build systems are?

Meson is much more than just a different DSL. What is the CMake standard option for toggling between library types? What is the CMake standard option for toggling lto? What is the CMake standard option for enabling sanitizers?

Meson standardizes so much behavior. I can go to any project and immediately be able to configure a build with the parameters that I want.
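
For example, these built-in options work the same way in any Meson project (the values here are just an illustration):

    meson setup build -Ddefault_library=static -Db_lto=true -Db_sanitize=address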


> What is the CMake standard option for toggling between library types?

-DBUILD_SHARED_LIBS=<ON,OFF>

> What is the CMake standard option for toggling lto?

-DCMAKE_INTERPROCEDURAL_OPTIMIZATION=<ON,OFF>

> What is the CMake standard option for enabling sanitizers?

There isn't one, sadly. Though, you can set CXXFLAGS to configure whatever behaviour you want.


> Perhaps you shed some light on what the better build systems are?

To clarify, I mean "better" for complex tasks. Build systems are just tools like any other, meaning they're good at some things, but bad at others. CMake (and Meson) are good for simpler build scenarios, but when you need complexity they're a poor choice. For that you're better off with scons, waf, bazel, ant, buck, gradle, etc.

> Meson is much more than just a different DSL. What is the CMake standard option for toggling between library types? What is the CMake standard option for toggling lto? What is the CMake standard option for enabling sanitizers?

> Meson standardizes so much behavior. I can go to any project and immediately be able to configure a build with the parameters that I want.

Again, my issue with Meson is that it doesn't bring anything substantially new to the table to justify it. Having a nicer DSL isn't enough, and having some built in toggles for common things isn't enough. The fact that you happen to be more comfortable with Meson doesn't make it objectively better than CMake. Figuring out how to enable LTO and sanitizers for a CMake project might take a few minutes of Googling at most.

A new dev who has never encountered either of these might instinctively prefer Meson's syntax, but the wiser decision is to learn the (vastly) more mature and prolific CMake, and then not have to worry about it. Time spent on the build system is time not spent on the project.


Don't know about the others but for LTO on CMake there is https://cmake.org/cmake/help/latest/prop_tgt/INTERPROCEDURAL...


> It's basically just CMake, but with a different DSL

... and loads of extra dogma about how projects should be structured that don't match reality, paired with a rude and dismissive BDFL maintainer.


Yep, I've had this problem bite me and I had to go with CMake instead.

I like Meson's python-like language much better, though.


I agree, and I'd use it over Cmake if it didn't artificially limit what I could do with it.


Popularity is a double edged sword. There's more stuff done with cmake but that means more legacy to support which can prevent innovation. Meson is able to build on the lessons learned from cmake.

Ultimately, it doesn't make sense to worry too much about picking one or the other though.


Did you use both extensively? Or are you just talking from an a priori opinion, by the looks of it?

I spent a ton of time adjusting toolchains and doing string handling, regex handling or command invocations with escapes, lists vs strings etc. in CMake.

Don't get me started on options vs set in CMake, and caching...

The waste of time was so massive and frustrating that I ended up moving to Meson.

It is working remarkably better.


I looked at these two, not just in 2019, and Meson looked much more difficult to use, and super confusing, to the person who needs to perform the build. In terms of writing CMake vs Meson: I've only written CMake, but I've tried to look into Meson files to understand how to tweak something, and it was hard for me.


I happen to have had the more neutral perspective of being a beginner at both.

Meson was easier to pick up. Not nearly as easy as I feel it should be (the idea of a meta build system when you target a single platform has always struck me as insane), but still easier than CMake.


> (the idea of a meta build system when you target a single platform has always struck me as insane)

who targets a single platform in 2022? I have honestly never in my professional life worked on non-embedded C++ projects that only targeted one platform (and honestly, even those had to run on host computers with people developing from any host OS, Mac / Win / Linux); on the other hand, a lot of the time people asked me to support working on the project from either Xcode, Visual Studio or other IDEs. How do you do that without a meta-build system?


> who targets a single platform in 2022 ?

Me, in the last 3 jobs I worked on. In fact, the majority of my career targets a single platform. I believe it's common for applications to target only one platform. Even when Windows is the first such platform. Heck, I've even read some advice about re-writing your application instead of trying to make it multi-platform.

Libraries are different, though. And that application you plan to rewrite 3-6 times should probably offload the entire business logic to a truly multi-platform library. So I guess meta-build systems could possibly make some sense there.

Still, what I've seen in practice is an extremely shallow layer of non-abstraction that just hides what actual commands happen under the hood. I know how to call a compiler on the command line, I know its inputs and output, and I can reuse this knowledge when I write a Makefile. CMake, not so much.

> on the other hand I had a lot of time people ask me to have support for working on the project from either Xcode, Visual Studio or others IDE. How do you do that without a meta-build system?

My feeling is that it's not the job of the (meta) build system to support the idiosyncrasies of the latest IDE (or editor) of the day. It's the job of the IDE to provide what users need to call the right build command for the right situation (build debug, build release, clean…). And for fancy stuff like auto-completion and jump to definition, the IDE should provide the necessary magic independently of the actual build system used.

If this independence between the build system and the IDE cannot be achieved, then maybe the language is the problem. I know, people are doing real work and have no time for long term plans. Just, the absence of a short term solution doesn't mean there is no problem.


Lots of places who develop desktop software will develop only on that platform for that target. It might be easier to just maintain a visual studio solution if you're only targeting windows and developing on windows.

Early in my career, I worked on a project that built for 4 platforms, not all x64, and was managed through msbuild, i.e. it just worked in visual studio.


CMake is also a meta build system, so I don't understand your point. Meson targets many platforms, some better than others.

* Visual Studio projects

* Ninja

* Xcode


Not the GP but I read their point not as pro-Meson but more that both Meson and CMake feel like overkill for a single platform/architecture project.


If you are the only contributor, targeting one compiler, on one platform, then yep, I could see how this might be overkill.


I don't understand, how does the popularity of CMake benefit you when using it as a build system? Because from my experience, they have gone out of their way to break any sort of network effects. Most recently they added the whole "wholly new target based CMake" stuff, and now the entire ecosystem is split into two, and declaring any CMake version except the exact one you are using as compatible is a full crap shoot. That's strike 2 after never integrating with pkgconfig and instead having you write shitty FindFoo files for everything.


> they have gone out of their way to break any sort of network effects. Most recently they added the whole "wholly new target based CMake" stuff

That was in 2014. If that was their most recent screw-up, they're doing pretty good.

> That's strike 2 after never integrating with pkgconfig and instead having you write shitty FindFoo files for everything.

There is pkgconfig integration [1], though most libraries autogenerate CMake config files these days.

[1]: https://cmake.org/cmake/help/latest/module/FindPkgConfig.htm...


It started in 2014 - it's ongoing today, it's a joke.

Yes, of course there is some magic macro, but I already declared the dependency, please just find it, I'm not here to put boilerplate into Find scripts.


In my experience, CMake seems to be more capable and easier to work with, both for the package author and for the person trying to build it. Also, more flexible in terms of how projects/packages depend on each other.

Now CMake is certainly not great; it has a lot of warts and baggage; and some functionality is _not_ easy to use, like installation (you need a good hour of explanation to make heads from tails with that, and then you're supposed to write 4 or 5 different CMake commands even for the simple use-case of "just install my damn plain-vanilla library"). But I'll choose it over meson any day of the week, and so would my users (i.e. the people who download and build my packages).


As someone who usually builds things I disagree. I think autotools is easiest followed by Meson and with the most annoying build system being CMake. On the other hand when I have to write build scripts I think Meson is best followed by CMake and then autotools. I feel Meson is a good compromise between maintainer needs and end user (the guy who build the project) needs.


Can you give any reasons for being "more capable and easier to work with?" I don't see how it is easier for users since as far as I am aware pkg-config files are an after-thought in CMake and options are all over the place (no equivalent of b_lto, default_libary, b_sanitize, etc.) so you have to learn about each project's non-standard equivalent options.

What is hard about the following for users?

    meson setup build <build options...>
    ninja -C build
    ninja -C build install


> What is hard about the following for users?

The ninja part is the same for CMake and for meson, but let's look at the meson command:

> meson setup build <build options...>

Except that when you try to specify build options, you will often fail, because some of them are "core options" and some of them are "base options" and some are "universal options", and their syntax is different. Sometimes. See, e.g.: https://stackoverflow.com/q/63549328/1593077


Oh you seem to talk as if the option/set CMake mess was not one hundred times messier.


It's only (somewhat) messy for the person writing the CMakeLists.txt, not for the person building. Which is what I was saying.


In Meson, neither side is messy... at least I don't know what you find difficult or frustrating. Options are options, and they are all typed: lists, booleans, numbers, strings, combos...


> (no equivalent of b_lto, default_libary, b_sanitize, etc.) so you have to learn about each project's non-standard equivalent options.

regarding this, I developed a wrapper for CMake a few years ago and never looked back: https://github.com/jcelerier/cninja


I really like automake and autoconf. It's not like I don't know the shortcomings, but with a routine (and good templates) it just works and works and works.


Autotools is the most portable build system... if you ignore the most popular desktop OS in the world.


I used to like autotools a lot and even contributed to them, but honestly since I started using Meson there has been no going back.

It's not perfect and I do have three-four pet peeves that I hate about Meson, but they are more than offset by the overall ease of use and a feature set that matches pretty much what I need from a build system.


1. How well is CUDA supported there, for example?

2. What about when your users are on Windows?

3. How easy is it to combine autotools projects?


1. Dunno, never used CUDA. Pretty sure you can hack something together, but I am the first to admit that it will be a bumpy ride if you start from 0.

2. Depends on what you are building. My users run WSL, so essentially Linux user space.

3. Are you asking about interdependencies? It works in principle, I would say; for an elaborate foolproof setup I would anyway use bazel or buck in such a situation.


> My users run WSL

Ok, what if they weren't? We don't all get to make such calls for our users. Especially if we're writing libraries, which should be usable in Windows software development as well. This is a significant consideration for me in preferring CMake over autotools.

> are you asking for interdependencies? It works in principle I would say, for an elaborate foolproof setup I would anyway use bazel or buck in such a situation.

Ok, so - while I'm certain that in many cases, autotools is fine, your answers clarify how CMake is a better "swiss army knife". It caters to multiple modes of composition and inter-dependency: build-install-then-find_packages; download dependencies and build during dependent project configuration step; download dependencies and build during dependent project build step; manage dependencies via a package manager (e.g. conan and probably vcpkg); etc.

Now, some of these require a bit of care to get right; but they're all tried-and-true. And again, this will work on both Unix-like and Windows-like platforms.


There is a bizarre subset of programmers that confuse having to manually specify input flags for each individually targeted compiler with a "more capable" build system. Fuck no, it's the opposite of why I wanted a build system in the first place.


1. I'm not part of that bizarre subset of programmers...

2. CMake generates build systems (or build system scripts), it's not a build system in itself. After this generation (called the CMake configure phase), you use the generated build script with the build system of choice (ninja, GNU Make, vcbuild, etc.)


I've been using Meson since I started a new job a few months ago, and so far I find it to be absolutely delightful, especially compared to cmake.

There is almost always only one way to do something, which makes it easy to learn, but sometimes frustrating to use if you're not doing it the "meson way".

I also like the .wrap file mechanism to manage dependencies. It's simple, yet powerful enough for our small team and so far we didn't have the need to use a more complex solution.


Every time I have to deal with meson-based projects, it takes an incredible amount of hacks to build a project if you want to build it in a slightly customized way or for a somewhat different platform. Meson is too smart and often tells you "no you can't do that", which can't be overridden without patching Meson's Python code itself.

I also find it a bit sad that we need Python to build something in the first place.


I'm using shell scripts for my small C projects. Works awesome. I dread the time when I'd need to deal with those overcomplicated build systems.


Here is all that you need for a small C project, say an executable.

    project('hello', 'c')
    executable('hello', 'hello.c')
Obviously use what you are comfortable with, but I don't see how this is overly complicated.


Now add something like clang with custom wasm flags there. Or Tigress for obfuscation. Launch a node.js app for testing after the build. A few obvious lines in a shell script; I guess a few weeks of learning the tools with other approaches.

My unprofessional approach to C so far was to avoid libraries, even most standard library and write very simple C code (avoid C++ as well). Simple C code compiles at tremendous speed. Something like 100k lines per second or even faster. Especially if you don't need optimization (for development cycles). So you don't need to think about incremental compilation and things like that and it makes everything dumb simple.
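For comparison, the usual Meson answer to the clang-with-custom-wasm-flags case would be a cross file rather than inline hacks. A rough, untested sketch; the flags and file name are placeholders:

    # wasm-cross.txt, used via: meson setup build --cross-file wasm-cross.txt
    [binaries]
    c = 'clang'

    [built-in options]
    c_args = ['--target=wasm32-unknown-unknown', '-nostdlib']
    c_link_args = ['--target=wasm32-unknown-unknown', '-Wl,--no-entry']

    [host_machine]
    system = 'unknown'
    cpu_family = 'wasm32'
    cpu = 'wasm32'
    endian = 'little'

The post-build node.js test would map to a test() entry pointing at find_program('node').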


I guess the point was: you do not actually need this for a small project, and for a larger project it would look dreadfully complex.


If you are the only contributor, target one compiler, and one platform, then sure.

> for a larger project it would look dreadfully complex

How so? How do you avoid complexity?


Why do many C/C++ projects have both Meson and Ninja commands in their install instructions? Shouldn't only one be necessary?


Meson looks at the system (finds compilers and dependencies, mostly) and prepares a build directory with a build.ninja file in it (and possibly other files).

Once the build directory is ready, you compile with ninja. There is a wrapper, "meson compile", but it's kind of pointless because it's longer to type and doesn't provide any extra functionality.
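So the typical two-step sequence you see in install instructions is (builddir is an arbitrary name):

    meson setup builddir
    ninja -C builddir        # or: meson compile -C builddir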


Meson generates Ninja files. Originally you were supposed to run ninja yourself after Meson, but modern Meson supports "meson compile", which calls ninja under the hood.


Yup, and Wikipedia says it very clearly too. Thanks!

> In contrast to Make, Ninja lacks features such as string manipulation, as Ninja build files are not meant to be written by hand. Instead, a "build generator" should be used to generate Ninja build files. Gyp, CMake, Meson, and gn[9] are popular build management software tools which support creating build files for Ninja.


meson compile will invoke the backend used.


How about SCons? I feel SCons is much simpler, even if it is less efficient in compilation speed. Please comment.


It's really a blessing that this project exists, was so well received, and has already replaced Autotools in many areas. That said, it has some learning curve for special cases. (NB: the fact that it depends on Python may also require workarounds in bootstrapping environments, e.g. compiling it to native code first, but normally it's not a big deal, as Python is available almost everywhere.)


Ok, I wasn't aware of an alternative reimplementation in C: https://git.sr.ht/~lattis/muon/tree/master - bootstrapping without Python shouldn't be a problem then.


Are there any stories/articles on why people chose Meson over, say, CMake (or others)? I do like Meson much more than CMake for sure, sane syntax and all.


For me, a little disadvantage of Meson is that it is implemented in Python. Given my interest in minimal systems and bootstrapping, I would hope for it to be reimplemented in something easier to bootstrap, or (maybe preferably) for the Python environment/version problems to get solved before wider adoption. I understand that this may not be as much of a problem for others.


For what it is worth, there is Muon, a third-party implementation of Meson written in C99 [1]. I haven't used it myself yet, but Muon has been in steady development over the last few years, and the developers claim to implement the vast majority of Meson's core features.

[1] https://sr.ht/~lattis/muon/


I was going to try replacing the KeePassXC build (one of the more complicated CMake builds I've gone hand-to-hand with) to see what that experience would be like, but they report that all the Qt flavors are unsupported: https://muon.build/releases/edge/docs/status.html

I'm aware I could just use vanilla Meson, but I was specifically interested in kicking the tires on the debugger in Muon, since CLion just recently added debugging for CMake and I wanted to compare and contrast.


Bootstrapping is something I care about a lot, too, and keeps me on GNU Autotools. Have you found any other options?


Muon[1] is a reimplementation of Meson in C. You should really read the blog post that you are commenting on.

1: https://sr.ht/~lattis/muon/


> muon analyze - a static analyzer for meson.build files. Capable of doing type inference, checking unused variables, undeclared variables, etc.

Whoa, that's fancy! That whole project seems neat, thanks for pointing it out


It even has a debugger and an auto formatter. Meson is getting an auto formatter in https://github.com/mesonbuild/meson/pull/10971.


> type inference

This means muon can fairly easily implement user-defined functions without making the language Turing complete. If they did this I would happily (re)consider switching from CMake.


You should read the blog post you are commenting on.


What do you mean? It is not a blog post; those are release notes, and they do not seem relevant to my comment.


I'll say that I use Meson primarily for the syntax. It's significantly easier to read, which leads to easier maintainability. For example, we started a new project and I wrote the build scripts in Meson. Then a team member had to come in and extend it, and they were able to add onto the build with basically no assistance. I've never had that experience with CMake.

I also really like wrapDB. It's made it easy to adopt some common libraries, like fmt.

I'll say, though, that the projects I've worked on that use Meson were relatively small, encroaching on medium size. For large projects with complicated builds, I would've chosen something like Bazel (and that's what our company has done).


Some years back I posted a four-article series on Medium: https://germandiagogomez.medium.com/getting-started-with-meson-build-system-and-c-83270f444bee


Thank you!


https://germandiagogomez.medium.com/getting-started-with-meson-build-system-and-c-83270f444bee


Thanks! I'll check it out!



I have no knowledge of CMake other than that of a very superficial user. Still, I think that due to critical mass, unless we have something truly ingenious, it is probably better to keep adding missing features to CMake than to create other tools from scratch.


I agree. I looked into Meson back when I had to use CMake, and it is definitely an order of magnitude saner... But it's still basically the same as CMake, and it lacks the enormous mindshare of CMake.

If I was going to use something that wasn't CMake I would use something that fixed lots of problems - Bazel or its ilk - not something that was just CMake but less fugly.


You are free to do that, but I am going to continue using Meson since it is so much better than CMake will ever be. The syntax is actually readable and familiar.


Played with Meson for one hour and bailed out; I will stay with Makefiles/CMake.

I especially like that CMake can convert pkg-config include paths from -I to -isystem.


Like "dependency(..., include_type: 'system')"?


Thanks. One hour obviously wasn't enough for me to figure this out.

Time is just tight, and it's hard for new things like Meson, or the rest of the 20 different build systems, to stand out unless they're really simple and quick to pick up, I guess.


To be honest I just googled "Meson dependency isystem" and any of the first few results would have given the answer (I used the third which was a bug report).

That said, using isystem for dependencies is a relatively advanced concept, not something I would do in the first hour of trying Meson.


I did not find it with a search prompt of "Meson dependency isystem". That said, searching for "isystem" in the Meson documentation does yield only three results, one of which mentions the include_type field. I don't know if the explanation given there is sufficient to know what to do (if you don't already know you're looking at the solution, that is).

Further down the results list I found a link to a GitHub issue that settles the matter rather quickly, but I don't consider having to resort to issues or bug reports a good way to find a solution.


If I had the option between going back in time and killing Hitler, or going back in time and preventing CMake from ever being written, I'd pick the latter every single time[1].

Same with Bluetooth, Qt and every other piece of janky garbage that works just well enough to survive, but badly enough that every single person on the planet hates it.

[1] This is obviously hyperbole, before anyone gets "HN literal" in the replies.


What's bad about Qt (besides not being able to write programs for it in anything besides C++ and, to a degree, Python)?


Qt specifically is bad because it's full of quirks and footguns. The licensing and distribution are also a bit of a pain.

On paper, it's great because it does everything and is cross-platform. In practice, though, including it with your product will introduce dozens upon dozens of completely preventable bugs and all kinds of bizarreness that will suck you away from solving the actual problems necessary to release your app.


Having a major project in Qt under my belt from 15 years ago, I can't disagree more. The project started with Qt 2 and at some point migrated to Qt 3 in half a day. KDE is written in Qt, and frankly, if Qt were not a good option, that experiment wouldn't have succeeded this far, given that the amount of commercial support is a fraction of what GNOME receives. (Yeah, I know: citation required.)

The quirks in Qt come from the fact that it was trying, and still tries, to implement a sane C++ dialect and build system (qmake) at a time when the standardization efforts around C++ and even the STL were not at today's level. For example, we now have Boost (and the standard library) for signals and slots. Do we still need moc? The same goes for collections, maps, etc.: when they were introduced, there was no standard/portable solution.

You can probably describe Qt as the C++ equivalent of the Java promise (write once, compile everywhere), with its own support for sockets, the filesystem, and whatever other abstraction you can imagine.

Qt was an impressive feat for its time and a joy to program for. It was much better than what the web was at the time, with its amalgam of languages and text-only interfaces. It took the web and flexbox until around 2015 to be able to implement the holy grail layout: https://en.wikipedia.org/wiki/Holy_grail_(web_design)


Any time I've needed something other than make, it's been a sign that I need to refactor.


You never really need something other than make, but I’d consider it borderline professional malpractice to try and wrangle medium-to-large size projects with make.


IDK all of the BSDs seem to be OK


I hate the world today. Every program needs its own build system. /s


Something that I learned from listening to Jonathan Blow about his new language Jai is that the implicit assumptions on which traditional C/C++ toolchains are built are horrendously out-of-date.

The whole concept of "discovering" which compiler you have, what your linker is, and then using a DSL to generate a script that triggers shell commands is more than a little bonkers if you stand back and look at it objectively.

It's common to see SDKs or APIs with 2-10 functions in them that weigh in at over 100 KB due to the overheads of having to deal with toolchains, platforms, and other sideshows that have no real contribution to the business at hand.

Jonathan has demonstrated recompiling, linking, and reloading an entire game engine live, which then lets you escape the tarpit of having to add non-compiled DSLs to everything as a workaround for slow compile times. If your compile times are measured in hundreds of milliseconds, scripting isn't needed any more! No more bytecodes, or JIT, or any of that. Just churn the code through a single process that outputs an executable directly and you can sidestep the need for entire industries.

This is why tools like CMake or Meson feel to me like they're solving the wrong problem. It's like selling a better horse when automobiles have been demonstrated by Henry Ford.


We've had fast dynamic recompilation since the 1970s. The problem that build systems solve is integrating multiple tools and software systems from across languages, runtimes, even operating systems and baking it down into a small set of commands for a programmer to invoke.


Jai doesn't use incremental compilation. It compiles the entire program fast enough that there is no need to bother with caching!

Languages that don't carry with them baggage from the 1960s can be compiled at millions of lines per second.

The issue lies in everything else surrounding the compilation process.

On top of that, C and C++ are horrifically inefficient to compile because of the way macros and headers work, resulting in a kind of "write amplification" where every kilobyte of code written by a human can expand into a hundred kilobytes or more that the compiler has to process.


Okay, but I’m not in the Jai beta, and I have to use the tools available to me…


Nim, D, Go, Rust, and several other languages in this space all offer a similarly integrated toolchain without the drawbacks of legacy build systems. And those are all free and available, unlike Jai.


This is mostly true, but it is also because there is only one implementation of each of these languages.

Go, for example, has a GCC-based compiler (gccgo) as well, and that compiler has very different options from the reference compiler written in Go.


That's not really true, I'm afraid.

Go and Rust have alternative compilers, as you point out; but what do "very different options" have to do with anything? The GGP's statement about "discovering" capabilities is, at least in my interpretation, about the behavior of language features under a given compiler (i.e. autoconf rules along the lines of "are we using that one buggy version of gcc that can't reliably use vectorized instructions for some common task? If so, drop in an alternative implementation of some common library function"), not about the options/flags of the compiler itself.

D has multiple supported compilers (https://dlang.org/download.html). The simplest one is simpler/more pleasant to use because of the lack of legacy baggage.

Nim is a bit of an odd one, in that there's one implementation of the language, but the most common compiler mode emits C code to be compiled on the toolchain of your choice, though the behavioral configuration of the toolchain is handled/defaulted well (at least, well enough I've never had to worry about a multi-stage build system) by the "nim" CLI utility.


> The whole concept of "discovering" which compiler you have, what your linker is, and then using a DSL to generate a script that triggers shell commands is more than a little bonkers if you stand back and look at it objectively.

This concept is still the only way to go if you port your application to different platforms and OSes. In the past, we had many more Unix variations, with varying compiler vendors and funky CPU architectures. Even today, good luck making your app compile seamlessly across the BSDs, various Linux distros (modern and old ones), and Windows.

I'll assume JBlow does all his development on and for Windows, so IMHO anything beyond a simple "make" would be overkill: you already know the platform, you cook the binaries, and you already know where your application will run. What saddens me is the "modern" attitude in development: "if you don't have meson X or cmake Y, fu* off, you don't get to compile my program", versus what was done in the past: "I'll be happy if you compile my program on your platform, and the configure script will make sure to adapt it. But send me config.log if something goes wrong. You don't need anything except a shell."
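For what it's worth, that "adapt to the platform" workflow didn't disappear; it just moved into the newer tools. A minimal sketch of the same kind of feature probe in Meson, with strlcpy as an arbitrary example:

    cc = meson.get_compiler('c')
    # classic autoconf-style probe: does libc provide strlcpy?
    if cc.has_function('strlcpy')
      add_project_arguments('-DHAVE_STRLCPY', language: 'c')
    endif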


So you assume that in a game company everyone has more or less the same level of C++ skills? What about designers who can program in a dynamic language like Lua?

No... the problem is not only compile times or reloading... it is the user profiles of those doing the scripting, who usually do not have a CS or solid programming background as such.



