Ray Tracing in pure CMake (64.github.io)
292 points by networked on Jan 9, 2021 | 166 comments



Oh no... so far the most terrifying title I've seen on HN this year. By the way, it's impressive how they even managed to multithread this.


Someone might be working on an even worse article: a list of things that raytracers haven't been written in (yet; regularly updated)



Or how about: an LLVM backend for CMake.


Calm down Satan.


You calm down, we haven't even brought in WebAssembly yet!


Next thing you know, someone will suggest HN with a hierarchy for recurring and unrelenting topics. Linux ports on home appliances, ray tracing, vi vs. emacs, launchd vs systemd, walled gardens, .... Oh no, say it isn't so!


Those darn Hackers on hackernews! :)


Hackers work on things which are not trivial, so certainly not a hacker


Darn those non-hackers who don’t work on non-trivial things! :)


those are called software engineers :)


(PS: would that eventually count as "raytracers, written in provocative blog post meme"?)


To pick from a recent HN thread: A raytracer has probably not been written yet using only printf. Who wants the glory?


Maybe that'll be my next project. Or instead someone should just write a compiler which outputs a printf format string, perhaps?


There are even Excel ray tracers out there


In fact that’s not even difficult, given how both a raytracer and a spreadsheet are basically functions that assign a value to every cell of a grid.


Well, if you can store data in icmp packets, I bet you can do math, too.

https://github.com/yarrick/pingfs


When someone manages to write a raytracer in COBOL I will be truly impressed.


Nothing magic there. It's a full programming language.

Especially if output is in a text based image format like PPM.

Since I don't know COBOL at all, I'm not the one to do that.


CMake is by far the worst language I have ever worked in.


Oh thank god, this thread is making me feel much better about myself.

I've been trying to integrate OpenCV's C++ library with my own C++ code (not involving Python), with the additional contrib modules (meaning I have to compile OpenCV myself), for multiple platforms, and I got stuck on CMake for so long it made me question if I was just a total idiot and should give up on programming forever.


to give another data point: I was given a project a couple of weeks ago that was using Visual Studio solutions and OpenCV + contrib modules - I fought for 3 hours with making the .vcxprojs work on my machine before porting the whole shit to cmake in 15 minutes.

Using OpenCV was just a matter of

    find_package(OpenCV 3.3.1 REQUIRED)
    target_include_directories(theApp PUBLIC ${OpenCV_INCLUDE_DIRS})    
    target_link_libraries(theApp PUBLIC ${OpenCV_LIBS})
and everything worked, so I'm curious about what failed for you


Thanks for your reply.

Will this include the contrib modules even for Android? And iOS?

Once I sat down and read the Cmake basics, yes, it was easy enough to get OpenCV working with basically the code you posted, except I couldn't find a way to include the contrib modules in an Android version, it would always complain Aruco wasn't there.

Until I went and compiled OpenCV myself, and that was the part that was a massive pain in my ass - CMake failing to include architecture information in the library files unless you specify it on the command line before running Cmake GUI, even tho it is an option in the GUI. Tho perhaps that's more of a failing of the OpenCV build script than Cmake itself?

I did just recompile OpenCV myself, so I did get there in the end and it's all working now, but I'm sure I'll be tweaking all parts of this project for months to come so if there's an easier/proper way, I'm happy to learn.

It also needs to be statically linked.

Note: I'm not a real programmer, I don't know enough to have even been able to get an entry level job in the industry yet, I'm an idiot that doesn't belong on this website. I'm just trying to learn as I go.


> Tho perhaps that's more of a failing of the OpenCV build script than Cmake itself?

That is exactly what that is. Cmake is extremely powerful, and it lets library authors create build configurations that are quite often too flexible. Combine that with decades of no standardized package manager for C or C++, and much Cmake evolution over that time, and you encounter many different ways to solve the same problems.

OpenCV is up there among the most complex Cmake projects. Only behemoths like VTK, ITK, and ROS beat it in complexity of the build.

Also, don't beat yourself up too much. :)


Humility is good, but it's important to acknowledge that CMake very often feels awful. I consider myself a competent developer and it's still pulling teeth trying to get what I want out of it.

Some things will get easier with time, and a lot of other things are going to suck shit forever. Keep on truckin'!


> Humility is good, but it's important to acknowledge that CMake very often feels awful.

Those times coincide with projects that fail to follow the "modern Cmake" style, based on build targets instead of pretending Cmake is a scripting language.

Keep in mind that modern Cmake style has been a thing since the release of Cmake 3.0 back in 2014.

If developers who don't know better end up writing messy, unmaintainable code, their code is an unmaintainable mess. That's not a result of their tech stack, but of their own work.
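
To illustrate, a minimal sketch of that target-based style (all names here are made up):

    add_library(mylib STATIC src/mylib.cpp)
    target_include_directories(mylib PUBLIC include)
    target_compile_features(mylib PUBLIC cxx_std_17)

    add_executable(app src/main.cpp)
    # Linking propagates mylib's PUBLIC include dirs and compile features to app.
    target_link_libraries(app PRIVATE mylib)

The point is that targets carry their own usage requirements, so consumers inherit them instead of everyone fiddling with global variables.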


Thanks!

To be honest I think a lot of my frustration was with the OpenCV build scripts rather than Cmake itself, but I don't love Cmake either.


> Once I sat down and read the Cmake basics, yes, it was easy enough to get OpenCV working with basically the code you posted, except I couldn't find a way to include the contrib modules in an Android version, it would always complain Aruco wasn't there.

If this worked on desktop, I would assume this to be a bug in the OpenCV cmake files, there's no reason for aruco not to be in OpenCV_libs if my understanding of the OpenCV build system is correct (but then I did not really have to fight it)


You may be correct. The way I understand it, those extra libs are only there in the desktop versions, not mobile, but it's definitely possible I misunderstood.

Still, even tho it was frustrating and felt like a waste of time, learning to compile a big project like OpenCV taught me a lot, and all or part of that will probably be useful in the future anyway :) even as a dot point on a resumé

It improved my shell scripting skills, and helped me understand how C++ linking works, what library files actually contain, etcetera

Whether cmake is good or bad, and whether it was necessary for my project or not, now I have successfully used Cmake to compile a big library!

I have to statically link it anyway, so compiling OpenCV separately and manually allows me to cut it down and make build changes etc that might be useful, as well as forces me to look at how it's structured and understand it better.

Thanks for your help!


Plus 1 for this. I have been using CMake with my students for a year (moving from qmake) and this in conjunction with vcpkg to install stuff has been a breeze. This book is great too: https://crascit.com/professional-cmake/


Speaking as the author of the post, I completely agree! I got burned so many times by random shit that made no sense. Stringly typed languages are the absolute worst.


Agreed. I will never understand how a group looked at overly-complex Makefiles, determined it was hard to write complex build scripts, and concluded: "I should write a Makefile meta-language that can generate these." That would be like looking at a complex Java codebase and concluding that the best solution to simplify it would be to write a Java meta-language for generating your classes...


> I will never understand how a group looked at overly-complex Makefiles, determined it was hard to write complex build scripts, and concluded: "I should write a Makefile meta-language that can generate these."

If that's your personal experience then it seems you do not have any relevant experience at all developing and maintaining software projects, especially if they need to be cross-platform and cross-tech-stack.

For example, it's clear to anyone with basic experience that providing an abstraction over C++ compilers, so that you can ensure your C++ project can and does build successfully regardless of C++ standard, compiler make and model, and even OS and OS distribution, is extremely valuable.

Can you show me a Makefile that handles that?

With Cmake you get the exact same C++ project building flawlessly on macOS with Xcode and clang, on Windows with any version of MS Visual Studio, and of course on any Linux distribution.

You just download the project, run Cmake to generate a makefile, and build.

How do you expect to even run a Makefile the same on all those combinations?

There are very good reasons why Cmake is the de facto standard.
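
To be concrete, the flow is the same everywhere; only the generator name differs (the -S/-B flags need CMake 3.13 or newer):

    cmake -S . -B build -G "Unix Makefiles"   # or -G Ninja, -G Xcode,
                                              # -G "Visual Studio 16 2019", ...
    cmake --build build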


I'll tell you exactly how: trying to maintain cross platform ITK builds on tons of bespoke HPC clusters and academic workstations using the existing build tools became a nightmare.

Writing makefiles is extremely repetitive, tedious, and error-prone: precisely the kind of thing to automate. Not just makefiles, mind you; cmake is designed to automate all sorts of build parameters.


I bet you never tried to write large scale Makefiles across several OSes, even POSIX ones.


If you only ever generate makefiles with cmake, then you probably don't need cmake.


I think modern cmake mostly has the right ideas, but the overall implementation is poor, mainly due to legacy.


> I think modern cmake mostly has the right ideas, but the overall implementation is poor, mainly due to legacy.

I fail to see how cmake's take on build targets is "poor". Can you provide an example that helps explain your point of view?


I refuse to even look at this. I've seen ray tracing in a PDF before but this is too far.


Let me tell you one detail: since cmake doesn't have floats, they use fixed point arithmetic. LOOK AT IT. You know you want to!

  # Newton's method in fixed point: guess = (x/guess + guess) / 2 each round.
  function(sqrt x res)
      div_by_2(${x} guess)                  # initial guess: x / 2

      foreach(counter RANGE 4)              # five iterations (RANGE 4 = 0..4)
          if(${guess} EQUAL 0)
              set("${res}" 0 PARENT_SCOPE)  # bail out before dividing by zero
              return()
          endif()

          div(${x} ${guess} tmp)            # tmp   = x / guess
          add(${tmp} ${guess} tmp)          # tmp   = tmp + guess
          div_by_2(${tmp} guess)            # guess = tmp / 2
      endforeach()

      set("${res}" "${guess}" PARENT_SCOPE) # "return" via the caller's scope
  endfunction()


IMO, this isn’t that impressive. Turns out that CMake has a scripting language that can be used to implement a ray tracer, and do that multi-threaded. The model being rendered also is hard-coded in the ray tracer.

I expected a monstrosity that had multiple targets for every pixel (“ray arriving at (x,y)”, “ray arriving at (x,y) after one bounce”, etc), with some magic to merge all the outputs of the last bounce into an image, say by generating a huge html table, with each cell having a different background color.


> The model being rendered also is hard-coded in the ray tracer.

This is true - but there's no reason why it couldn't be modified to read in a scene description. It's more a proof of concept than anything else.


Well well well, we finally found something CMake can actually do.

Please, everyone, stop using CMake. It doesn't do its one job.


The scripting language is bad and most cmake scripts are much more complex than they should be, but cmake does a lot of things under the hood that make multi- and cross-platform development easier (for example: being able to build with the Visual Studio compiler without running in the "Visual Studio Developer Command Prompt").

A lot of fancy new build tools don't even have proper Windows support.


I keep running into CMake based things not being portable, and being broken. And pretty much every single time cmake fails, it does not have ANY log AT ALL of what it did, and why it thinks libfoo is not there.

Take autotools. You get EXACTLY what it did, and you see why it failed. With CMake it just goes "you don't have X installed". But I do. It's right there. CMake refuses to say what command line or whatever it tried, and what the error was. Just "X IS NOT THERE!".

And then it depends on some version of cmake being installed, which it may not be.

I don't do windows coding, but shouldn't autotools be more viable now that Windows ships with bash?

But even if not, CMake stuff as a rule is not even portable between different flavors of Unix, so why even use CMake if it's only going to work on Ubuntu more recent than 18.04 or whatever? It's not a silly suggestion. OpenBSD generally just has raw Makefiles.


Could you explain how cmake stuff isn't portable or broken between different flavors of Unix? I comfortably maintain CMake builds that work on linux (with centos, ubuntu and arch), macos and windows without much trouble.

If you mean cmake versions, I'm not sure what you expect, should cmake just freeze in time and stop adding features so someone gets to use cmake from 5 years ago?

Also, the bit about autotools on windows is silly. If it can't build native windows stuff, it's not at all viable for windows.

I also dislike cmake (particularly dependency management) and wish for something better, but I think our criticisms should be well-founded.


I've never seen an autotools- or make-based project which builds out of the box on Windows in cmd.exe. Telling people to install cygwin just to build a project simply isn't viable when the default compiler toolchain on Windows is Visual Studio. Also it's not about building projects alone, cmake is also a project file generator for IDEs (in my case that was the actual reason why I switched to cmake from my own project file generator for Visual Studio and Xcode).


Interesting. Why do they fail with autotools?


Autoconf requires GNU m4 at build time and POSIX sh at runtime. It also requires all the standard UNIX command-line tools you would run from the shell. Automake requires Perl at build time and Make (preferably GNU Make) at runtime.

A stock Windows system doesn't have these tools. Cygwin and MinGW/MSYS provide them, but then you're using stuff which is non-standard for the platform and which, if you need to integrate with other libraries and tools, ends up being incompatible.

If you want to use MSVC, the Autoconf/Automake support is poor. Generating output other than Makefiles is possible, but limited and quite the undertaking.

CMake supports all these other use cases out of the box. Which is why it gets used. It works on every platform, and with every compiler, build system and IDE of note.


I don't know, I guess each project fails in a different way (my experience was mostly with libcurl before it came with cmake support), but IME projects which come with a Makefile or use autoconf don't care about non-UNIX-y operating systems, while (again: IME) a CMakeLists.txt file is often a sign that the project will also build on Windows with MSVC.


> But even if not, CMake stuff as a rule is not even portable between different flavors of Unix,

Not sure what your experience has been but I've had zero problems with cmake portability on nixes.

Raw makefiles are so much worse from a maintainability point of view.

I find it kinda fascinating to be honest. Two devs come to totally opposite conclusions.

Also to get more logging try this: https://stackoverflow.com/a/22803821/3988037


Everyone's experience is different, but that's very different from my memory of autotools - a Turing-complete macro processor which generates several thousand lines of pre-modern shell script with embedded C source code. When anything goes wrong, it's pretty much impossible to match it back to the original macro definition. Just so that one can hypothetically build the project on an ancient UNIX system so old that its shell doesn't even have functions.


While I've not had that problem, I can see that it's nonobvious. But it's been years since I last looked at the actual compiled output. I look at assembly output more often, and most people never even do that.

What I mean though is that ./configure outputs a config.log that says exactly what was done. Exactly what command line was run and exactly what the error was.

I guess I'm confused why you're even looking at the compiled scripts. Would you not look at logfiles and stderr output before you start looking at assembly?

CMake doesn't. The logs are completely useless, even when I get a CMake expert to come and agree, yes that's useless.

With autotools you can see that it failed to build because it couldn't find library foo when building with "gcc blah blah blah", and you go "well yeah, you need -L/opt/foo/lib", so you just add that. (or pkg-config equiv).

CMake, on the other hand, seems to force you to look at CMake "source code" (CMakeLists.txt), which is a horrible mess. I'm not saying autotools isn't (yay, m4 :-( ), but the point is you don't have to, because it actually logs what it does.

So this is one of the many many reasons CMake sucks.


> What I mean though is that ./configure outputs a config.log that says exactly what was done. Exactly what command line was run and exactly what the error was.

... so does CMake? You get a log of the errors in CMakeFiles/CMakeError.log, and you can trace what happens line by line in a very verbose way with cmake --trace (or cmake --trace-expand if you want variables to be expanded)


I just spent some time searching for m4 opinions on HN and it's more like a love-hate than hate-hate.

It looks like if you dive head first into learning autotools, then m4 is going to look like a nuisance.

But if you allocate time to learn m4 for what it is, instead of just something you have to put up with as part of your autotools adventure, then you'll hate m4.

m4 is the only (or one of very few) well-known general-purpose, language-agnostic macro processors I know of.


I don't hate m4. Could be worse, could be better.

I used to build websites using m4 back in the 90s. Back when static site generators were popular, before the latest slight resurgence.

To misquote Mitch Hedberg: People either love it or hate it, or they think it's ok.


correction: ... autotools adventure, then you might not hate m4.


It's difficult to match the original macro, but dead easy to match it in the shell script.

If you're just trying to get something to build as a user, it's actually quite easy to read the configure script and see why it's failing. The accompanying config.log is also quite detailed.

Autotools are not the best, but I always prefer building autotools packages over cmake. Worst case, I can modify the configure script directly.


I never managed to get CMake to work out-of-the box on Windows.

I always end up vendoring the dependencies, because it's too tedious to find the correct installed libraries in a cross-platform way.

autotools are great, but at the same time it's a monstrosity, and easy to misuse (you should never commit the configure script, because it's the autotools that are supposed to generate it according to the platform it's running on).

For C++ dev, I did take a look at buck[1], but it doesn't really support Windows platforms.

I wish we had something like cargo (IMHO, the tooling is one of the top reasons for Rust's success); in the meantime simple Makefiles do the job perfectly.

1 - https://buck.build/


You may want to use Bazel. Buck's Windows support at Facebook was fine (almost all the PC Oculus stuff is built with Buck) but I don't know what state the OSS version is in (maybe it technically has support, but the internal pieces that make it work well don't have OSS equivalents that are decoupled from other FB-internal things).


> autotools are great, but at the same time it's a monstrosity, and easy to misuse (you should never commit the configure script, because it's the autotools that are supposed to generate it according to the platform it's running on).

The configure script is not generated according to the platform it is running on. The whole reason the configure script is such a monstrosity is because it's supposed to be portable; it's written in the lowest common denominator of shell. You are supposed to distribute it. It's also fine to check in if you want end users building directly from version control rather than from tarballs. Just don't modify it by hand.


From my experience, `autoreconf -i` is responsible for generating the configure script from a configure.ac file, so I assumed some platform-specific logic was happening.

I always wrapped it up in an autogen.sh script that I did commit. Also, end users generally prefer prebuilt binaries; if someone wants to compile the software themselves, I expect them to have autotools installed, and mention it as a requirement in the README.


Autotools and the configure script come from a history of distributing software purely as source code instead of binaries. From the early GNU days.

https://www.gnu.org/prep/standards/standards.html#Managing-R...

The intention was that software would be released as source code "tarballs" containing a configure script, written in the lowest-common-denominator scripting language, to configure that source code to compile on the user's system. Additionally, by distributing the configure script itself instead of configure.ac, users tend to need fewer dependencies beyond what they already have.

It's _less_ applicable now that most users get pre-built binaries from package managers, and it just feels pretty antiquated overall. But yeah, the intention is that configure is platform-agnostic and prepares your source tree to build on the current platform, whereas autoconf/autoreconf is intended as a tool for the developer to make writing "configure" a lot easier.


I would say that best practice is to NOT commit the generated files, but DO include them in release tarballs. Because release tarballs are exactly what need to be portable to all your users.


> I keep running into CMake based things not being portable, and being broken. And pretty much every single time cmake fails, it does not have ANY log AT ALL of what it did, and why it thinks libfoo is not there.

Google "cmake --trace-expand"

Is that your main gripe with Cmake?


I agree that CMake is pretty awful, [0] but I'm not convinced that we'd be better off using its immature competitors, especially considering the fragmentation that would bring.

[0] https://news.ycombinator.com/item?id=24203172


There seems to be a kind of “network effect” here with CMake, considering how popular it is among C/C++ projects.

Meson is more or less equivalent in the functionality it provides. There is also Gradle, but it doesn’t seem to have gained much traction outside of Android development.

My personal favorite is Bazel (and others from the Blaze family of build systems). It uses a Python-based language for describing build rules, has extensive documentation, and has developers who respond to issues. Among the downsides is the lack of proper dependency management, which is critical for public, open source projects.

There is also Nix which takes it even further, but it requires much more involvement from the developer.


Waf is another interesting build system I've seen used in production. Its config is done with Python scripts, which is cool.


There's also Premake, [0][1] Buildout, Ninja, SCons, and no doubt there are others. qmake was recently deprecated.

It would take a lot to persuade me to move away from CMake and to use something 'non-standard' that few developers are familiar with:

* Excellent support for command-line builds on Unix

* Excellent support for Visual Studio

* Excellent cross-platform package-detection. (Acid test: Can it detect and link against OpenCL without me having to write my own cross-platform OpenCL-detection script?)

* Excellent documentation, including clear and consistent best practices and design patterns

* A sensible language, whether a scripting language or a declarative language. Must follow the principle of least astonishment and be relatively free of foot-guns. (CMake fails tragically on both counts.)

[0] https://premake.github.io/

[1] https://en.wikipedia.org/wiki/Premake


I can't speak for MSVC support, but to me CMake fails all the other bullet points, especially cross-platform package detection.

I haven't seen anything except autotools get this right.


> especially cross-platform package detection

I disagree here. CMake has working FindXyz modules for most major libraries. To write one of those modules is an exercise in soul-destroying tedium, but there's a pretty impressive body of existing modules out there, many of them officially bundled with CMake.
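
For instance, consuming zlib through the bundled FindZLIB module is two lines (theApp is a placeholder target):

    find_package(ZLIB REQUIRED)
    target_link_libraries(theApp PRIVATE ZLIB::ZLIB)  # imported target defined by FindZLIB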

When used correctly, both CMake and Autotools are capable of robust package-detection.

Regarding point 1: CMake works pretty well on Unix, but it's a pity there's a runtime dependency on the CMake package. Autotools is much better in that particular regard: just about any Unix system can run a configure script, as it's just a plain old shell script.

CMake certainly falls short on the final 2 points, as I rambled about at https://news.ycombinator.com/item?id=24203172

It's too late for me to edit my earlier comment, but here's another alternative to CMake for the list: Bazel.


> CMake has working FindXyz modules for most major libraries

Unless you have that library installed anywhere that is not /usr/include. Then you have to just hope and pray that there's some magic incantation that will make it find the right one (especially if your system-installed version is the wrong version and you'd really like to use the newer version you installed to $HOME).


Right, but it's always like this. If you're on Windows and you install Boost to C:\random\directory, it seems fair that CMake will require you to specify that directory manually.

The alternative is to have a global database, like the Windows registry or like pkg-config (which I have to admit I don't know much about). Perhaps CMake could have better support for pkg-config, I don't know.
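
For what it's worth, the usual escape hatch for non-standard locations is CMAKE_PREFIX_PATH: find_package searches the prefixes listed there before the default system paths (the paths here are just examples):

    cmake -DCMAKE_PREFIX_PATH="$HOME/opt/zlib;/opt/foo" ..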


I used Waf before I tried CMake, and I found the latter much easier to understand. The CMake language is a mess, but the compilation model is simple.


> Meson is more or less equivalent in the functionality it provides.

but meson needs everything to be quoted which is frankly a gigantic PITA - like, look at that, there's more quoted stuff than anything else: https://mesonbuild.com/Generating-sources.html


I've not used it, but that wouldn't bother me too much. It's far preferable to a language like Bash where you're permitted to omit quotes, but consequently it's a minefield to write scripts that can robustly handle strings containing whitespace (e.g. arbitrary file paths). The Ada philosophy has it right: readability and correctness far outweigh writeability.


> The Ada philosophy has it right: readability and correctness far outweigh writeability.

Given the popular success of Ada I'm not sure I would call that "having it right" except in very specific circumstances


There are various reasons Ada didn't take over the world, I'm not convinced that its emphasis on readability over writeability is one of them. Far more time is spent on maintaining code than on initial development, and anyway, typing speed isn't the bottleneck for writing new code.

Code review? You're in the business of reading code. Fixing bugs? You're in the business of reading code. Modifying/extending/porting an existing codebase? You're in the business of reading code.

If you're writing trivial single-use 'throwaway' code that needs no review, then readability doesn't much matter, but it's an important factor for just about all serious codebases.

Of course, there's also the question of whether Ada's specific syntax really succeeds in being more readable. It makes extensive use of English words in ways that are sometimes awkward and unnatural. I'm not a fan of its and then / or else short-circuit operators, for instance, and think they should just have gone with something like C's && / || operators. Neither syntax is self-explanatory to someone who doesn't know the language, but at least C's syntax doesn't counterintuitively collide with English.


I'm increasingly convinced that a script in a real programming language is the way forward.

The D compiler used to have a bunch of fairly flaky makefiles (different make vendors => pain), but now there's a shebang script written in D that not only does its job well but also handles args properly, and to top it all off is actually readable by people who aren't used to building software on Linux


What we could really use is some new declarative syntax that interoperates with existing cmake projects.

There is already talk of such a system and even some implementations. I am sure many people have thought to themselves "This is so complicated! So much legacy cruft! I bet I could build a way cleaner C/C++ build system!" only to run into all of the complexity of cross-platform C++.

I highly doubt there will be a cargo for c/c++, because it simply isn't rust.

https://gitlab.kitware.com/cmake/cmake/-/issues/19891

https://gist.github.com/stryku/4c69aa510711c9da6705fa4df4545...


I like declarative approaches, but there's such a long tail of weird things to support. I'm skeptical something declarative can do the job on its own.

Maybe some time after modules become available and then a known quantity.


The discussions mention this. Any such system would likely need to interoperate with existing build systems, and serve as the leaf nodes wherever possible.


QBS was such an attempt at declarative building for C++, and it looks sooo elegant at first glance; we all saw how well it worked out (it did not)


What was the issue with QBS? The syntax looks clean and concise. Seems like the main issue is lack of traction.


If you’re going to tell us what not to use then at least suggest an alternative.


autotools. You may not like it, but it does work.

If you like CMake more, that would be a good argument if CMake worked. But it doesn't. So in the choice between the thing that works, and the thing you may like more, please do go with the one that works.

E.g. this comment I just wrote: https://news.ycombinator.com/item?id=25702345


The problem is that more often than not, the autotools don't work.

They were the go-to tools for the portability problems of the mid-late 90s and early 2000s. Unfortunately, since then they have stagnated, and they are now stuck solving portability problems very few of us have. But they don't do much for contemporary portability issues, while CMake most certainly does.

Most recent problems I've encountered are that Autotools failed to build properly with MinGW and Cygwin. CMake worked perfectly.

Since the decline and obsolescence of proprietary UNIX platforms, the main purpose of the Autotools has been effectively removed. Today, the Autotools work on Linux, BSDs and MacOS, but with pretty much everything else having been long obsoleted, the amount of testing on other UNIX platforms is both minimal and completely pointless. Few people care. They are dead. But, they still don't support the most popular platforms of today: Windows, Android and iOS. But CMake does.


if CMake works for LLVM, Qt, KDE, ReactOS, OpenCV and all the other projects in vcpkg there's some chance it's working for whatever your project is


Yes, it could be that project after project after project using cmake is just "holding it wrong", and that's why it doesn't build on aarch64, openbsd, intel x32, MIPS, with some dependencies in nonstandard locations, etc… etc…

It could. But it's an incredible coincidence that it's always cmake projects.

Generally on nonstandard installs the others (e.g. scons) won't even run, so portability of the conf is not relevant.


> Yes, it could be that project after project after project using cmake is just "holding it wrong", and that's why it doesn't build on aarch64, openbsd, intel x32, MIPS

What are you talking about? Take for instance kcachegrind: it builds with cmake and works on every architecture that Debian supports (https://packages.debian.org/en/sid/devel/kcachegrind). I also never had issues with x32 when I used it, nor on FreeBSD. Can't say for OpenBSD, but I'd be surprised if it did not work there - this seems to indicate that other than a minor path issue things just work: https://www.sizeofvoid.org/posts/2020-03-29-how-to-build-qt5....


Again, project after project that doesn't work under those settings.

I'm not saying CMake itself doesn't work. Though I have run into both "CMake version is too old!" and "CMake version is too new!", which by its nature can't happen with autotools.


It can happen with the autotools.

The reason it hasn't been seen recently is that there has been no significant development of note for the past 15 years.

Even so, it's still common for the distributed config.sub and config.guess to be outdated. That's the price you pay for embedding stuff in each package, as opposed to requiring the build tool to be provided on the build system just like all the other tools.


> It can happen with the autotools.

Yes. Again squared: project after project etcetera.


I have mixed feelings about CMake. Non-trivial scripts look like a spaghetti ball, but it's an extremely powerful spaghetti ball.


From my perspective that's possibly a partially-good thing. Don't do non-trivial things if you can avoid it. Projects should stay on the path of consistent practices, follow standards of packaging, and try to follow conventions that the build system lays out. Stay with standard libraries, avoid custom tooling when possible, etc. If what you're doing requires a boatload of scripting in CMake, you need a good justification for it, and if that justification is there then writing the scripting for the build system is the least of your concerns.

But yes, it really is ugly once you get off the well-beaten track of "here's my source, now make a library out of it with a C compiler and a few standard options." I have a personal project I fiddle with that glues together Verilator, FuseSoC, CMake, a RISC-V compiler toolchain, and some custom scripting into Xilinx Vivado... and it terrifies me just a little.


I absolutely despise cmake. But autotools is even worse, so it’s what I end up using. I haven’t found another approach which handles my needs.


What are the needs?


> Well well well, we finally found something CMake can actually do.

You mean, besides Cmake being the absolute best build system for C and C++ available, right?

> Please, everyone, stop using CMake. It doesn't do its one job.

What are you talking about? Can you provide a single example where you show Cmake not working?


I'll agree CMake is pretty rough to get into, but what makes you say it doesn't do its one job?


It doesn't produce portable projects. If all you wanted was Linux (or often "Ubuntu compatible") then why not just have a raw Makefile?

It doesn't log what it attempted, or why it failed.

It can't find out how to link binaries (always awesome to set up some execsnoop (because no logs, remember) only to see that it tries to link with "-llibfoo". In what world could that be correct? And again it doesn't log how and why it chose that)

I could go on, but it's too much for a comment field.


What's wrong with CMake?


The language is extremely error prone. It's like they looked at Bash and thought "hmm, too safe".

Everything is a string; even lists are just semicolon-separated strings. The argument expansion rules for calling functions are literally impossible to remember. All variables are implicitly defined as empty strings. Typos silently break things. `if` is completely broken. The separation of the configure step from the build step is needlessly confusing and makes some things impossible. This is just scratching the surface.

It does have a couple of redeeming qualities: they have a pretty great versioning mechanism which allows them to make breaking changes. Though inexplicably they haven't used it to fix any fundamental flaws, only little tweaks. And most importantly it has been around for ages so loads of the many many weird C++ compilation knobs have been solved.
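
A few of these are easy to demonstrate in a throwaway CMakeLists.txt (all names invented):

    set(mylist a b c)           # "lists" are really just the string "a;b;c"
    message("${no_such_var}")   # undefined variables silently expand to ""

    set(enable_foo ON)
    if(enable_fo)               # typo: quietly tests a different, empty variable
        message("never printed")
    endif()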


>The separation of the configure step from the build step is needlessly confusing and makes some things impossible.

I'm confused. How do you propose CMake supports all those build generators without doing this?

(PS. I really don't think the syntax is that bad; aside from being jarring for C programmers, most complaints are overblown and you get used to it once you learn the rules. It might be interesting if someone made a CMake variant with a less obscure syntax, but I don't see much motivation for that to actually happen.)


In my view cmake doesn't really have much of a syntax to be learned, apart from the most basic variable-, list-, and function-handling.

Beyond that, the difficulty lies in the fact that calling any command lacks consistency, and the output is just that some global variable, following no convention, gets reassigned. There are no common semantics, so you cannot take the knowledge you learned from one operation and apply it to the next one; everything is unique and non-composable. The solution to almost everything is "just paste this random magic string anywhere in your CMakeLists and it will work".


The syntax has improved slightly in recent releases with the target family of commands. You just have to know what legacy parts to avoid, which is itself a burden. Still, it beats digging into the bowels of autotools.

I do fantasize about a new Cmake frontend implemented with Tcl scripting. That would be far more consistent and flexible, and you could disable much of the language by default to minimize foot-guns, then allow commands to be re-enabled for the times they're needed.


The target commands don't change the syntax at all. And your definition of "recently" is very loose! I've been doing CMake for 6 years and always used the target versions of commands.


I guess I just don't see where there are footguns and unsafety here. Build scripts are intended to operate on a known, fixed set of inputs.


There was some work ongoing to move to Lua internally and gradually move over to it, but it's a huge task and I've not kept up on recent progress with this.

This would of course be a huge improvement, but there are many details to get right to support the old and new scripts in parallel to effect a smooth transition. But in the long term, this will resolve many complaints about CMake and really improve the semantics of the language and development and debugging of scripts.


> How do you propose CMake supports all those build generators without doing this?

I propose that it doesn't support build generators. It should do the build itself, like QBS did.


This would remove one of the key selling points of CMake, the fact that it is "glue" to permit development with any build tool or IDE of choice.

It's great that one person can be using Visual Studio on Windows while another uses CLion on MacOS and another uses Eclipse or Emacs on Linux, all working on the same codebase. It's a huge benefit for cross-platform tooling-independent development.


> It's like they looked at Bash and thought "hmm, too safe".

One of my favorite comments in a while. :-)


The syntax is weird and inconsistent. Built-ins are case-insensitive, but user-defined functions are case-sensitive. There is no way to return values from functions, except by jumping to the scope of the caller and changing its variables unexpectedly. (Better hope you know what the caller does.)

That isn't a complete list.

Despite the above, cmake is your best choice of build system for C, though some competitors could overtake it.
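
For reference, that "jump to the caller's scope" workaround looks like this (double and answer are made-up names):

    function(double x out)
        math(EXPR result "${x} * 2")
        # The only way to "return": write into the caller's scope by name.
        set(${out} ${result} PARENT_SCOPE)
    endfunction()

    double(21 answer)
    message("${answer}")   # prints 42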


My least favorite part is how completely impossible it is to discover all the implicit variables set by other cmake files. There's no common convention, so some libraries are like ${ZLIB_INCLUDE_DIR} and some are ${zLib_INCLUDE_DIRS}, and you have to just manually try a bunch of combinations to finally find what works, because the scripts themselves use weird string-based metaprogramming to build the variable names.


On top of that, I found the documentation and the book terrible. I thought it might be easier to understand what it does by reading the source code (haven't tried yet).


The documentation is "complete" in the way that some manpages are - the biggest problem with it is the lack of best practices guidance. About books, I believe that good ones exist now. The official CMake book is not among the good ones - last time I checked, it was ridiculously outdated (and IMO not even very good at the time it was written).

I have extracted some knowledge from the source code. It's more underdesigned than overdesigned and I think that's the better side to err on. Kinda awkward but you can figure it out.


Which cmake book is good?


Professional cmake seems to be the one recommended now. The author updates it regularly.


Which competitors?

Any thoughts on why build tools are so hard to get right?


Build2 is one that a coworker of mine likes. There are others.

What makes it hard is that the problem is a lot deeper than most people realize, so most attempts are not powerful enough to be useful. They are clean until someone points out one of the many edge cases that were not thought of, and by the time you support even a few of them your clean design is a mess of ugliness.

Making it worse, build systems are an afterthought to most developers, so you end up with a mess just because the programmers are not taking care to write good clean code in the build system.


I still don't understand why cmake/autotools are seriously considered. If you have a tool that is hard to do complex things with (i.e., complex Makefile operations), the solution should _not_ be a tool that generates input for that tool (i.e., CMake/autotools generating makefiles). We shouldn't use a build system whose artifacts are scripts for a cruftier build system; we should replace the universal build system wholesale. Even the portability argument fails: if any modification to the build system requires installing the respective meta-tool (cmake/autotools), the generated Makefile is nearly useless for anyone trying to modify that codebase in any substantial way. We need tools that do the actual building, not Makefile meta-languages.


Well, first let me say that I agree that neither cmake nor autotools is perfect.

I am saying that cmake is fundamentally broken and unfit for purpose even in addition to what you mention. I've mentioned some in another comment here, but really why cmake should be thrown in the garbage is too long a rant to fit into a comment field.

I am VERY interested in hearing new ideas. If you have a truly better way, then I want to use it.

Though I'm not sure why you say generating code is so bad. Code gets compiled to machine language, and (especially with optimizations and all the new fancy instructions) that is not something most people usually look at anymore.

Yes, actual developers need automake/autoconf installed. Of a sufficiently recent version. But what exactly is your suggestion? Autotools at least makes this an issue only for the developers (and not even all of them, since they may not need to change things), not for the orders of magnitude more users.

But this is not unique to autotools. There may be other generated code. E.g. developers need to have the protobuf compiler installed, with all the plugins required. End users, even those who build, don't. Developers who don't modify those parts don't.

But here's the main thing though: Any portable build system, that already requires EVERY user to have that build system installed, is DoA for being a viable portable build system.

It's incredibly frustrating to download a package, only to find that the build system the author in their infinite wisdom chose to use, has not been ported to your platform (or if it has, you have to yak shave for a few hours to get it and its dependencies installed). So you can't compile the thing.

The beauty of autotools is that it "compiles" to a "virtual machine" (shell) that will truly run anywhere. Nowadays even on Windows, which now ships bash.

CMake too fails on this aspect, in addition to all the other aspects this box won't be able to fit.

But please, I truly mean that if you have a better way, then I do want it. It's incredibly hard to build something portable though. Hence the minimal dependency of "just a POSIX shell" for autotools.


>Any portable build system, that already requires EVERY user to have that build system installed, is DoA for being a viable portable build system.

Why? I don't see how it's different from requiring a user to install a compiler to build a particular language. Operating systems don't have every compiler installed by default. To fulfill your requirements, one would have to re-implement every build system and compiler in POSIX shell. Honestly the "better way" that I see a lot of projects going with is to just support multiple options for build systems, because there is no one perfect solution that is going to work on all platforms.

The elephant in the room here is windows, and at this current point in time, CMake is about as portable there as autotools because Visual Studio has built in support for it.


> Why? I don't see how it's different from requiring a user to install a compiler to build a particular language.

The various build systems I've seen, including CMake, are portability projects in themselves. Yes, to build a C++ program you need a C++ compiler. But if it's using CMake then you need CMake. Oh, but you don't have that. Ok, now you have to install that.

Oh, it requires Python X.Y (I'm not saying CMake does, but I've seen others that do, and CMake has other similar things)? Oh, this system doesn't have that. So now I need to build Python from source. Does Python have any dependencies, perhaps? (or worse, you have to use backports, which pull in 1000 dependencies so that now you have a frankensystem)

If you've ever been at the point where two steps down the dependency chain you have to build something as fundamental as Python, then you probably know that this is not a fun experience. And in fact you may run out of disk space, several gigs of source and object code later. (not every system is a 100TB server)

The whole thing about portability is that it should actually work even on systems the original developer does not have access to. If all you need to support is Ubuntu 18.04 (or newer) and Windows, then why not just have a Makefile and a MSVC project file? That would be much easier than CMake or autotools.

Oh, and I've also been hit by dependencies of the build system being too NEW. E.g. a CMakeLists.txt using features removed in newer versions.

And that's how autotools is different. The ONLY dependency is a POSIX shell. Do you have that? Then you can build.

The number of times I've seen CMake try to link with "-llibfoo" (it's "-lfoo") or be entirely confused about how I'm not running the exact same system the developer is, I can't even count. (e.g. failing to build because "this is not an amd64 system"… uh, yes it is, with zero logs about why it thinks that)

> The elephant in the room here is windows, and at this current point in time, CMake is about as portable there as autotools because Visual Studio has built in support for it.

Well that just means CMake has nothing at all going for it, if they are as portable.


> But if it's using CMake then you need CMake. Oh, but you don't have that. Ok, now you have to install that.

You don't; CMake is able to bootstrap from nothing, it only requires a C++ compiler

> The whole thing about portability is that it should actually work even on systems you do not have access to. If all you need to support is Ubuntu 18.04 (or newer) and Windows, then why not just have a Makefile and a MSVC project file? That would be much easier than CMake or autotools.

Having been in that exact place, it's definitely not true.


I've built cpython, it's about as inconvenient as building the compiler and runtime for any other major language. What OS ships binaries for gcc/clang but not python? I can't say I've seen any Linux distros that don't have it.

>Well that just means CMake has nothing at all going for it, if they are as portable.

Sorry, just to be clear, Visual Studio does not have built in support for autotools. (Unless they added it recently and I wasn't aware) But, there are a number of other less convenient ways to get that to work on Windows.


Usually distros are one or two versions behind what many developers consider to be minimum viable version. For python especially, but also compilers if you want bleeding edge c++20 support.

So then you need to either compile from src, or sideload install it and pray that cmake finds your new version instead of preferring the distro installed version.


I've built it too. That's not the point. It's the yak shaving of "I just want to compile this thing here" that turns into "what options do I need for building Python", and "how do I then get these third party Python modules with a custom install script to install into the correct Python path. And what exactly should I set PYTHONPATH and LD_LIBRARY_PATH to".

Keep in mind that I have to do all this just to build some C++ binary because someone decided to not use autotools.


Windows is not the only non-UNIX OS around.


> Any portable build system, that already requires EVERY user to have that build system installed, is DoA for being a viable portable build system.

I've voted this post down for this in particular. It is truly an absurd criticism, especially when you refute it yourself when you admit bash on Windows is a relatively new phenomenon (and I still struggle with bash env on Windows). If you use a build system, you need a build system installed, full stop. Cmake is pretty easy to bootstrap on a new host with minimal assumptions. Literally just `./bootstrap; make` on basically any 'nix.

Cmake is designed to run truly anywhere, with many backing generators, not just make. It doesn't even assume a posix shell iirc.

Sounds like many of your complaints are with the library authors not writing with portability in mind.


> Any portable build system, that already requires EVERY user to have that build system installed, is DoA for being a viable portable build system.

I don't understand the argument, it seems very circular.

Most software packages have a number of build dependencies as prerequisites for building, including a compiler, build tool like make/ninja, libraries and headers, other processors and tools like doxygen, sphinx, static analysis etc.

What makes the "build system generator" so unique in its requirements that this one tool must be embedded in the source package?

Nothing.

It's a convention adopted by the GNU Autotools initially for pragmatic reasons: it avoided requiring the end user to install m4, perl, and a few other tools, and in the early days the breakage between autoconf and automake minor versions was routine and really annoying. Today you can just run "autoreconf -fvi" for the same effect, and many packages have stopped distributing the generated scripts. Many distributions regenerate them as a matter of course, making the distributed copies redundant. Today, when we have fully-described and automatic installation of build dependencies on most platforms, the need for embedding no longer exists.

I'm afraid that I don't buy your argument. Don't mistake historical conventions established for a single tool with a hard requirement. The requirement doesn't actually exist.


Simultaneously love and hate this


High-five from the opengl-bash guy!

http://www.nrdvana.net/cmdlinegl/


Thanks! This is extremely impressive, and looks like far more effort haha


Btw, one of the things I came up with in bash was an “object oriented” pattern (antipattern?) where a “constructor” function would declare variables for “attributes” and functions for “methods”, like for example declaring an object named Vector1 whose fields are Vector1_x Vector1_y Vector1_z and methods like Vector1_Normalize and Vector1_SetMagnitude. The functions would be eval’d for each instance of the object during the constructor to reference the attributes and methods of that named object instance. Is this possible in cmake? (I don’t know cmake at all)


It sounds like it should be possible (CMake also has eval), but the lack of OOP isn't so much a problem for what I needed. Packing and unpacking the vectors was more annoying. I can live without OOP, I've written enough C to get used to it haha.
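
A rough, untested sketch of how your pattern might translate, with computed variable names standing in for attributes (all names invented):

    function(vector_new name x y z)
        # "Attributes" are just variables with a computed prefix.
        set(${name}_x ${x} PARENT_SCOPE)
        set(${name}_y ${y} PARENT_SCOPE)
        set(${name}_z ${z} PARENT_SCOPE)
    endfunction()

    function(vector_print name)
        # ${${name}_x} dereferences the computed variable name, no eval needed.
        message("${name} = (${${name}_x}, ${${name}_y}, ${${name}_z})")
    endfunction()

    vector_new(Vector1 1 2 3)
    vector_print(Vector1)   # prints: Vector1 = (1, 2, 3)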


Actually, thinking back, the primary reason I did that was that bash can’t dereference a variable by its name stored in another variable without an eval, and calling eval on each access of the variable was slow. By eval-ing the whole method ahead of time, I could access all the object attributes at runtime without any further evals. It looks like cmake has a getter/setter function that takes a name, so that wouldn’t be a problem here.


The OpenGL interpreter was a lot of effort, but the bash scripts that feed it were basically the same problem you worked out: how to implement 3d vector math in a language of integers, array variables, and functions without references or return values.


I must say that completing this article leaves me with a feeling not too dissimilar from that mixture of awe and disgust when I finished reading Uwa-Koi.


So far the funniest bug with cmake I've fixed was when we were accidentally setting a variable with the name '0' to the value '1'. Later on, in another module, we were checking if some other variable was equal to '0', which was interpreted by cmake as a variable reference, so the actual check was against '1'.
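
For the curious, it reproduces in a few lines, because if() dereferences any argument that happens to name a defined variable:

    set(0 1)        # yes, "0" is a legal variable name
    set(x 1)
    if(x EQUAL 0)   # "0" now names a defined variable, so it expands to 1
        message("x equals 0, apparently")   # this prints
    endif()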


> Conclusion

> If you made it this far, thanks for reading! Feel free to create issues, send pull requests or star the code on GitHub.

Done, fixed 3 bugs there: https://github.com/64/cmake-raytracer/pull/4 https://github.com/64/cmake-raytracer/pull/5 https://github.com/64/cmake-raytracer/pull/6 . Do I deserve a medal? Or punishment for feeding an abomination?


Thanks so much! I'll take a look at these when I get a chance


Thanks for merging all my PRs. :-)


Here's a performance figure on my i5-3210m (2 core, 4 threads) when using 4 processes:

* 64x64: 14.561s

* 256x256: 226.28s, 15x compared to 64x64.


> * 256x256: 226.28s, 15x compared to 64x64.

Basic ray tracers compute one ray for each pixel (without reflection/refraction), so this increase in time is expected: it's roughly 256²/64² = 4² = 16×. This is also due to the fact that each ray is computed independently.

I don't know if sophisticated ones can save some ray computations by interpolating.


I remember writing a ray tracer in 1994 in Turbo Pascal 7.0, running on a 286 under DOS 4.0. I did it with floating-point numbers, a pure mathematical implementation. When it came to usability, it was something like 1 frame every 30 seconds at a resolution of 640x480 with only 256 colors.

The problem with ray tracing isn't writing one, the problem is its usability: getting it the most realistic AND fast enough. Mine was very realistic for that resolution but the snail FPS was unbearable. Despite my best efforts at the time, the hardware was not ready. I remember the talk between us (a prof's assistant, basically a master's student, and the rest of us wannabe programmers): we were so sure it would take a maximum of 5 years for ray tracing and photo-realism to become an everyday commodity. We were wrong by only 20 years. Good times.


> we were so sure it would take a maximum of 5 years for ray tracing and photo-realism to become an everyday commodity. We were wrong by only 20 years. Good times.

Yeah that was a very long running joke about ray tracing: “real-time ray tracing is only five years away... and always will be”. ;)

> The problem with ray tracing isn't writing one, the problem is its usability: getting it the most realistic AND fast enough.

Times sure have changed eh? 640x480/30s is ~ 10k rays/sec. Now we have ray tracing GPUs that can do well over 10B rays/sec with full floating point. More than a million times faster, not even counting precision & scene size, crazy right?! I’m working in the field and still blown away by what people are doing in real time in the last few years, fully path traced scenes with global illumination that are hard to distinguish from photographs...


640x480 256 colours on a 286? With only 30 seconds for an image, even though it doesn't have a floating point unit? Sorry for being a doubting Thomas, but that seems very implausible (having spent my life writing ray tracers).


This is an ode to how much you can do with the four arithmetic operations and not much more.


Every day we stray further from god’s light


This is cool, someone should port this to bash, which also only supports integers.

Then I can use it as a benchmark for https://www.oilshell.org :)

Also bash can do the multicore part very easily with processes, like cmake.


Feel free to use my bash 3d geometry library

https://github.com/nrdvana/CmdlineGL/blob/master/share/lib-b...


> This is cool, someone should port this to bash, which also only supports integers.

With a little bitfield support (you could always call out to dc, but you can do bitwise arithmetic directly in bash) you can simply implement IEEE 754 in bash.


Yeah bash has 64 bit signed integers and bitwise ops on them. I'd be interested in seeing a floating point implementation. I think it would be pretty much identical to an implementation in C with signed long.

    echo $(( 2**63 | 1 ))
    -9223372036854775807


I think this is a demonstration of why CMake is bad.

Unless it's a general-purpose programming language, being Turing-complete should always be listed on the "Con" side. It literally means the effort needed to understand its complexity can be unbounded.


It's very hard to avoid Turing-completeness, and it's normally not particularly valuable to avoid it unless you have some extremely strong requirement for formal reasoning about your code. I've not found that difficulty of understanding correlates well with Turing-completeness in practice (see regular expressions for an example of a non-Turing-complete system which is famously difficult to understand).

And in the case of a build system specifically, if you don't give it some generally programmable interface then another wrapper layer will appear over it which does. Decisions in build systems are often complex enough to warrant a configuration language capable of expressing many different things. (This doesn't necessarily need to be an integral part of the build system: for example Rust's package manager/build system, cargo, has a fairly simple non-Turing-complete configuration file but can run a build.rs program which can, through a simple protocol of printing out values, adjust many of the decisions cargo makes.)

In fact, I'd say that CMake suffers most from not embracing that from the start. Its scripting language has clearly grown organically and messily out of a configuration language, instead of being designed from the start with scripting in mind (at which point it may have just been a better idea to use an existing language, like Lua, Tcl, or Python).


Cmake isn't the problem; it's a solution to the complexity of building huge C and/or C++ projects such as VTK so that they build and run on many different OSes, arches and platforms, with hundreds of different build options.

That is just what you get with legacy build systems.

Disclosure: I work at Kitware.


I think of CMake as a coin with two faces: on one side is a cool engine that manages dependencies and can somehow wrangle Make/xcodebuild/ninja et al. On the other side is a dreadful interface (presumably due to legacy constraints) that you have to gingerly manipulate in the hope that it will cause the proper state changes on the other side, so that it can generate the result you want... when I don't actually know what the internals of, say, xcodebuild are.

It's sort of like using Italian to correspond with someone when you know Latin but not Italian.

If the code base really looks like that it might be interesting to implement a different front end. Even XML would be better.

On the other hand I couldn't imagine working without it and would be happy if it supported more languages (I only use it with C and C++).


I think Skylark (the language for Buck/Bazel) represents an interesting compromise here that I like better, even though in the past I was a huge Cmake advocate.

The majority of rules are declarative and easily understood. Python (or at least a dialect extremely close to Python) is how you extend and write new rules. The two live in very different files: declarative rules are in BUCK/BUILD files, .bzl files contain the imperative definitions of rules, and various other conventions control how you deal with platforms (i.e. select rules).


Wow, the implementation doesn't even look that horrible! Fascinating, crazy project.


Please clarify for me the part about the set exec command. You create n workers but you pass the whole image size to each. Why isn’t the image size divided by the number of workers?


See the source code; it uses worker_index to find which part of the image it needs to process.


It figures out which rows to render using its own worker index


They were so preoccupied with whether or not they could, they didn't stop to think if they should.


Funny comment from somebody named turing_complete, innit? The fact that they "shouldn't" is what makes this abomination fun.


Can it run Crysis?


very good porno



