I've used autotools for >15 years now. My $0.02 is the following:
- autoconf is horrible but useful. It's bloated, slow, obtuse, and hard to use. But it encapsulates a portability layer which was useful 15 years ago... and less so today
- automake is horrible. A few simple rules in GNU Makefile syntax are good enough for the majority of projects. And it's easier to understand than automake
- libtool is horrible. It slows down every build by a factor of ~10. And you'd think that a program dedicated to building C programs would be written in C. You know, for speed. But no... it's written in shell. Horrible, horrible, shell. And it breaks the build. Pass "-L/foo -lbar"? It randomly converts it to an absolute path "/foo/libbar.so". Pass in an absolute path so that you can link against the LOCAL build directory for testing? Screw you... it re-writes that to a relative path, and links against the system libraries.
- libltdl should be shot. I got rid of it in my projects 5 years ago, and never looked back. Everything sane has dlopen() these days.
If you want a nice cross-platform replacement for these tools, look at what nginx has done. ~8K lines of code, largely Makefile fragments of ~50 lines each. Simple, clean, and well organized.
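The core idea fits in a page of shell. Here's a hypothetical sketch of that hand-rolled style (the epoll probe and the config.mk name are made up for illustration, not nginx's actual code): probe a feature by compiling a snippet, then emit a fragment for plain make to include.

```sh
#!/bin/sh
# Sketch of a hand-rolled, nginx-style configure: compile-test one
# feature and write the result into config.mk for the Makefile.
CC=${CC:-cc}

cat > conftest.c <<'EOF'
#include <sys/epoll.h>
int main(void) { return epoll_create1(0) < 0; }
EOF

if "$CC" -o conftest conftest.c 2>/dev/null; then
    echo 'CFLAGS += -DHAVE_EPOLL' > config.mk
else
    : > config.mk   # no epoll here; the Makefile falls back
fi
rm -f conftest conftest.c
```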
I'd like to jump in and endorse CMake, too, after spending some frustrating nights some years ago trying to add autogoo to a project.
It was so mindblowingly easy to get CMake to work--to detect/find paths for libraries, work on Linux and Windows, the whole nine yards--that on a project I had, I got the basic cross-platform functionality up in a matter of hours on my very first attempt.
And the makefiles it produces generate pretty colored output. :-)
Autogoo knowledge seems like it's about on par with a deep knowledge of Xlib--good for anyone maintaining a legacy project, I'm sure.
I've worked on projects using autotools, and worked on projects using CMake. While CMake does add Windows support without having to use MSYS or similar, I've found the syntax and language semantics of CMake incredibly painful to work with.
CMake feels very "stringly typed", even more so than shell. For instance, the definition of if conditions: 'True if the constant is 1, ON, YES, TRUE, Y, or a non-zero number. False if the constant is 0, OFF, NO, FALSE, N, IGNORE, NOTFOUND, the empty string, or ends in the suffix -NOTFOUND. Named boolean constants are case-insensitive. If the argument is not one of these constants, it is treated as a variable.'
CMake has variables, and something vaguely like dictionaries, but the syntax again seems very string-like, with a SET() function and the <name> PROPERTY <name> syntax.
And on the flip side, CMake retains many of the quoting and argument splitting problems from shell scripting, without the well-established solutions for those.
In general, CMake seems like an exercise in minimal syntax: "can we build a language with no operators or symbols?". The result feels wrong in the same way COBOL does.
If you can treat CMake as a completely declarative language, it more or less works. However, the moment you have to actually use it as a programming language, it becomes painful.
I'd rather have shell scripts than CMake, and that says a lot.
That's the thing: if (a specific distribution of) Linux and Windows are all you care about, then CMake does the job just fine. Autotools solves the problem of being portable to every conceivable Unix, including many Unices people don't use anymore, and including Unixified Windows.
Also, autotools solves the problem in a very specific way: it tests for features, not for versions.
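E.g., very roughly (a sketch of the idea in plain shell, not actual autoconf output):

```sh
# Version test (the fragile approach autotools avoids):
#   case "$(uname -s)" in Linux) have_strlcpy=no ;; ... esac
# Feature test (the autotools way): try to compile and link the
# actual function, and believe the result.
cat > conftest.c <<'EOF'
#include <string.h>
int main(void) { char b[8]; return (int) strlcpy(b, "hi", sizeof b); }
EOF
if cc -o conftest conftest.c 2>/dev/null; then
    echo '#define HAVE_STRLCPY 1' >> config.h
fi
rm -f conftest conftest.c
```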
> Autotools solves the problem of being portable to every conceivable Unix, including many Unices people don't use anymore, and including Unixified Windows.
"Portable to ultrix and dg/ux" is not much of a selling point.
That sounds like what I said--portability to every conceivable unix, including ones that people no longer use--that autotools is good for legacy support.
I'm trying to see what benefit there is to starting a project in 2016 and wrestling with autotools, just so that you also gain access to Unixes no one uses.
CMake seems more difficult for the person building the software.
I'm familiar with autoconf's configure options like --prefix and variables like CXXFLAGS, and if not I can see a reasonably short listing and explanation of them with `./configure --help`, and that list often includes other --enable-foo like options the project defined.
CMake, on the other hand, does not (at least for the projects I've run into?) honor --prefix or CXXFLAGS as arguments to cmake, and cmake --help doesn't show what variables it does honor. Through googling and cargo-culting I've come up with some more cryptic replacements - `-DCMAKE_INSTALL_PREFIX:PATH=$PREFIX`, `-DCMAKE_CXX_FLAGS="-lrt"`.
Now that I look at the `cmake --help` output again, I notice that in the middle of its long list of unhelpful output, there's mention of `--help-properties` and `--help-variables`, but the output is super long.
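For the record, here's the rough mapping I've ended up with (the prefix path below is just a placeholder):

```sh
# autotools:
./configure --prefix=/opt/foo CXXFLAGS="-O2 -g"

# CMake equivalent:
cmake -DCMAKE_INSTALL_PREFIX=/opt/foo -DCMAKE_CXX_FLAGS="-O2 -g" .

# list the cache variables a CMake project honors, with help text:
cmake -LH .
```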
I'm not a huge fan of autotools either; I tend to just write Makefiles directly (and not make any attempt at portability to Windows). It's actually not hard to correctly handle parallel builds, cross-platform support, and cross-compiling, just for Linux / OS X / BSD. It's a lot easier than fixing a CMake or autotools project that apparently messed something up in its configuration...
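For instance, assuming the Makefile only ever references $(CC) and $(CFLAGS) and expresses real dependencies, this is all a user needs:

```sh
# Parallel build (nproc is Linux; use `sysctl -n hw.ncpu` on BSD / OS X):
make -j"$(nproc)"

# Cross-compile by overriding the usual variables:
make CC=aarch64-linux-gnu-gcc CFLAGS="-O2"
```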
CMake does provide a UI of sorts for exploring the various options. You have to run `cmake .` once first to get the defaults generated, and then you can use `ccmake .` to get a UI.
I find its insistence on all caps everywhere to be a bit of an annoyance, though. But I guess it inherits that from make.
I don't come from active C/C++ development, though I learned it at university, and I had to deal with autotools just last week. I have to admit I was facing a steep learning curve. All I wanted to do was adjust some build parameters of an existing open-source project (like adding compiler flags, etc.), which has several subprojects and also invokes non-C++ compilers (Mono). Normally I'm an extremely fast learner, but this gave me something to chew on for a few hours. I was frustrated and gave up once I had accomplished 85% of what I originally wanted (because time didn't permit spending more on this). I don't mean to say these tools are hard to comprehend or master in general, but for folks like me coming from other software ecosystems it seems to be a tough bit. At least it was for me :)
This documentation and video is good if you want to use autotools. But in 2016, you should really be asking yourself if autotools is the right choice. It is rather baroque, and there are many other choices these days that provide similar functionality for less developer time and effort.
To elaborate on "baroque": autotools spends a lot of time and effort on detecting behaviours which (a) no new Unix system has exhibited for at least two decades, and (b) your code almost certainly isn't going to be able to handle anyway.
Autotools was great once, but the world has moved on. Write code which is POSIX compliant and skip the whole mess.
As a user building packages I like autotools because of its uniformity. If I want to change the install root I use "--prefix". If I want to crosscompile I can set "--build" and "--host". If I need to set a compiler flag, autotools actually observes CFLAGS.
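The same incantation works on virtually any autotools package (the paths and target triplets below are just examples):

```sh
./configure --prefix="$HOME/local" \
            --build=x86_64-pc-linux-gnu \
            --host=arm-linux-gnueabihf \
            CFLAGS="-O2 -pipe"
make && make install
```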
A few years ago I was trying to install a python extension and I couldn't figure out how to set a cflag on the native code it was compiling. It ignored CC, CFLAGS, etc. The documentation said nothing about it, super frustrating.
CMake is a million times better than autotools and respects prefix-setting, cross compiling etc in standard ways as well. Uniformity is not unique to autotools.
CPython extension building, however, sucks. setup.py is a bit of an abomination... It's really that every language community decides to reinvent the wheel for its own language. By now we could've had standards, and standard software, for package distribution and installation, instead of having pip, go get, npm, and a thousand others.
cmake is very nice. I used it in my last big c++ project. The cmake language is ugly but I think it's a law or something that build tools use ugly languages.
One advantage of autotools is that it distributes everything as a portable shell script, so the user doesn't have to have autoconf etc. installed. That was an advantage when giving tarballs to users to run on crazy cluster environments, for example. Anything that calls itself Unix has to have a Bourne shell. Oh well, probably not that big of an advantage these days.
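Still, the whole appeal fits in three lines, with nothing beyond sh and make needed on the target machine (tarball name made up):

```sh
tar xzf foo-1.0.tar.gz
cd foo-1.0
./configure && make && make install
```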
I was really complaining more about dunno-works-on-ubuntu GNU Makefiles and 90s-scripting-language extension builds.
CMake actually does platform probing (test compiling with code fragments to identify platform features and quirks) in a similar manner to autotools. But it is much easier to work with, and significantly better documented. I find the syntax shouty and verbose but it does the job, and has proven its mettle in many large, complex cross-platform projects.
As for CPython extensions, I can relate also. Though cffi makes some of it redundant, and Boost::Python is brilliant for wrapping C++ projects.
It's a "bigger problem". New languages bring new build systems and there really isn't much you can do about that.
However, what could be standardized is package distribution. I shouldn't have to have 10 different package managers for 10 different languages, each of them with different ways of expressing essentially the same metadata, etc.
As a language developer, I shouldn't be expected to create my own version of a package manager, with download, local / remote search, versioning, vcs support, upgrades, hooks, and a million other things. Package managers are complex beasts.
It's a bit like if every javascript project was expected to create its own http server. Except it's not http, it's a weird custom protocol they invented just for the sake of it. Naaaaasty.
Arch Linux's "pacman" is exactly what you're looking for in those regards (to some degree). The only problem is that using pacman repositories instead of the language builtin ones means you're going to have issues with global installation and so on and so forth.
But you raise some really good points. I have to ponder this further.
Of course! pacman is one of the best package managers out there (and the best I've personally used). But its developers sadly don't share in the vision of using pacman outside Arch Linux. It's a wonder it works on msys2.
It's a point I tried to raise in the past but without much success. If you want to start a project to fix this, email me (especially if it's centered around pacman - I'm an arch TU).
I don't think you can skip the whole mess, though. Firstly, systems will have bugs, and standards will have holes, so you will have to check for specific behavior (in the case of POSIX, there's the added question of "what POSIX?").
The way forward, I think, is to periodically change the definition of what one can expect a system to have, and, after that, adjust configure scripts to assume that level is present.
Problem with that is that maintaining configure scripts for software isn't fun, so those for less used software will rot, anyways.
Also, that does nothing about the problem that building X requires several different versions of language L, plus language M, plus obscure language N, each of which basically gets used as a batch language, but got used because the respective developers were most comfortable with it.
Fixing that requires someone to spend time standardizing the build system across thousands of packages. That won't happen because it isn't fun, and not _that_ annoying for _that_ many people. Also, having fixed it, the probability of getting all upstream packages to accept your choices is zero.
And even if one were to do all of that, the problem that building Firefox requires libtiff, while Firefox doesn't handle TIFF images likely would remain.
Fixing that would require someone to properly engineer both the dependencies and the boundaries across thousands of open source packages. That won't happen, either. Extremely annoying issues such as (almost) circular dependencies (e.g. product P embeds a scripting language S, which uses library L, whose build system requires S) will be corrected, but nobody will either have the will to properly refactor all packages, the endurance to keep doing it, or the power to enforce it.
All of the supposed replacements usually have a better interface but way worse functionality. Even just simple things like setting the installation prefix and finding shared libraries don't work as well or are left out. I do a lot of distro packaging work, and the GNU build system is by far the easiest to deal with.
What the Autotools could benefit greatly from is a new UI. We can all agree that M4 sucks big time.
I'm honestly curious why anyone still uses these tools.
I've heard the 'alternatives do not work as well' argument before, but no one has ever managed to articulate to me exactly what it is that premake/cmake lack.
Is it literally just building debian packages in a moderately convenient manner?
Please don't use waf. It does not provide a stable API from version to version, and encourages projects to embed a binary compiled version of waf. Unlike autotools, where you can ship configure.ac and Makefile.am and expect developers to run autoreconf after obtaining the project from version control, you can't easily do the same thing with waf due to the lack of versioning.
See the compiled version linked from the waf homepage, which projects using waf include in their source tree. That compiled version consists of a small Python stub followed by bz2-compressed data and a signature.
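With autotools, by contrast, the flow from a fresh checkout is stable across tool versions (the repository URL here is hypothetical):

```sh
git clone https://example.org/project.git
cd project
autoreconf -i        # regenerates configure from configure.ac / Makefile.am
./configure && make
```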
I'm pretty baffled that people recommend CMake so often. I looked into it, and found:
* The worst scripting language ever, seemingly made by someone without even basic theoretical knowledge of language parsing. It's even worse than shell scripting.
* The same I-don't-care copypasta culture most autotools users seem to follow, but the free documentation was even worse (or at least back when I tried it). There's a book which I didn't have access to, though.
* Less enduser/packager friendly, with worse help texts, documentation, and features (seems to have caught up somewhat). Plus, everybody already knows how configure scripts work.
* The portability to Windows in reality means lots of if(WIN32) branches.
* The results are often brittle. This is also true of autotools in practice, but at least there is some info out there about how to write autoconf macros correctly. Even the .cmake files that ship with CMake seemed to be thrown together carelessly the last time I looked.
* Also, CMakeLists.txt must be the worst filename for a script I've ever seen. I'm only half serious on that one. I know it doesn't matter, but it bothers me terribly. It's ugly and misleading and makes me question the author's sense of aesthetics and basic competence.
CMake is terrible. Autotools is also terrible, but at least it's got an excuse. Just learn your damn tools. The real problem with autoconf is that nobody wants to invest any time learning about their build system and instead just copy-pastes it around, creating a brittle mess. From the CMake projects I've seen, it doesn't really solve this problem at all, creating a similar mess. And now your users have to install CMake and figure out how it works, and it doesn't even print a useful --help text.
I appreciate your opinion. I've been looking into CMake this evening, and my initial impression was that it does not make the problem smaller, just different. The things I searched for also didn't yield many high-quality answers. I think I'll stick with learning autotools well enough, as that might come in more handy in general.
Managing separate build files for each platform sucks, especially on big projects. That's where CMake is useful to me. It simplifies a lot for our company, but to each his own.
Agree that CMakeLists.txt is a terrible build script name though haha
Yes, I am just starting something small and new, and am now jumping off the cliff of learning C, gcc, make, autotools, and gnulib-tool, of course all at the same time. Talk about a heap overflow ;)
Is it correct that CMake is a replacement for GNU Make? So far, GNU Make has not posed a problem. Can CMake assume some roles of the other tools as well? I am looking to have this interesting ride a little less bumpy.
CMake does not take the place of GNU Make. It just generates your makefiles (or project files) for you. It handles all the hard parts: you just tell it which files, libraries, and options you want, and it generates everything for you.
CMake and GNU Make have only the word 'Make' in common. CMake is an automatic build tool, while GNU Make is just the GNU version of the plain old make tool. CMake is there to prepare a proper Makefile for GNU Make to consume.
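A typical session, to make the division of labor concrete:

```sh
mkdir build && cd build   # out-of-source build directory
cmake ..                  # CMake generates the Makefiles
make                      # GNU Make consumes what CMake generated
```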
Whenever this gets posted, comments seem to focus on what to use going forward. It seems obvious for most projects there are better alternatives.
However, whenever I see autotools documentation, I think of only one thing: it's worth learning how this old system works because, even if you never use it for your own projects, so much code written by others over the past decades requires autoconf, automake, often libtool, and sometimes pkg-config.
For me, understanding how these old hacks work is very important in getting a large majority of open source software projects to compile after I make modifications, e.g., removing code.
Oftentimes I think that people who use autotools do not truly understand _how it works_, they have only figured out _how to use it_. This is reasonable but the problem is that autotools is very brittle, and if it breaks they have little idea how to fix it.
CMake is just another automatic build tool like the Autotools, but without the Bourne shell requirement, and one which easily produces Xcode / Visual Studio project files, like QMake does. So as far as CMake is concerned, it's not particularly more useful or time-saving than the Autotools, really.