
- it has absolutely minimal requirements on the target system - it requires only a POSIX shell

- the macro language it uses (m4) is strictly standardized, so it produces identical semantics on all platforms. And because the macros are expanded into shell code, the macro interpreter does not need to be present on the target system either - only the generated script does.
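To illustrate (the package name and version are hypothetical, the macros are standard autoconf): a tiny configure.ac is expanded by autoconf on the maintainer's machine into a self-contained configure script, and the target machine needs nothing but sh to run it:

    # configure.ac -- processed by autoconf (m4) on the maintainer's machine only
    AC_INIT([hello], [1.0])           # hypothetical package name/version
    AC_PROG_CC                        # expands into shell code that probes for a C compiler
    AC_CONFIG_HEADERS([config.h])
    AC_OUTPUT

    # maintainer: autoconf        -> produces a plain POSIX 'configure' script
    # target:     ./configure     -> needs only /bin/sh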

- the generated configure script is included in the source distribution and as such "frozen", so changes or incompatible upgrades of autoconf itself cannot break existing builds.

- the philosophy of autoconf is based on feature testing: for example, whether a given compiler version produces code with a certain property, or whether a library named "foo" contains a function named "bar", and so on. This is the right way. It also helps with migrations between versions of large libraries, like Qt.
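For example, the "does library foo contain function bar" check above is a single standard macro, and compiler properties can be probed by compiling a small test program ("foo" and "bar" are the hypothetical names from above):

    # If libfoo exports bar(), this defines HAVE_LIBFOO and adds -lfoo to LIBS.
    AC_CHECK_LIB([foo], [bar])

    # Probe a compiler/libc property by actually compiling a test program.
    AC_COMPILE_IFELSE(
      [AC_LANG_PROGRAM([[#include <stdint.h>]], [[uint64_t x = 0; (void)x;]])],
      [AC_DEFINE([HAVE_UINT64_T], [1], [Define if uint64_t is available.])])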

- testing for features individually is the only way to escape a nuclear-scale combinatorial explosion of feature combinations.

- feature testing also supports evolving software: a new function may be buggy at first and need a workaround, then get fixed, then perhaps migrate into a dedicated library, then perhaps into Boost, and later you may need a specific version of Boost, and so on. This suits the bazaar-style development that is characteristic of Unix, with its open interchange and borrowing of ideas.
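A sketch of that workflow, using strlcpy as an example of a function that may or may not be present on a given system:

    # configure.ac: defines HAVE_STRLCPY in config.h when the libc provides it.
    AC_CHECK_FUNCS([strlcpy])

    # In the C sources, the workaround stays local and disappears once unneeded:
    #   #ifndef HAVE_STRLCPY
    #   /* use a bundled replacement */
    #   #endif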

- as already mentioned, it is very well documented. People who just copy-paste existing code are doing it wrong.

- it is understandable, without too much magic built into it. cmake suffers from an excess of magic IMO - it is hard to understand what is going on, and consequently hard to fix a failing script. In my experience, coming up to speed with autoconf took a fraction of the time needed to learn even part of cmake.

- This also means that it is easier to maintain over the medium to long term. It is nuts to use a constantly changing language precisely for infrastructure - some poor people will have to read and maintain all that code years later! (And those people could even be you!)

- You only need to support the platforms and features you actually need, while still being able to support a myriad of exotic platforms if you want to. If you want your software to work only on 64-bit Ubuntu, a very small configure.ac suffices; if you want to support older libraries on Debian, you can do that easily as well; and if you need to support an exotic architecture with 36-bit words or a 16-bit embedded MCU, you can do that too.
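The exotic cases cost nothing unless you opt in; for word size or endianness you would add checks such as these standard macros:

    AC_CHECK_SIZEOF([int])    # defines SIZEOF_INT for word-size assumptions
    AC_C_BIGENDIAN            # defines WORDS_BIGENDIAN on big-endian targets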

- It is so widely used that almost every Unix developer knows how to build software with it, without much thinking.

- It tries to do one thing and do it well: platform-specific build configuration. It does not try to do dependency or package management, leaving that to dedicated package managers.

- and because of the previous point, it also works well with new distributions and package managers, like Arch or NixOS or GNU Guix - it does not interfere with them, and does not fight to be the top dog of the packaging system.

- Both autoconf and make, while supporting C and C++, are not language-specific, so you can easily use them to distribute things like LaTeX documentation, pictures, or even fonts.
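For instance, with automake a Makefile.am can install and distribute arbitrary data files; the file names here are made up:

    # Makefile.am -- ship documentation and a font alongside the program
    dist_doc_DATA     = manual.pdf diagram.png
    dist_pkgdata_DATA = myfont.ttf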

- you can build and use a mix of shared and static libraries, and also use both pkg-config and standard paths (or, for example, paths provided by GNU Guix) - and you can choose per package. In contrast, cmake's find_package has difficulties with mixing different retrieval mechanisms.
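A sketch of mixing both mechanisms for a single dependency (using zlib as an example; PKG_CHECK_MODULES comes from pkg-config's pkg.m4):

    # Prefer pkg-config metadata, fall back to the standard search paths.
    PKG_CHECK_MODULES([ZLIB], [zlib >= 1.2],
      [],                                   # found: ZLIB_CFLAGS / ZLIB_LIBS are set
      [AC_CHECK_LIB([z], [inflate],
         [ZLIB_LIBS=-lz],
         [AC_MSG_ERROR([zlib not found])])])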

- Supporting all kinds of hardware and compilers will continue to be important. While some platforms are without question obsolete, autoconf will be able to support new hardware architectures in the future, like CUDA on GPUs, or ARM and RISC architectures.

- It works for any library without requiring library-specific upstream code. This is in stark contrast to cmake, which often needs dedicated find_package modules or target definitions for specific libraries. Ultimately this means cmake would have to become a meta-distribution that ships specific support for every package one wants to use. I think that as software complexity continues to grow, this approach will probably not scale.



