This documentation and video are good if you want to use autotools. But in 2016, you should really be asking yourself whether autotools is the right choice. It is rather baroque, and there are many other choices these days that provide similar functionality for less developer time and effort.
To elaborate on "baroque": autotools spends a lot of time and effort on detecting behaviours which (a) no new Unix system has exhibited for at least two decades, and (b) your code almost certainly isn't going to be able to handle anyway.
Autotools was great once, but the world has moved on. Write code which is POSIX compliant and skip the whole mess.
As a user building packages I like autotools because of its uniformity. If I want to change the install root, I use "--prefix". If I want to cross-compile, I can set "--build" and "--host". If I need to set a compiler flag, autotools actually observes CFLAGS.
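Concretely, that uniform interface looks roughly like this (the package and paths here are just placeholders, but the flags are the same for any autoconf-generated configure script):

    # install somewhere other than the default /usr/local, with my own compiler flags
    ./configure --prefix=$HOME/opt/foo CFLAGS="-O2 -g"
    make && make install

    # cross-compile for an ARM target
    ./configure --build=x86_64-pc-linux-gnu --host=arm-linux-gnueabihf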
A few years ago I was trying to install a Python extension and I couldn't figure out how to set a compiler flag on the native code it was compiling. It ignored CC, CFLAGS, etc. The documentation said nothing about it; super frustrating.
CMake is a million times better than autotools and respects prefix setting, cross-compiling, etc. in standard ways as well. Uniformity is not unique to autotools.
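For comparison, the CMake spellings of the same things are something like this (paths and the toolchain file name are placeholders; CMAKE_INSTALL_PREFIX, CMAKE_C_FLAGS and CMAKE_TOOLCHAIN_FILE are the standard variables):

    # out-of-source build with a custom prefix and compiler flags
    cmake -DCMAKE_INSTALL_PREFIX=$HOME/opt/foo -DCMAKE_C_FLAGS="-O2 -g" ..
    make && make install

    # cross-compiling goes through a toolchain file instead of --build/--host
    cmake -DCMAKE_TOOLCHAIN_FILE=arm-linux-toolchain.cmake ..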
CPython extension building, however, sucks. setup.py is a bit of an abomination... It's really that every language decides to reinvent the wheel for itself. By now we could've had standards, and standard software, for package distribution and installation, instead of having pip, go get, npm and a thousand others.
cmake is very nice. I used it in my last big C++ project. The CMake language is ugly, but I think it's a law or something that build tools use ugly languages.
One advantage of autotools is that it distributes everything as a portable shell script, so the user doesn't have to have autoconf, etc. installed. That was an advantage when giving tarballs to users to run on crazy cluster environments, for example. Anything that calls itself Unix has to have a Bourne shell. Oh well, probably not that big of an advantage these days.
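That is, the person on the cluster node only needs a shell, make and a compiler, never autoconf or automake themselves (foo-1.0 is just a placeholder):

    tar xf foo-1.0.tar.gz
    cd foo-1.0
    ./configure --prefix=$HOME/sw   # configure is plain Bourne shell
    make && make install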
I was really complaining more about dunno-works-on-Ubuntu GNU Makefiles and 90s-scripting-language extension builds.
CMake actually does platform probing (test compiling with code fragments to identify platform features and quirks) in a similar manner to autotools. But it is much easier to work with, and significantly better documented. I find the syntax shouty and verbose but it does the job, and has proven its mettle in many large, complex cross-platform projects.
As for CPython extensions, I can relate as well. Though cffi makes some of it redundant, and Boost::Python is brilliant for wrapping C++ projects.
It's a "bigger problem". New languages bring new build systems and there really isn't much you can do about that.
However, what could be standardized is package distribution. I shouldn't have to have 10 different package managers for 10 different languages, each of them with different ways of expressing essentially the same metadata, etc.
As a language developer, I shouldn't be expected to create my own version of a package manager, with download, local / remote search, versioning, vcs support, upgrades, hooks, and a million other things. Package managers are complex beasts.
It's a bit like if every javascript project was expected to create its own http server. Except it's not http, it's a weird custom protocol they invented just for the sake of it. Naaaaasty.
Arch Linux's "pacman" is exactly what you're looking for in those regards (to some degree). The only problem is that using pacman repositories instead of the language builtin ones means you're going to have issues with global installation and so on and so forth.
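To make the uniformity point concrete: with pacman the same commands and the same metadata cover packages regardless of the implementation language (the package names are just examples from the Arch repos):

    pacman -S python-requests haskell-pandoc   # install, whatever the language
    pacman -Qi python-requests                 # same query/metadata interface
    pacman -Qo /usr/bin/pandoc                 # which package owns this file?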
But you raise some really good points. I have to ponder this further.
Of course! pacman is one of the best package managers out there (and the best I've personally used). But its developers sadly don't share in the vision of using pacman outside Arch Linux. It's a wonder it works on msys2.
It's a point I tried to raise in the past but without much success. If you want to start a project to fix this, email me (especially if it's centered around pacman - I'm an Arch TU).
I don't think you can skip the whole mess, though. Systems will have bugs, and standards will have holes, so you will still have to check for specific behavior (and in the case of POSIX, there's the added question of "which POSIX?").
The way forward, I think, is to periodically change the definition of what one can expect a system to have, and, after that, adjust configure scripts to assume that level is present.
The problem with that is that maintaining configure scripts isn't fun, so those for less-used software will rot anyway.
Also, that does nothing about the problem that building X requires several different versions of language L, plus language M, plus obscure language N, each of which basically gets used as a batch language, and got picked only because its respective developers were most comfortable with it.
Fixing that requires someone to spend time standardizing the build system across thousands of packages. That won't happen because it isn't fun, and not _that_ annoying for _that_ many people. Also, having fixed it, the probability of getting all upstream packages to accept your choices is zero.
And even if one were to do all of that, the problem that building Firefox requires libtiff, while Firefox doesn't handle TIFF images likely would remain.
Fixing that would require someone to properly engineer both the dependencies and the boundaries across thousands of open source packages. That won't happen, either. Extremely annoying issues such as (almost) circular dependencies (e.g. product P embeds a scripting language S, which uses library L, whose build system requires S) will be corrected, but nobody will either have the will to properly refactor all packages, the endurance to keep doing it, or the power to enforce it.
All of the supposed replacements usually have a better interface but way worse functionality. Even just simple things like setting the installation prefix and finding shared libraries don't work as well or are left out. I do a lot of distro packaging work, and the GNU build system is by far the easiest to deal with.
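As an example of what "just works": practically every autotools package supports the staged-install idiom distro packaging relies on (the paths and the $pkgdir variable are illustrative; DESTDIR is the standard GNU convention):

    ./configure --prefix=/usr --sysconfdir=/etc --localstatedir=/var
    make
    make DESTDIR="$pkgdir" install   # stage into the package root, not the live system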
What the Autotools could benefit greatly from is a new UI. We can all agree that M4 sucks big time.
I'm honestly curious why anyone still uses these tools.
I've heard the 'alternatives do not work as well' argument before, but no one has ever managed to articulate to me exactly what it is that premake/CMake lack.
Is it literally just building Debian packages in a moderately convenient manner?
Please don't use waf. It does not provide a stable API from version to version, and encourages projects to embed a binary compiled version of waf. Unlike autotools, where you can ship configure.ac and Makefile.am and expect developers to run autoreconf after obtaining the project from version control, you can't easily do the same thing with waf due to the lack of versioning.
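The autotools-from-version-control workflow in question is just this (the repository URL is a placeholder):

    git clone https://example.org/project.git
    cd project
    autoreconf -fi     # regenerate configure, Makefile.in, etc. from configure.ac / Makefile.am
    ./configure && make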
See the compiled version linked from the waf homepage, which projects using waf include in their source tree. That compiled version consists of a small Python stub followed by bz2-compressed data and a signature.