For very large projects the build system can easily become the bottleneck even when changing a single file, which happens to be the most important use case for developers. In extreme cases a no-op build with make can take 15+ seconds.
> In extreme cases a no-op build with make can easily get to 15+ seconds.
I have never seen cases so extreme, but my opinion on the matter is that this is a "build smell". If the Makefile has to resolve a DAG that large, it means developers have to worry about compile- or link-time interactions at that scale as well. 100k source files all linked into a single executable is more complex than 10k source files split across 10 executables, plus a handful (say, fewer than 100) of headers that represent "public" APIs. If you have 100k source files and your developers haven't all lost their minds already, then there are already some firewalls separating the various modules. Formalize those at an API level and split apart the builds, so that it's _impossible_ for anything outside of the API itself to trigger a full rebuild.
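To make that concrete, here is a minimal sketch of the firewall I mean. The module names and layout are hypothetical: each module publishes a small `include/` directory as its API, and consumers depend only on that header and the built archive, never on the module's sources.

```make
# Hypothetical layout:
#   libfoo/include/foo.h   <- public API, the only cross-module dependency
#   libfoo/src/*.c         <- private; edits here never enter app's DAG
#   app/src/*.c

# app/Makefile: app depends only on libfoo's public header and its
# built archive, so only an API change can trigger a rebuild of app.
CC      ?= cc
CFLAGS  += -I../libfoo/include

OBJS := $(patsubst %.c,%.o,$(wildcard src/*.c))

app: $(OBJS) ../libfoo/libfoo.a
	$(CC) $(CFLAGS) -o $@ $^

$(OBJS): ../libfoo/include/foo.h
```

With that split, touching anything under `libfoo/src/` is invisible to `app`'s dependency graph; only an edit to the public header forces a rebuild downstream.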
Typically this shows up in recursive make projects with lots of subprojects: it doesn't take much time to stat every file in question, but reinvoking make sixteen times can be quite slow.
I don't deal with this by not using make; I deal with it by not writing recursive makefiles.
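For what it's worth, the usual non-recursive shape is one top-level Makefile that includes a small fragment from each subdirectory. The names here are hypothetical, but the point is that make sees the whole DAG in a single invocation (one process, one stat pass) instead of re-executing itself per directory:

```make
# Top-level Makefile: include each subdirectory's fragment so the whole
# dependency graph lives in a single make invocation.
MODULES := libfoo libbar app
SRCS    :=
include $(patsubst %,%/module.mk,$(MODULES))

OBJS := $(SRCS:.c=.o)

all: $(OBJS)

# Each fragment only appends its sources, e.g. libfoo/module.mk:
#   SRCS += $(wildcard libfoo/src/*.c)
```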
> it doesn't take that much time to stat every file in question but reinvoking make sixteen times can be quite slow.
Yes, reinvoking Make repeatedly tends to force redundant `stat` calls. But I have worked in environments where heavily templated code was hosted on a remote filesystem, and every `stat` call took something like 10 ms. That adds up _extremely_ fast, even with non-recursive make. Ugh.
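At 10 ms per call the arithmetic is brutal: a mere 1,500 `stat` calls already costs the 15 seconds mentioned upthread. If you want to see how many filesystem lookups your own no-op build performs, something like the following works. This is a sketch, assuming GNU make on Linux with strace installed, and assuming `make -n` exercises the same dependency checks as a real no-op build (it does for most makefiles):

```sh
# Count filename-taking syscalls (dominated by stat() in a no-op build)
# across make and any sub-makes it spawns.
strace -f -c -e trace=file make -n > /dev/null
```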
> In extreme cases a no-op build with make can easily get to 15+ seconds.
Most developers will never see such a system. Optimizing for that kind of scale at an early stage has all the problems of any other premature optimization. The most important thing is to get the build system out of the way so you can do your real work, and you do that by writing makefiles, since makefiles are universally understood.
Now, when a project does grow to the proportions you mention, you can start looking at alternatives, but I'd argue that those alternatives should amount to more efficient ways to load and evaluate existing makefile rules, not entirely different build paradigms. Make's simplicity is too important to give up.