Non-Recursive Make Considered Harmful: Build Systems at Scale (2016) [pdf] (ndmitchell.com)
113 points by setra on May 17, 2018 | 76 comments



One should start with "Recursive Make Considered Harmful" [0] to know where this is coming from.

The best-architected use of make is in the FreeBSD build system [1,2]. If you want to experience "a system", please give FreeBSD a try.

[0] http://aegis.sourceforge.net/auug97.pdf

[1] https://www.freebsd.org/doc/handbook/makeworld.html

[2] https://www.freebsd.org/doc/en/books/porters-handbook/portin...


Another interesting paper is "Build Systems à la Carte" https://www.microsoft.com/en-us/research/uploads/prod/2018/0... which explains characteristics of some build systems "static vs dynamic dependencies; local vs cloud; deterministic vs non-deterministic build rules; support for early cutoff; self-tracking build systems; and the type of persistent build information. ... We show that we can instantiate our abstractions to describe the essence of a variety of different real-life build systems, including Make, Shake, Bazel, and Excel, each in a dozen lines of code or so"
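
(To give a flavour of what "a dozen lines" means: the paper's core abstraction is, roughly, a build task that is polymorphic over a constraint. The sketch below is written from memory, so the exact notation may differ from the paper's.)

  {-# LANGUAGE RankNTypes, ConstraintKinds #-}

  -- A task computes one key's value, given a way to fetch the values of
  -- other keys. The constraint c is the interesting part: Applicative
  -- tasks have static dependencies (Make-like), Monad tasks have
  -- dynamic ones (Shake-like).
  newtype Task c k v = Task (forall f. c f => (k -> f v) -> f v)

  -- All the tasks of a build; Nothing marks an input such as a source file.
  type Tasks c k v = k -> Maybe (Task c k v)

  -- A tiny "spreadsheet" build with only static dependencies:
  sprsh :: Tasks Applicative String Integer
  sprsh "B1" = Just (Task (\fetch -> (+) <$> fetch "A1" <*> fetch "A2"))
  sprsh "B2" = Just (Task (\fetch -> (* 2) <$> fetch "B1"))
  sprsh _    = Nothing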


... which is by three of the same authors.


Is it really worth blaming the tool? Has reimplementing make for the 50th time really improved things?

The fact is, building software that requires 500 dependencies and 500 sub-steps and 500 configuration options is going to be complicated. It's complicated in the same way that implementing an operating system is complicated. There's no way around it. The complexity is there because it's inherent in the problem.

But it doesn't have to be. Instead of spending 300 hours implementing Shake, or Rake, or Bake, or Cake, or Jake, or Take, why not spend those hours cutting down the complexity at the source? Trim your dependencies. Stop putting so many sub-steps and configurations into your build systems. Because it's the build systems with the 500 dependencies, the 500 sub-steps, and the 500 configuration options that are harmful, not the tools.


I don’t have 500 direct dependencies. I have four. But they each have five, and those have five, and those have five.

At any level, there’s probably a library that solves a large part of your problem. From your perspective, you only need “a few” of those. But from the library author’s perspective, they also only need “a few” of those. And on and on, down the chain. Each step seems reasonable, but suddenly the tree has 500 nodes (with the numbers above, 4 × 5 × 5 × 5 = 500 at the bottom level alone).

Some library authors try to make sure their library doesn’t have dependencies. But that’s how you get libraries like Qt, that take two hours to build and produce 35MB .so files.


>Some library authors try to make sure their library doesn’t have dependencies.

I can't count the number of times I've looked at a potential dependency, only to find that they have massive bugs in some auxiliary part of the library that they decided to do themselves in an attempt to "avoid dependencies".

We have these tools designed to handle a complicated network of dependencies; use them! Don't implement your own cute way of parsing XML just because you want to avoid dependencies! Let the library that is focused on just parsing XML (well tested, widely used, and actively maintained) handle XML parsing!


They solved their own problem but not everyone else's. If, as they state, complex projects present challenges for Make, then rewriting their build system in a DSL of the language they understand (Haskell) solves their own pain point but not everyone else's. Take this to its logical conclusion: every large project goes off and re-implements Make using a DSL of a language favoured by its community. This seems to me to be locally optimal but globally sub-optimal. We have to learn a new DSL per project just to tweak the build system. No thank you. It's a bad move in the meta-game.

Much better to reach out to other projects and figure out how to make Make better, no?

Sometimes I think great coders (10x coders) can work against themselves. Because they are such great coders, they are able to re-implement in situ rather than improve the stuff they get from over the fence. If Make is really that bad, make Make better in a language-neutral way.


I think it is highly unlikely that make would implement the features described in this paper. Furthermore the paper points out many fundamental design flaws in make (all variables are strings, the macro language is not a real PL) that I doubt will ever change.


Where there's a will, there's a way.


This doesn't address the question of whether the existing make community wants to move towards a proper PL


> Has reimplementing make for the 50th time really improved things?

Considering the authors did so and find it to be an improvement in terms of maintainability and usability... I'm going to say "yes". Do you think you know more about the project than they do?

I used to work on GHC. The build system is complex. Hadrian is quite an improvement in power and expressiveness (and is now capable of doing things we wouldn't have been able to implement easily with Make, since extending the prior system was too hard).

> The fact is, building software that requires 500 dependencies and 500 sub-steps and 500 configuration options is going to be complicated. It's complicated in the same way that implementing an operating system is complicated. There's no way around it. The complexity is there because it's inherent in the problem.

I get the feeling you're going to use this random truism as a springboard to make suggestions despite the fact you've never been involved in the project?

> But it doesn't have to be. Instead of spending 300 hours implementing Shake, or Rake, or Bake, or Cake, or Jake, or Take, why not spend those hours cutting down the complexity at the source? Trim your dependencies. Stop putting so many sub-steps and configurations into your build systems. Is that the sane way to do things?

That would be nice if everyone had endless time and everything was always done exactly perfectly up front. It would also be nice if you could work completely on your own and never have to interact with any other software in the world.

Binary tarballs, source distributions, upstream library dependencies, cross compilation, thousands of tests, tracking all dependencies correctly (this one alone is ridiculously hard), autogeneration tools (to save errors on tricky parts). Feature detection at compile and runtime (because your users work on some old CentOS machine and no, `pthread_setname` is not available), profiling builds, running documentation generators, handling out-of-source builds, handling relocatable builds. I can just keep listing things, honestly. All of these -- more or less -- come back to your build system.

In fact, GHC goes quite out of its way to expressly use as few non-Haskell dependencies as possible. Why? Because the ones it already has are often burdensome and complex, and we have to pick up the slack for them for every user. Nobody using your project cares if Sphinx or their rube-goldberg Python installation (spread over 20 places in /usr) was the reason doc building failed; your build failed, that's all that matters. You've still got to figure out what's wrong, though, for your user. And not wanting new dependencies has been a common reason to reject things -- I myself have rejected proposals and "features" to GHC on this basis alone, more or less. ("Just use libuv!" was a common one that sounded good on paper and never addressed any actual issues we had that it claimed to 'solve'.)

As a side note, it really just amazes me the number of people who see any amount of non-trivial work in some project and immediately ask "well, why don't you just do <random thing that is completely out of context and has no basis in the project's reality>". Seriously, any time you think of this stuff, please -- just give it like, 10 more seconds of thought? You'd be surprised at what you might think up, what you might think is possible. It's not the worst thing in the job, but being an OSS maintainer and having to deal with analyses that are, more or less, quite divorced from the reality of the project is... irritating.


> Considering the authors did so and find it to be an improvement in terms of maintainability and usability... I'm going to say "yes". Do you think you know more about the project than they do?

Every single self-described make replacement project makes the exact same claim, verbatim. Yet, when these projects start to see some use in the real world... Cue all the design shortcomings and maintainability and usability problems.

We're about 4 decades into this game. Perhaps this time everything is different. Who knows. Odds aren't good, though.


I think that being the build system for the Glasgow Haskell Compiler - which is the most commonly used compiler for Haskell - counts as "some use in the real world." I downloaded the source, did `git ls-files | xargs wc -l > wc.out`, then `grep "total" wc.out`, summed the totals, and it comes to 1051451. That's an overestimate in lines of code, as there's certainly documentation in there, but there's about 620,000 lines of Haskell.


Great, someone managed to get a build system to work for a project. That's nice. I'm sure there are a bunch of cases where even hand-written makefiles are being used to the same effect. Does that mean that any of those tools are free from any design issue or maintainability problem?


That's a good question! I think a good way to figure that out is to publish a paper about what they did, how it solved their non-trivial problem, and then invite others to try to use their tool and techniques to solve their problem.

In other words, you're asking a question that criticizes the mechanism that would answer that question. It's a legitimate question, but a poor reason to disregard what they did.


Well put. Parent seems to be the classic middlebrow dismissal.


Or perhaps parent is aware that the complexity of software is often a product of bad design?


Maybe my reference to "parent" was unclear. I was referring to the top-level comment.

For those who don't know, aseipp is/was a major contributor to GHC and will be intimately familiar with the build system of GHC. His observations are on point.

Relatedly, I'm currently also fighting with 3-4 different build systems which are "classics" of the genre and yet are broken in subtly different and interesting ways.

So... yeah.


I don’t understand this.

Software is always complex, it’s just that the complexity is gradually hidden from the developer by the use of libraries.

Until those libraries get baked into the standard library of whatever you’re using, you’re going to have to implement complexity yourself, or use dependencies.

Unless you’re scripting, doing something entirely within your language’s or OS framework, or implementing everything yourself (hello complexity), you’re going to hit complexity and dependencies very, very early.

The only time I’ve seen this avoided is in the embedded space where you physically don’t have enough bits to get complex.


Well that's just it. Embedded forces people to make different design decisions. We only have this mountain of shitty code because we've given ourselves enough rope to hang ourselves with.

We got to the moon with a computer less powerful than my microwave. My old smart phone worked just fine without 4 gigs of RAM and 32 gigs storage, and now this monstrosity in my hand is running out of resources? It doesn't have to be this way.


> We got to the moon with a computer less powerful than my microwave.

Can that computer show a GUI with multiple videos playing simultaneously surrounded by UI elements, where multiple peripherals (mouse, touchscreen) can control their display area, all the while running two compilers (C++, Scala), and incidentally also running a Virtual Machine, etc. etc.?

"Get to the moon" is an absurdly simplistic way to view complexity and it does your argument no favours.

(That's not to underplay getting to the moon. It's an amazing achievement, but if you look at the resources/humans poured into the project, it's actually not that amazing that it was possible.)


How dare you imply my Jake Take Shake'n Bake Rake Cake (JETSBRiCK) stack is too complicated!


Functional build systems are a great idea. The main problem I had implementing such a system was the file system. It's tough to capture all inputs and outputs, even with the best of intentions.

The other problem is integration. You can't expect all 3rd party projects to suddenly adopt your build system, so eventually you have to invoke `configure`, `make`, `cmake`, `pkg-config`, `xcode`, and so on. While you can satisfactorily capture most of these inputs and outputs, it's non-trivial to do it completely; at some point you have something that works and it's good enough.
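
(In Shake, for what it's worth, the pragmatic version of "good enough" tends to look like the sketch below: track the third-party sources coarsely and shell out to their own build system. The paths and library are made up for illustration.)

  import Development.Shake
  import Development.Shake.FilePath

  main :: IO ()
  main = shakeArgs shakeOptions $ do
      want ["vendor/libfoo/libfoo.a"]

      -- Build a vendored autotools library by invoking its own
      -- configure/make, depending on its sources only coarsely.
      "vendor/libfoo/libfoo.a" %> \_out -> do
          srcs <- getDirectoryFiles "vendor/libfoo" ["//*.c", "//*.h", "configure"]
          need ["vendor/libfoo" </> s | s <- srcs]
          cmd_ (Cwd "vendor/libfoo") "./configure"
          cmd_ (Cwd "vendor/libfoo") "make"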


I mean... Nix and Guix build entire systems off of the idea of capturing every single dependency in the build system, and they work in production


They are both interesting systems and I agree they try to capture every single dependency.

Now try adding Ragel into the mix, which doesn't have `-M` or any other kind of dependency output. There are plenty of other tools like this. Practical applications eventually run into limitations like this as they grow.


> I agree they try to capture every single dependency.

No... they do capture every dependency. They don't try. They do. Try getting something to build that has dependencies outside the nix store. You can't, because you can't access the local file system, the internet, or anything else while putting things into the nix store.

Nix doesn't care about `-M` output. It's a package manager that captures dependencies. If your package's build system doesn't have `-M`, then you must declare your dependencies yourself. This is how people used to do it before `-M`.


Do they even capture things like the C++ headers that your compiler uses? I think that's what he meant. When you compile a C++ file the compiler goes off and reads all kinds of files that are totally unknown to your build system.

Simpler languages like Go don't really have this problem, but they also have sane build systems so I don't know why you'd need CMake or whatever.


This is a long-ago solved problem, and the solutions are only getting better. GCC has the -M family of options which are explicitly designed to feed the header dependencies into the build system, and these flags have been around since the dark ages. More modern systems like Bazel will sandbox the build so if you try to #include a file that's not listed as a dependency you'll just get a compiler error. If you want, you can specify the compiler itself as a dependency.
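
(Shake, incidentally, consumes exactly this kind of -M output; roughly the following, going from memory of the Shake manual's C example, so treat the details as a sketch rather than gospel.)

  import Development.Shake
  import Development.Shake.FilePath
  import Development.Shake.Util

  main :: IO ()
  main = shakeArgs shakeOptions $ do
      want ["_build/main.o"]

      -- Compile a .c file and record the headers gcc actually read,
      -- by asking gcc for -M-style output and feeding it back in.
      "_build/*.o" %> \out -> do
          let src = dropDirectory1 (out -<.> "c")
          let dep = out -<.> "m"
          cmd_ "gcc -c" [src] "-o" [out] "-MMD -MF" [dep]
          neededMakefileDependencies dep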

Hermetic builds are the way to go for a ton of reasons.


It is a solved problem, but your compiler does not in fact implement an adequate solution with that mechanism.

* https://news.ycombinator.com/item?id=15044438

* https://news.ycombinator.com/item?id=15060146


> Do they even capture things like the C++ headers that your compiler uses?

Yes. Nix is a package manager that drives multiple build systems and captures all dependencies: https://nixos.org/~eelco/pubs/phd-thesis.pdf


They are not necessarily totally unknown. If your build system also handles dependencies, then it might be the one providing those header files in the first place.

The problem comes when some part of your toolchain wants to access something outside of the build environment.


Yes, because, IIRC, they use their own compiler installation with a known set of files.


That's not at all how it works. The compiler emits dependency files that the build system can then read in on the subsequent passes to know if any dependencies changed. There's no special knowledge of "these are all of the headers in the world we care about".


It is patently how it works. All the files the build cares about, not just headers, have to be available in the build chroot.


I disagree. The compiler knows about possible include paths from the -I directive. Any include is searched using these. There are a few assumed directories on Unix such as "/usr/include", but the compiler doesn't inherently know about "all includes". If you doubt me, run strace on a compilation and just look at how it finds headers. It doesn't know anything; it searches the include paths.


I hadn't noticed this comment at the time. I said "patently" as in easy to check even if you think the docs are fiction, so for archives' sake:

GNU's not Unix and there isn't a /usr/include in the build chroot; see the manual about that and profiles.

  $ cat x.c
  #include <stdlib.h>
  $ guix environment gcc
  $ gcc -Wp,-H x.c |& grep stdlib.h
  . /gnu/store/fc363b1hsid7pfdxh18m4a1i1r04i5fl-profile/include/stdlib.h
  $ readlink . /gnu/store/fc363b1hsid7pfdxh18m4a1i1r04i5fl-profile/include/stdlib.h
  /gnu/store/4sqaib7c2dfjv62ivrg9b8wa7bh226la-glibc-2.26.105-g0890d5379c/include/stdlib.h


Yes, of course?


    While we have demonstrated that our approach works, 
    we have not yet implemented all features of the build
    system, and hope to do so over the next few months
This is a pretty major caveat. Almost damning in its significance, honestly. There are plenty of "works, but haven't quite implemented all of the old features" projects littering the world. I love that there are learnings here, and those should be seen as the most important artifact of any project. I do wish there were paths to get those learnings back to the old systems, though. :(


Table 1 in the paper looks promising, though. Let's hope their estimation is correct:

We implemented a new build system for GHC from scratch using Shake and our build abstractions from §5. The new build system does not yet implement the full functionality of the old build system, but we are currently addressing remaining limitations; nothing presents any new challenges or requires changes to the build infrastructure.


Well, it's in place now for GHC 8.6; ref: https://ghc.haskell.org/trac/ghc/wiki/Status/Apr18

https://github.com/ghc/hadrian

It's mostly complete, and as someone who has used GHC's make system and likes make, this build system is miles better and isn't a Cthulhian horror.


Those guys are no amateurs; they work for Standard Chartered, one of the largest industrial users of Haskell, which has been using Shake as its build system for a long time now. They have their own Haskell compiler and a >1 million line Haskell codebase.


I'm not performing an ad hom. Nor do I want an appeal to authority. I fully wish them the best of luck in making a better build system, and I look forward to things they learn making it to other systems out there.

That line caught my attention pretty hard. It is mentioned in passing, and I have been part of many failed systems that were able to say the same thing. :) That is to say, I'm making my comment from experience. Not a desire for them to fail or give up.


To validate our claims, we have completely re-implemented GHC’s build system, for the fifth and final time.

Famous last words.


I chuckled -- given the authors, I think this was intentional irony.


Build, version control and package management: three problems perennially in search of a definitive solution.

For those who are sure we already have a definitive solution to one or more of these, the problem is in persuading everyone else, especially those who think something else is it.


Here are multiple examples of “definitive” build systems, by your definition. Many are over a decade old, and haven’t been standing still. These days, they generalize to many, many languages, and are working towards byte-for-byte reproducibility of the build artifacts:

https://reproducible-builds.org/who/


So this is a dumb question, but isn't the question of reproducible builds made trivial by vendoring dependencies? Why is it treated like this holy grail when it should be dirt simple? I must be misunderstanding something.


It's not a dumb question, but unfortunately, we live in a dumb^H^H^H^H interesting world.

Check out all the links under this heading: https://reproducible-builds.org/docs/#achieve-deterministic-...

Many legacy systems capture all sorts of nondeterministic values -- from the build date (it might be a desire to be "helpful", but it breaks reproducibility) to accidentally depending on the order of inodes on your file system!

All of these problems are solvable. It should be dirt simple. It's just a "small matter of programming" :)


Meh.

Linux and BSD build systems deal with most of these issues, usually with wide support from a variety of recursive makes. Though RPM and DEB honestly suck and never really tried to solve these issues automatically. It still drives me nuts that packages are tainted by the 'gold' systems they are built on. The complexity of build systems means very few minds are up for it, and most solutions are naive and end up with tons of patchy exceptions and workarounds.

ROCK Linux supported cross-compiler capabilities, auto-detection of build parameters, and dependency library tracking. (I was working on automated dependency ordering and QEMU-based full cross builds before I got a real job.) It was very robust, and outside of package developers breaking their own builds, it worked solidly. No idea what cool things T2 Linux got up to after ROCK, but maintaining a fresh build system is hard. Build systems are always going to be fragile systems with complexity. The paper seems to be a survey of what they learned vs. definitely having any solution.


I'd like to issue a blanket ban on the "considered harmful" thing, as well as sharing print-optimized PDFs on the web for people on computers to read.

Both things are archaisms that can be easily avoided, and in the particular case of this article, the 2nd part of the title works just as well as the title.


I am not generally in favor of bans, but if it came to that, I would first like to see a ban on complaints about the style and format of interesting material made freely available by the people who have already put considerable effort into creating it.


Style and format are important, though. Would you like your docs written in haiku, or as a single three-hour light opera video?

Embracing an archaic grandstanding pose and an academic look when much more easily grokkable formats are available just doesn't help get your message across that well. And when there's one of these roughly monthly, I'd personally love to at least see fewer of them.



Dropping in with my make horror story. I once had the pleasure of interacting with a 20k-line makefile (it builds any one of 15 or so projects for any platform, because "splitting up the makefile would lead to code duplication since a lot of the makefiles would be similar"). I'm told that makefile is over 40k lines today, only a few years later.


>"splitting up the makefile would lead to code duplication since a lot of the makefiles would be similar"

So, they've never heard of templating?


Sorta reinventing the wheel?

https://nixos.org/nix/


One could say the same thing about nearly the entire nodejs ecosystem. Reinventing the wheel, and usually with lower quality to boot.

Sometimes it's okay to make something with redundant functionality.


How is it reinventing the wheel in that respect? Nix and Guix don't subsume the basic build systems of their packages.


My colleague @ejholmes wrote a cool tool that borrows heavily from `make` called `walk` (https://github.com/ejholmes/walk). It's written in Go and uses a graph for dependencies, so that tasks at the same horizontal level in the graph may be run at the same time. We use this at remind.com to significantly speed up our multi-layer AMI build times.
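
(The core idea is small enough to sketch; this is a toy illustration of level-by-level parallelism, not walk's actual implementation, and the target names are made up.)

  import Control.Concurrent.Async (mapConcurrently_)

  -- Run each "level" of a dependency graph in order, but run all the
  -- tasks within a level concurrently.
  runLevels :: [[IO ()]] -> IO ()
  runLevels = mapM_ (mapConcurrently_ id)

  main :: IO ()
  main = runLevels
      [ [putStrLn "build ami-base", putStrLn "build ami-tools"]  -- independent
      , [putStrLn "build ami-app"]                               -- depends on both
      ]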


I've only used make a very little bit... but pretty much anyone can understand it for basic use.

This looks like you're writing source code in another language that follows a kind of template, which you then compile and run, and then does stuff and is extremely complicated.

Seems like a failure to me? Shouldn't something "better" be equally simple or simpler?


Example from the paper of Makefile code used in GHC:

  $1/$2/build/%.$$($3_osuf) : $1/$4/%.hs $$(LAX_DEPS_FOLLOW) \
          $$$$($1_$2_HC_DEP) $$($1_$2_PKGDATA_DEP)
      $$(call cmd,$1_$2_HC) $$($1_$2_$3_ALL_HC_OPTS) \
          -c $$< -o $$@ \
          $$(if $$(findstring YES,$$($1_$2_DYNAMIC_TOO)), \
              -dyno $$(addsuffix .$$(dyn_osuf),$$(basename $$@)))
      $$(call ohi-sanity-check,$1,$2,$3,$1/$2/build/$$*)

I suspect it's easier to master Haskell than this.


This just feels like a bloody straw man, though. Poorly written file in X not as good as well written file in Y. News at 11. :)

Substitute "Clever" for "Poorly", if you'd prefer.


Would you like to show us a well written version of that rule?


First, I probably couldn't even do a well written version of the rule in any language. :) Not exactly my strength.

Second, sometimes the best trick you can do to make a build system cleaner, is to change the system it is building.

Though, as I stated elsewhere in this chain, I don't necessarily mean to dismiss this effort. Just showing me the gymnastics required to do something that most people don't care to do doesn't convince me the tool is unsuited to its context. (Fun metaphor, actually. The fact that my house isn't designed to allow easy gymnastics practice in the living room is not a criticism of the house or of gymnastics. Showing me that a cartwheel will get you hurt there is not really showing me anything relevant.)


It's not basic use that's the problem.


It is interesting that none of their examples of why make is considered harmful involved writing an idiomatic make file.

In fact, all of their examples of how bad make is involve the use of non-standard makefile generators, or non-standard extensions to make itself.

It is sort of like arguing that JavaScript is harmful, and then showing snippets of asm.js code to talk about how “tragic” the language syntax is.

Alternatively, I could argue that no one should use C because C++ template metaprogramming is too opaque, and, as further proof, my use of a non-standard preprocessor I implemented leads to 10,000’s of lines of deeply nested macro invocations.

(There are all sorts of problems with make, but I’m not convinced the authors actually understand them in enough detail to improve on it.)


This looks like a good stab at avoiding all the pitfalls of Make while still providing the same or better capability.


It looks great, but most make-like tools are not functional. So yeah, functional everything is better, but nothing is functional right now.


Make itself is largely a functional/declarative language. Besides the recipes (which aren't Make but actually shell, though you can use Haskell if you want), the only non-functional feature I can think of is global variables, which most makefiles generally don't modify in a non-lexical manner anyway.


I was talking about tools like Rake in Ruby and SCons in Python; they are not declarative at all. But more importantly, if what you are building needs you to set env variables etc., I don't see how it could really be done in a functional way (without side effects).


Stick the ENV in a state Monad, and run the process with that ENV as the context?
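
(A Reader is usually enough here, since the environment rarely needs to change mid-build; a minimal sketch, with made-up variable names:)

  import Control.Monad.Reader
  import Data.Maybe (fromMaybe)

  type Env = [(String, String)]

  -- A build step is just a function of the environment: same Env in,
  -- same command out, no global mutation anywhere.
  compileCommand :: Reader Env String
  compileCommand = do
      cc    <- asks (fromMaybe "cc"  . lookup "CC")
      flags <- asks (fromMaybe "-O2" . lookup "CFLAGS")
      pure (unwords [cc, flags, "-c", "main.c"])

  main :: IO ()
  main = putStrLn (runReader compileCommand [("CC", "gcc")])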


There is no real monad in Ruby. That's why I feel like the article is missing the point.


What does this have to do with anything?

It's not like new build systems have to follow the paradigms of the old ones. Why do you think they write a new build system in the first place?


If you use a functional tool on top of a non-functional one, you will end up with a non-functional tool. And Make is not functional; whatever you put on top of it won't make it functional.


I thought they replaced make, not used their tool on top of it…

And even then, make is mostly declarative.

And even then, Haskell is totally implemented on top of imperative tools (formerly C, C--, and now LLVM bytecode). It's still a functional language.


The Haskell standard library mostly uses Haskell, NOT C or C--. If you use C to implement your language, any small change in C can strongly impact yours; nobody wants that. By the way, the purpose of C-- is GHC (because you don't want to be tightly coupled with C).

All variables in make which are not target-specific are global. That's 100% non-functional.

> And even then, make is mostly declarative.

The Docker API is declarative but the reality is not, so everybody somehow uses the docker shell to do dirty stuff.



