
> I agree that Make should be retired, and yet I'm typically the "Makefile expert" wherever I work.

I'm the Makefile expert in my software shop. I still use GNU Make simply because I haven't found anything "better enough" to justify switching a toolchain. CMake was probably the closest--and that's mostly because I know that it works for KDE, so I should be able to learn by example.

What sort of complexity do you think shows Make's problems? It's just a big dependency graph, so IMHO the hardest part is defining the dependencies in a way that's accurate (for parallelism and incremental builds) without being redundant. Multi-versioned builds shouldn't really add any complexity beyond a single variable per input dimension. Complex serial sub-processes can easily be factored out into scripts. Platform detection, likewise, can easily be factored out into a combination of scripts and multi-versioned builds.
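For instance, a "variable per dimension" build looks roughly like this -- a sketch with invented flag and directory names, not anything from a real project:

    # The single "dimension" variable: release or debug.
    BUILD ?= release

    ifeq ($(BUILD),debug)
      CFLAGS += -O0 -g
    else
      CFLAGS += -O2
    endif

    # Keep each variant's objects in its own tree.
    OBJ_DIR := build/$(BUILD)

Then "make BUILD=debug" selects the other variant without touching anything else.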

So where have you seen it suck the most? I like to think I've done reasonably complex things with it, but maybe not.




>So where have you seen it suck the most?

When people (myself included) start taking advantage of the fact that Make is Turing-complete and writing arbitrary "programs" in their Makefiles.

It typically starts simple; you want to do something like build ALL the files in a folder, so you use a wildcard. Then you want to add dependency checking, so you use the same wildcard machinery to convert from .o to .d, keeping the same folder structure.

And I don't want the .o and .d files to be generated where the .c files live, so I need to add this code here that converts the paths.
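At that stage it's still reasonable -- something like this sketch (directory names invented for the example, mkdir bookkeeping left out):

    SRC_DIR := src
    OBJ_DIR := obj

    SRCS := $(wildcard $(SRC_DIR)/*.c)
    OBJS := $(patsubst $(SRC_DIR)/%.c,$(OBJ_DIR)/%.o,$(SRCS))
    DEPS := $(OBJS:.o=.d)

    # -MMD -MP writes a .d file next to each .o as a side effect of compiling.
    $(OBJ_DIR)/%.o: $(SRC_DIR)/%.c
            $(CC) $(CFLAGS) -MMD -MP -c $< -o $@

    -include $(DEPS)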

OOPS, this project uses a slightly different folder structure, and so I need to add a case where it looks in DIFFERENT relative paths.

Oh dear; I just realized that I need this code to work ALMOST the same in a different project that needs to be built with the same Makefile; that means I need to include it twice, using different options each time.

And it turns out that it DOESN'T work the way I expect, so now I have to use $(eval), meaning some of my Make variables are referenced with $(VAR), and some with $$(VAR), depending on whether I want them to grab the CURRENT version of the variable or the calculated version.

But now, now I have all of my code to create my project in one convenient place, and creating a new project Makefile is quite trivial! It's all very clean and nice. But the next person who tries to change the folder structure, or otherwise tries to get this now-crazy-complicated house of cards to do something I didn't anticipate, has to become just as adept at the subtleties of $(eval ...) and Makefile functions (define ...); the error messages when you get things wrong tend to make early C and C++ compiler errors look straightforward and useful by comparison.
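To give a flavor of what that ends up looking like (project and variable names here are invented, not from any real build):

    # A template instantiated once per project via $(eval $(call ...)).
    define PROJECT_template
    $(1)_SRCS := $$(wildcard $(1)/*.c)
    $(1)_OBJS := $$(patsubst %.c,%.o,$$($(1)_SRCS))

    $(1): $$($(1)_OBJS)
            $$(CC) -o $$@ $$($(1)_OBJS)
    endef

    # $(1) is expanded now; anything written as $$(...) survives until $(eval).
    $(eval $(call PROJECT_template,projA))
    $(eval $(call PROJECT_template,projB))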

For a far more complicated example, take a look at the Android NDK Makefile build system. 5430 lines of .mk files that make your life very easy...right up until you want to do something they didn't anticipate, or until they ship a version with a bug (which they've done several times now) that screws up your build.

Here's one small excerpt for your viewing pleasure, just to get the flavor:

http://pastie.org/6331932


> some of my Make variables are referenced with $(VAR), and some with $$(VAR), depending on whether I want them to grab the CURRENT version of the variable or the calculated version.

Hah, my latest Makefile work has been a set of functions which generate Make-syntax output, which then gets $(eval)ed. I hear you on the debugging nightmare that this can be: does a given variable get resolved when the function is first $(call)ed, when the block gets $(eval)ed, or when the recipe is invoked? But IMHO it's not too bad to do printf-style debugging. Replace $(eval $(call ...)) with $(error $(call ...)), then work backwards from there.
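Concretely, the trick is just this (gen_rules and module are placeholder names):

    # Normal path: generate the rules and evaluate them.
    #   $(eval $(call gen_rules,$(module)))
    # Debug path: dump the generated text and stop, so you can read exactly
    # what would have been eval'ed.
    $(error $(call gen_rules,$(module)))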

It also helps to be very disciplined about immediate assignment (`var := stmt`) and to always use recipe-local variables, rather than global variables.

I do feel like this whole aspect would be cleaner in Python or Lua... but the problem is, the _rest_ of the build, which more people interact with on a daily basis, gets more complex when that happens. Because there are always the ancillary targets and recipes where normal Makefile syntax works just fine.

Thanks for the NDK reference, I'm interested in seeing other "ugly" Makefile support infrastructure for comparison :)


I also use "printf debugging"; I have to.

The worst problem I had, though, was REALLY annoying; I was getting an inscrutable error in the middle of a function, and I could delete large parts of the code to get the error to go away, but putting ANY of the code back brought the error back -- it didn't matter which parts I put back.

It turned out that git had changed LF to CRLF in the file, and some end-of-line character was screwing up the spacing. Tweaking .gitattributes and fixing the files made everything work.
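For reference, the .gitattributes fix amounts to something like this (patterns are illustrative; adjust to the repo's layout):

    # Keep Makefiles LF-only so stray CR characters don't end up in recipes.
    Makefile  text eol=lf
    *.mk      text eol=lf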

I SO hate significant white-space. I never really forgave Python for that "feature" either. But I could totally get behind Lua for the logic. :)

Actually, if it were my job, I would use LuaJIT to write a make replacement; the dependencies could all be specified in tables or extended strings, and any more complicated logic could be explicitly outside of the "rules".

>but the problem is, the _rest_ of the build, which more people interact with on a daily basis, gets more complex when that happens

I think a good design would NOT have that problem. You could have it say "these files get built by default rules" separately from "these rules trigger this bit of Lua code, which can spit out warnings, add dependencies dynamically (oh wouldn't THAT be nice!), or do this other bit of complicated build processing that doesn't fit well into the rule-based system".

If you're doing it in Makefiles, then yes, you could make everything more complicated that way. But I think a fresh design could really do a good job in killing make. I'm just so busy with other things right now, though...

Another reason I would STRONGLY choose Lua over any other scripting system is that the entire tool can embed Lua trivially, while Python or Ruby or Perl would each bring an entire ecosystem with it. You can have a dozen different Lua installs on your system without requiring a separate infrastructure for managing Lua installs.


> Another reason I would STRONGLY choose Lua over any other scripting system is that the entire tool can embed Lua trivially, while Python or Ruby or Perl would each bring an entire ecosystem with it.

Oh yeah, I like that idea. I'm just not thrilled when modern build systems require me to install recent versions of relatively bulky scripting languages. I'm not a big fan of Lua in general, but this sounds like a perfect application.


So what? You're doing it wrong.

There are lots of people who use PHP for data crunching, bash for GUIs, and C without caring about managing memory properly.

Should we retire all languages that can be abused?

Also, your idea of how $ and $$ work is wrong. But it could be that you were messing up = and := before that point :) so I guess your point stands. But again, all languages can be abused.

Blame the bad coder, not the tool.


> also, your idea of how $ and $$ is wrong

No, he's spot-on about that. If you are using a function from within a Makefile to generate Make code which then gets $(eval)ed, then the inner function must output $${variable} so that the outer expansion sees ${variable} and does not immediately resolve it.
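A minimal illustration (variable and target names made up):

    define print_rule
    print-$(1):
            @echo $(1) = $$($(1))
    endef

    # After $(call), $$($(1)) has become $(CFLAGS); $(eval) then defines the
    # rule, and that reference is only resolved when the recipe actually runs.
    $(eval $(call print_rule,CFLAGS))

Running "make print-CFLAGS" echoes whatever CFLAGS holds at recipe time.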

It's hairy. Hairier than macros in C. But like any other specialization, it can potentially save an immense amount of time for the rest of the team.


> messing up with = and :=

Sorry, but I wasn't messing up those two. That's Makefile 101 knowledge; I'm talking about crazy advanced stuff, where := doesn't work the way you expect.

Even := doesn't do what you want here: if you assign to the same variable with := more than once, every reference to it in a recipe sees the last assignment, because recipe lines are only expanded after the whole Makefile has been loaded. Here's an example:

    FOO:=1

    rule1 :
        echo $(FOO)

    FOO:=2

    rule2 :
        echo $(FOO)

make rule1 and make rule2 both echo 2. $(FOO) is evaluated in both cases AFTER the Makefile is loaded.


Target-specific variables do this job:

    rule1 : FOO:=1
    rule1 :
        echo $(FOO)

    rule2 : FOO:=2
    rule2 :
        echo $(FOO)


Interesting. Didn't know this trick.

Turns out it wouldn't work for the usage pattern I needed (my example was simplified -- typically the variable settings would all happen in another file, and they couldn't go on a target line because there wouldn't be a single target to use, quite apart from being ugly for that purpose), but it's good to know.


> I still use GNU Make simply because I haven't found anything "better enough" to justify switching a toolchain.

For what it's worth, as much as I rag on Make, I also find myself using it most of the time. To paraphrase Churchill, it's the worst build system imaginable, except for all the others.

I would just really, really love a system like Make crossed with an imperative language for everything that doesn't fit well with the auto-dependency-tracking model. There's a product in there somewhere.



