
To use the article's terminology, I "flipped the table" about a year ago on build tools, when first learning React and ES6. Every tool in the JS world seems to want to be the center of its own ecosystem where all the other tools are just plugins to it. I was sad to see Babel go this way.

Like the author, I have something besides Javascript on the backend. In my case it's Python. Rather than use a Python-specific build system, however, I've had success using a Makefile as the top level build tool, which can directly call out to npm, pip, babel, uglify, sass, etc. There is no Gulp/Grunt/Broccoli/Webpack/Browserify in sight, and that makes me happy. I'm pretty sure Make will be around long after people have forgotten about Grunt.
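
To give a rough idea, here is a stripped-down sketch of that kind of Makefile. The target names, paths and flags are illustrative rather than my exact setup, and recipe lines need real tabs:

    all: dist/app.js dist/app.css

    # reinstall dependencies when package.json changes
    node_modules: package.json
        npm install
        touch node_modules

    # transpile with Babel, then minify with UglifyJS
    dist/app.js: $(wildcard src/js/*.js) node_modules
        mkdir -p build dist
        ./node_modules/.bin/babel src/js --out-dir build
        ./node_modules/.bin/uglifyjs build/*.js -o $@

    # compile SCSS with node-sass
    dist/app.css: $(wildcard src/scss/*.scss) node_modules
        mkdir -p dist
        ./node_modules/.bin/node-sass src/scss/app.scss $@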




It's funny to watch the Node.js community, because it's like watching the years 1998-2006 in fast forward, except that when a framework reaches the maturity tipping point, instead of consolidating around it the community ejects it.

I was trying to pull some dependencies a while ago and they were failing to build because at some point grunt decided to change the build file name.

I'd usually be fine with that, except the only way these tools work is to install them globally, so they can't coexist even with themselves.

Some of the tooling is so broken and awkward that people have gotten used to committing their /dist/ to GitHub, so at least I can skip Bower, Grunt and all that and just grab the minified file for many of the libraries I need.


> I'd usually be fine with that, except the only way these tools work is to install them globally, so they can't coexist even with themselves

This is not true; the node community just doesn't advertise it for some reason. You can npm install a specific version of grunt-cli locally and run it with ./node_modules/.bin/grunt
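
For example (the task name is just a placeholder):

    npm install grunt grunt-cli          # installs into ./node_modules, not globally
    ./node_modules/.bin/grunt build      # "build" is whatever task your Gruntfile defines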


For some reason, a few years ago somebody got the bright idea that installing globally was a good idea. It became a "best practice" even though you can really run into problems with dependencies of dependencies. It really is important never to install globally if you want to keep control of your dependencies. It's also very important to delete node_modules every time you update your package.json file -- especially when deploying. That is the only way you will be able to control what versions of things you have and to track down problems. I like working with Node, but for things like this it still suffers from some immaturity. "It works for me" syndrome is very common.
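
Concretely, whenever package.json changes (and always when deploying), something like:

    rm -rf node_modules
    npm install     # rebuild the tree from scratch so you get exactly what package.json says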


+10 for this advice. Another tip is to always lock your dependencies, either with precise static version numbers or with npm-shrinkwrap.
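
For example (the package name is a placeholder):

    npm install --save --save-exact some-package   # records an exact version instead of a ^range
    npm shrinkwrap                                  # pins the whole tree, transitive deps included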


You did the right thing. Software should be built with fundamental tools, not with frameworks and libraries that will be gone in two years. Make is never going away, it's so fundamental that we'll be using it in 20 years.


> Make is never going away, it's so fundamental that we'll be using it in 20 years.

I really hope not.

There is plenty to criticise about recent build tools for web projects. Many are absurdly over-engineered. Most have very short working lives. Almost all seem to have a crazy number of dependencies of their own drawn from a fragile ecosystem.

But there is also plenty to criticise in traditional Make. The archaic and cumbersome syntax means makefiles are hard to read, analyse and maintain. The completely static nature of a makefile also means it is ill-suited to rapidly evolving codebases where new files come and go almost by the minute as we refactor. We can do much better than Make.


I've got to disagree. After dealing with npm and grunt in an enterprise environment, MSBuild, Make and CMake start to look very very attractive.


Actually I agree with you there. I just think that's a little like saying after watching your kid's new favourite cartoon for the sixth straight time on Christmas day, the evening family movie you first saw in 1987 starts to look very attractive.


Well yeah, but continuing to use npm might be Stockholm Syndrome at this point!


I'm not sure what your specific gripe is with npm. It's one of the fastest and most capable package managers out there, is easily better than Python's Pip (which doesn't resolve dependencies) [0] and Go's go-get silliness for example.

If your problem is just 'lots of dependencies' that may break from under you, npm has a solution for that: 'npm shrinkwrap'.

[0] https://github.com/pypa/pip/issues/988


> The completely static nature of a makefile also means it is ill-suited to rapidly evolving codebases where new files come and go almost by the minute as we refactor.

Make is so static that there was a Lisp interpreter written in make. Oh wait...

There are several mechanisms in make to have your rules dynamically adapt to your codebase, e.g. $(shell ...), $(wildcard ...), and pattern rules (and that's not all of them).

You just need to put in a little effort and actually read the documentation, rather than stopping after one of the plethora of tutorials that end once they've shown variable usage.
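
A small sketch of the kind of dynamic behaviour I mean (paths are illustrative, and recipe lines need real tabs):

    SRC := $(wildcard src/*.js)                     # whatever sources exist when make runs
    OUT := $(patsubst src/%.js,build/%.js,$(SRC))

    all: $(OUT)

    # pattern rule: applies to any src/X.js, including files added tomorrow
    build/%.js: src/%.js
        mkdir -p build
        ./node_modules/.bin/babel $< -o $@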

> We can do much better than Make.

You mean, "we can do much better than my `make' knowledge". Of course we can.


Please be careful about assuming someone with a different opinion to yours is speaking from ignorance. I've been using Make for at least 20 years on a variety of projects and platforms. I'm well aware of what it can do.

Its syntax still looks like the love child of Perl and Haskell, so it is still unnecessarily difficult to write correct makefiles and to read, understand and modify existing makefiles.


But that's a totally different charge from "make is static by nature, so you need to adjust it every time you add something".

`make' may have awful syntax, though I find it much more pleasant for describing a build process than anything else on the market, most of which use general-purpose languages and end up terrible to read.


OK, let's look at a practical example, something that frustrates me about many current web-related tools.

Tools like Sass and Watchify can monitor for file changes and automatically regenerate SCSS or Browserify outputs when known source files are modified. They also employ caching techniques to make that regeneration efficient, instead of rebuilding everything each time. This functionality is very helpful, and it is available with a simple shell command that works on any major platform.

However, at the moment such tools typically don't notice when new source files are added and automatically start tracking them as well; this generally remains true even if the new file is imported by an existing known file. Similarly, such tools may break if a file that was being watched gets deleted.

In practice, this means you get automatic and efficient rebuilds as long as you are only changing known files, but you probably have to restart a watch process manually if you add or delete files. Fixing this would be high on my wish list for a modern web build tool, given what we already have.

How would you set Make up to solve this problem?


First of all, I would avoid wildcards in compilation rules, using $^ instead, so any leftovers wouldn't hurt. If the compilation tool is too stupid, I would build a list of expected files from the known/present sources ($(wildcard ...), $(foreach ...), $(patsubst ...)), and then call `rm' before calling the dumb tool:

  rm -f $(filter-out $(EXPECTED),$(FOUND))
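
Something along these lines, with illustrative names:

  SOURCES  := $(wildcard src/*.scss)
  EXPECTED := $(patsubst src/%.scss,dist/%.css,$(SOURCES))
  FOUND    := $(wildcard dist/*.css)

  clean-stale:
      rm -f $(filter-out $(EXPECTED),$(FOUND))   # remove outputs whose source is gone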


I appreciate the reply, but unless I've completely missed your point somehow, you didn't actually answer my question there. What I'm looking for is something akin to

    watchify src/index.js -o dist/index.js -t ...
or

    sass --watch src/index.scss:dist/index.css
that handles new or removed files within the source tree gracefully, not just modified files that are already known when you run the command.

Whether it's monitoring for changes like these examples, or hot reloading and browser syncing, or IDEs that rebuild incrementally in the background so they can give immediate feedback, modern tools that actively monitor for relevant changes and respond automatically are noticeably more efficient to work with than traditional tools that run once on demand. Make belongs to a time that has passed, and we could be more productive by keeping the concepts that are as relevant as ever but building them into tools with the more dynamic behaviour we enjoy elsewhere today.


Make is actually really simple; the problem is that people approach it like an imperative programming language.


You and I have very different definitions of "really simple", I think.

The manual for GNU Make contains around 80,000 words. That is roughly the length of an average novel.


GNU make isn't make, but it is still simple: it's composed of targets and dependencies. You don't understand every detail of how, say, a car works, yet you still drive it.


>Every tool in the JS world seems to want to be the center of its own ecosystem where all the other tools are just plugins to it. I was sad to see Babel go this way.

Having multiple tools in a build process that each operate on source code, (re)parse it themselves into an AST, support differing features (ES6, async/await, etc), and have their own sourcemap-handling strategy is wasteful and especially frustrating when half of them have subtle bugs in doing these redundant steps. Babel doesn't have plugin support just because of ego.

> There is no Gulp/Grunt/Broccoli/Webpack/Browserify in sight, and that makes me happy.

Webpack and Browserify aren't Make competitors. They're tools for adding a module system to javascript, so you don't have to write your code in one file (or among several files that are hard-coded to be concatenated in a specific order and all share the same variable scope).


Apart from Make, I actually like Rake too. It's very Make-like, but it's in Ruby, which isn't a bad thing.


Unfortunately, that won't give you the level of HMR integration that webpack with its dev server instance will give you. Which is, to say the least, very nice when working on the front end, especially for GUI changes several steps into a process...

Just a couple of months ago, I was working on the final screen of an 8-step process that had several seconds of delay (not React). Needless to say, it wasn't fun at all and probably took me several times as much effort. Yeah, getting an understanding of webpack, Babel and the setup/configuration for a project can take a full week or more, but once you have it, the investment is more than made up for.


In the Python world also, Alex Clark recently blogged about exactly this: http://blog.aclark.net/2016/02/21/updated-django-website/

He shared his makefile: https://github.com/aclark4life/python-project


Do you manually invoke "make" every time you edit a file on the server? Nodejs has tools to automate this (and make it fast).


Most UNIXes already have a command line tool for this. It's called entr: http://entrproject.org
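
For example, wrapped in a loop (the -d flag, if I remember right, makes entr exit when a new file shows up, so the find runs again):

    while true; do
        find src -name '*.js' -o -name '*.scss' | entr -d make
    done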


I'm surprised no one has mentioned editors yet. We are an Emacs shop here, so we all just use an after-save-hook, embedded as a comment in the source, to automatically trigger a rebuild whenever you save the file. No need for a separate tool to watch files.


I have a make task that uses chokidar to watch changes to my JS and CSS. When one changes, chokidar then calls another make task to rebuild things.
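
Roughly like this, using chokidar-cli (the task name is mine):

    chokidar 'src/**/*.js' 'src/**/*.css' -c 'make build'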


I also built a Makefile-centric web development build workflow.

It was OK until a developer on my team complained because he only used Windows.


I don't want to be mean but he wasn't a good cultural fit if he complained about that and should've been let go. Successful companies and products are built by like-minded people. You can't have someone on your team who can't build software because he uses Windows and complains about it. He could've used mingw, or used cygwin, or setup a Linux VM. Make is god.


I can't tell if this is sarcasm or not, but I appreciate it either way.


It was not sarcasm.


> I don't want to be mean but he wasn't a good cultural fit if he complained about that and should've been let go.

That seems a very rash assumption with so little information.

Maybe the developer in question needs other software to do their job and some of that software only runs on a certain OS.

Maybe the advantages of having the right person on the team and able to use their preferred dev tools are a big overall win for the project.

Maybe running into portability issues with such a fundamental tool is a warning sign that the build process isn't as robust as it could be.

Maybe the whole thing can be fixed with a ten minute discussion and two-line edit to the makefile, and then you have the option of bringing other developers who prefer the same platform onto the team later as well.

> Successful companies and products are built by like-minded people.

If you'd said something like "compatible", I might have agreed with you, but I see little evidence that complete enforced uniformity is necessarily a win in a field like software development. This sounds a lot like the kinds of managers who want to reduce everyone on a development team to interchangeable commodities, ignoring the inevitable reality that everyone's background and ability to contribute will be different. It also sounds a lot like the kinds of development teams that can spend weeks going down a dead-end path because no-one had enough varied experience that they anticipated the problem and thought to question the direction the project was going.


> Maybe the developer in question needs other software to do their job and some of that software only runs on a certain OS.

This can't be overstated. How about, say, accessibility software? There's a reason that visually impaired developers very often use Windows: the accessibility tools are really, really good.

But, naturally, it's just not understanding the beauty of Unix. Ick.


Yes, if we're talking about web development here, there are loads of practical examples. Testing and debugging with browsers only available on certain platforms obviously requires that platform, and sometimes an emulation or VM version isn't close enough. Graphics software is frequently platform-specific, so if you have a front-end developer who also deals with logos, icons, animations, or any number of other graphical assets, they probably need a compatible system. Plenty of conference calling and desktop sharing tools are platform-specific, and you need to be using the same as whatever your clients or external collaborators use. The list goes on, but certainly for the three big areas I just mentioned Linux platforms are often far behind both Windows and OS X in what is available.


FWIW, of the dozen or so blind programmers that I know, the majority use Windows, but a few use OS X, and one uses GNU/Linux on the desktop (Fedora, running GNOME and the Orca screen reader). So all three OS's are usable for a blind programmer. A bigger problem would be insisting that everyone use the same IDE, if that IDE isn't accessible (like, say, the JetBrains family AFAIK).


I didn't say the others didn't have usable tools. But the investment in Windows's is significant and it shows.


> wasn't a good cultural fit

And I'd fire you if you used barfy corporate speak like this.

> should've been let go

How positively Stalinist. Just so we're clear: a guy complained and you'd fire him for that, rather than maybe reasoning with him first? Don't complain when the company is razed to the ground because nobody talks to anybody lest they be taken by the NKVD.


Of course I'd fire him. These folks are building software on a Linux stack and the guy complains he can't use make on Windows. It's not only that he's not a cultural fit; he's also not a technical fit. He'll never understand the beauty of Unix tools. Fire early, fire often, and build a team of like-minded people around you. Nothing wrong with using corporate language either.


I agree with you most of the time, but this is just too much.

Yes, you should have a chat with the guy but maybe it's as simple as installing MSYS2 on a Windows box or making sure he can run a VM for some tasks.


They aren't building Linux stack software. Re-read what you replied to:

> I also built a Makefile-centric web development build workflow.


> He'll never understand the beauty of Unix tools

The beauty of unix tools was that the model fit well with the memory-constrained machines of the time. After that came a mountain of hacks that just happen to still work and are used because nobody has bothered to come up with a statically typed equivalent. It's far from beautiful; it just sorta works when it's not aimed at your foot.

> Fire early, fire often and build a team of like minded people around you.

It poisons the whole thing. Respect people and people will respect you, and maybe they won't screw you over because you haven't given them a good reason to. Egomaniacally firing people for their opinions will create infighting; it will cause people to start scheming and collecting dirt on coworkers to use as a deflection for when the boss has one of his outbursts again. Supposedly most botched products at Microsoft are the result of infighting between teams caused by stack ranking, which is effectively the same thing as what you are proposing.

> Nothing wrong with using corporate language either.

There's nothing wrong with it when it isn't doublespeak. "Culture" is a nicer word for circlejerk. Finally, unix circlejerking is fine as long as you aren't a jerk about it.


Why not, if being too old or of the wrong race is a poor cultural fit.

Using Windows for development of apps ultimately to be deployed on Linux isn't any more sensible than using Linux when you are ultimately building Windows apps. Visual Studio is there for a reason. So is make.


Make runs under Cygwin, I believe.


Even when you have make in windows, there are often unix-isms that tend to go into other scripts as part of a build process.

My own process on Windows is usually to have a Linux VM (autostarted via Hyper-V) that I SSH a few terminals into (ConEmu ftw), and work in Linux... I share via SMB/CIFS from Linux so I can use a GUI editor... everything else is in the console, or the browser.

Once you get used to it, it works out much better... I've also got most of the *nix tools in Windows, so some stuff I can just do there. My personal and work-issued laptops are rMBPs, and I also have a few Linux systems/servers at home...

Frankly I don't care too much what OS I'm on... that said, it doesn't take *that* much effort to ensure what you are doing can run across the big 3 platforms.


Use CMake, PSake, ninja or hell, even MSBuild on Windows, especially if you're going to be sharing stuff with other Windows developers. It's gotten easier with MSYS2, but it's still a real pain to get most stuff working under Windows.


Docker.


Docker requires a Linux host to run.


https://docs.docker.com/engine/installation/windows/

The OS X equivalent is a pretty reasonable experience, and Windows is supposedly a similar experience now. The only significant issue is that mounting a volume from the host gets tricky. But for many development workflows, that won't matter.


If the developer needs Linux, why not just use a Linux VM?

Keep in mind this hacky "docker-machine" solution is also just a linux VM, but even more complicated.


Is there anything you could share from this? My next project will probably use CMake or ninja as a build system, mainly because it needs to be able to work with Visual Studio.

Premake also looks interesting.


I just use an npm script which calls Browserify. A very simple script which I've used for many projects.
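
Roughly, the "build" entry in package.json just runs something like this (paths and the babelify transform are only an example):

    ./node_modules/.bin/browserify src/index.js -t babelify -o dist/bundle.js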



