Upgrading from React 0.11.2 to 0.14.7 in 347 easy steps (bernhardt.io)
141 points by logician_insa on Feb 27, 2016 | 129 comments



I am not a JavaScript developer, but I was asked to help the JS team just the other week with an npm build issue. The npm modules built fine on the OS X dev boxes, but the CI server running Linux was failing because an older version of gcc was installed. While troubleshooting this, I came to learn a bit about the node.js and npm ecosystem, and I was surprised rather more than I expected.

Apparently, it is considered normal now to have the build process download dozens of these npm modules. Many of them are build-time dependencies. Things that are good old native binaries (e.g. for changing compression levels of graphics files) are downloaded by npm and built from source. Then when a module runs, it tries to execute them by pulling in one of half a dozen possible modules for executing native binaries. All code is written in crazy continuation style, because the framework is inherently asynchronous -- never mind that it only matters in a handful of hot IO cases requiring optimization -- all code is like that, undecipherable. There is no logging or error handling. A three-month-old project set up by smart consultants who do this kind of thing 24x7 already has a significant number of deprecated dependencies.


It's maddening.

>The real painful part of the process is caused by everything around React, including the build tools.

Can't agree with that more. I really like the way react works. It just makes sense and doesn't get in your way very often.

BUT it's incredibly frustrating dealing with npm dependency hell. Especially if you really are concerned about security and don't just trust that "someone else has probably made sure this code is safe, right?"

I just want to use something for longer than 6 months :(


You can perfectly well use React without npm. Just download the prebuilt development and production builds of React, and a simple <script src="..."> will do.


Ok, so what about JSX? Yes, you can write React templates without JSX, but not happily.
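For comparison, here's roughly what you're signing up for without it (a minimal sketch; `Greeting` is a hypothetical component):

    // Without JSX: building the element tree by hand.
    var app = React.createElement('div', { className: 'app' },
      React.createElement('h1', null, 'Hello'),
      React.createElement(Greeting, { name: 'world' })
    );

    // With JSX, which compiles down to exactly the calls above:
    // var app = (
    //   <div className="app">
    //     <h1>Hello</h1>
    //     <Greeting name="world" />
    //   </div>
    // );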


Right. React itself appears somewhat reasonable, except I am not entirely sure that putting the entire app state into a huge clusterfuck of JSON is going to scale, but I am trying to reserve my judgement until I try it myself. And the virtual DOM... sigh. It's a cool hack, but it feels like surrender to me.


FWIW, as an experienced developer whose instincts are probably quite similar to yours, I do recommend keeping an open mind with tools like React. They're different to a lot of more traditional tools, but like optimisations and profiling, they're best judged on their real world results rather than against dogma.

I'm not sure where you're coming from with the "huge clusterfuck of json" comment, but where the source data comes from is quite cleanly separated from how it's rendered using React, with one major caveat I'll come back to in a minute.

React itself is essentially a tool for rendering templates but with a declarative programming style, plus one fundamental performance optimisation that makes that declarative style fast enough to use in practice, plus a small number of hooks to integrate quite flexibly with other parts of your code and to allow for further performance optimisation if you need it.

Of course, there is no magic and there are always trade-offs. Most visibly, like any declarative system, sometimes the costs are significant and you do need to give it a hand to get acceptable performance. This is where the caveat I mentioned comes in, because you'll find a lot of projects using React also wind up using immutable data structures of one kind or another to store their model state. That lets you do fast comparisons based on identity rather than deep equality to see whether relevant parts of your underlying model data have changed, and thus lets you bypass the whole re-rendering process in areas of your UI where nothing interesting has happened. There are a handful of knock-on effects in practice that you have to deal with sometimes as well.
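To make that concrete, a minimal sketch of such an identity-based check (assuming `props.model` is an Immutable.js structure; the component and field names are illustrative):

    class UserPanel extends React.Component {
      shouldComponentUpdate(nextProps) {
        // With immutable data, a reference comparison is enough:
        // if the reference is unchanged, nothing beneath it changed.
        return nextProps.model !== this.props.model;
      }
      render() {
        return <div>{this.props.model.get('name')}</div>;
      }
    }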

The upside, of course, is that you get much the same advantages as any other declarative programming style: your rendering logic tends to be simpler, more concise, more amenable to analysis, and less subject to surprising interactions and edge cases than more traditional, imperative rendering code tends to be, and all of these become disproportionately greater advantages as the complexity of the rendering and/or the underlying data model grow.


Yeah, our consultants set us up using Immutable.js. Of course it then turned out that some components don't work with it, causing more craziness. By "huge clusterfuck of JSON" I meant that the entire app state appears to be combined into a single huge data structure, which has its positive sides (like the ability to save and load state easily), but also sounds scary on an intuitive level.


Again FWIW, my experience is that Immutable.js is a decent library, though the documentation is not very helpful.

Storing application state within a single immutable tree might be counter-intuitive but doesn't really make much difference to anything important. That data was going to be stored somewhere anyway, access is reasonably efficient in JS, and as you say, sometimes it has useful practical advantages.

One thing I would say is that nothing actually requires you to store all of your data in a single immutable tree like that. I've found some patterns tend to come up often when designing UIs with this sort of architecture, and in those patterns a small number of immutable data structures are used instead of one monolithic one.

For example, sometimes it's advantageous to have one structure that contains strictly the underlying persistent state that you'll ultimately store in some database on the back end. If you also need some supporting data derived from that first structure for presentation purposes, you can still create a second, also immutable structure, that is basically a clone of the first but then with the supplementary data added. Doing this sort of manipulation is usually reasonably efficient with Immutable.js because you basically have copy-on-write semantics for everything, and the logic for deriving the supplementary data tends to be easily tested.
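A minimal sketch of that pattern with Immutable.js (all names illustrative):

    var Immutable = require('immutable');

    // The strictly-persistent state, as it would be stored on the back end.
    var persistent = Immutable.Map({
      items: Immutable.List(['a', 'b', 'c'])
    });

    // A second, derived structure for presentation: a "clone" of the first
    // with supplementary data added. Copy-on-write makes this cheap, and
    // the derivation logic is easy to unit test in isolation.
    var forDisplay = persistent.set('itemCount', persistent.get('items').size);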


I really really think that documentation problems should be treated as high severity bugs. Are there any projects you know of that do that?


That would be Redux. It is one way of building React apps, but not the only one.


Well, the truth is you can build React apps any way you like. React doesn't dictate anything beyond the rendering. Redux, with Redux-Immutable & Immutable.js, is my current "stack" (hate that term) - but when I first got started I built a component library with barely any logic behind it. That's when I realised just how great React is.

It's made building my web apps' front ends fun again, and bringing in big changes is so much quicker than old-school DOM manipulation.

Took me a long time to become a believer; intuitively I railed against frameworks, but React is amazing.


This is "normal" mostly in projects where the development team has little general development experience, and in particular little exposure to how software is developed away from the JS ecosystem.

In fairness, plenty of people have long warned that recent trends in the JS ecosystem were unhealthy. It's always been possible to run useful modern tools like Browserify and SASS directly, without all the build systems and task runners and other such tools, and plenty of development teams do, and those teams don't run into NPM hell to the same extent.


Yep, just came out of a 3-month project. I was lucky that there weren't so many dependencies, but I still couldn't run the stylesheet build stuff because Compass couldn't build a C extension or something. Count 'em, that's C, Ruby and Node, just to compile Sass to CSS.


At least you can keep it to just C and Node if libsass (via node-sass) is good enough for you nowadays. libsass still doesn't support quite everything though :(


> Apparently, it is considered normal now to have the build process download dozens of these npm modules.

Tons of NPM modules are small single functions with tests. NPM requires very little boilerplate to publish modules, so you get lots of small and focused (and hopefully composable) modules. Not every module is a megabyte+ one-stop kitchen-sink-containing toolkit that's mostly redundant with your other dependencies.

Though the linked article is specifically complaining about unmaintained tools in the sbt ecosystem. It doesn't appear he's using a build system using NPM modules.


Node has had problems with its async programming model and error handling since the very beginning, but the ecosystem has evolved and has a wide array of solutions to both of these problems. It isn't the ecosystem's fault that best practices aren't followed. If the code is open source, then it's easier to fix, because node modules, more often than not, are "unix-ey" in that they do one thing and, hopefully, do it well.

I think you'd find the real problem is with the tendency by the maintainers to abandon a lot of these modules...


To use the article's terminology, I "flipped the table" about a year ago on build tools, when first learning React and ES6. Every tool in the JS world seems to want to be the center of its own ecosystem where all the other tools are just plugins to it. I was sad to see Babel go this way.

Like the author, I have something besides Javascript on the backend. In my case it's Python. Rather than use a Python-specific build system, however, I've had success using a Makefile as the top level build tool, which can directly call out to npm, pip, babel, uglify, sass, etc. There is no Gulp/Grunt/Broccoli/Webpack/Browserify in sight, and that makes me happy. I'm pretty sure Make will be around long after people have forgotten about Grunt.
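For the curious, a stripped-down sketch of what such a Makefile can look like (paths and flags are illustrative, not my actual setup; Make recipes need real tabs):

    # Rebuild only what changed; Make's dependency tracking does the rest.
    all: dist/app.min.js dist/app.css

    node_modules: package.json
        npm install
        touch $@

    dist/app.js: src/app.js node_modules
        ./node_modules/.bin/babel src/app.js -o dist/app.js

    dist/app.min.js: dist/app.js
        ./node_modules/.bin/uglifyjs dist/app.js -o dist/app.min.js

    dist/app.css: src/app.scss
        sass src/app.scss dist/app.css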


It's funny to watch the node.js community, because it's like watching the years 1998-2006 in fast forward, except that when some framework reaches the maturity tipping point, instead of consolidating, it is ejected from the community.

I was trying to pull some dependencies a while ago and they were failing to build because at some point Grunt decided to change the build file name.

I'd usually be fine with that, except the only way these tools work is to install them globally, so they can't coexist even with themselves.

Some of the tooling is so broken and awkward that people have got used to committing their /dist/ to GitHub, so at least I can skip Bower, Grunt and all that and just grab the minified file for many of the libraries I need.


> I'd usually be fine with that, except the only way these tools work is to install them globally, so they can't coexist even with themselves

This is not true; the Node community just doesn't advertise it for some reason. You can npm install a specific version of grunt-cli locally and run it with ./node_modules/.bin/grunt
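i.e. something like this (version pinned for illustration):

    npm install grunt-cli@0.1.13   # lands in ./node_modules, not globally
    ./node_modules/.bin/grunt build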


For some reason, a few years ago somebody got the bright idea that installing globally was a good idea. It became a "best practice", even though you can really run into problems with dependencies of dependencies. It really is important never to install globally if you want to keep control of your dependencies. It's also very important to always delete node_modules every time you update your package.json file -- especially when deploying. This is the only way you will be able to control what versions of things you have and to track down problems. I like working with Node, but for things like this it still suffers from some immaturity. "It works for me" syndrome is very common.


+10 for this advice. Another tip is to always lock your dependencies, either using exact static version numbers or npm shrinkwrap.
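For anyone unfamiliar, the workflow looks roughly like this (sketch):

    # In package.json, prefer exact versions ("1.2.3") over ranges ("^1.2.3"),
    # then freeze the entire installed tree, transitive deps included:
    npm shrinkwrap
    # Commit the generated npm-shrinkwrap.json; subsequent `npm install`
    # runs will reproduce exactly these versions.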


You did the right thing. Software should be built using fundamental tools, not frameworks and libraries that are only here for the next two years. Make is never going away, it's so fundamental that we'll be using it in 20 years.


Make is never going away, it's so fundamental that we'll be using it in 20 years.

I really hope not.

There is plenty to criticise about recent build tools for web projects. Many are absurdly over-engineered. Most have very short working lives. Almost all seem to have a crazy number of dependencies of their own drawn from a fragile ecosystem.

But there is also plenty to criticise in traditional Make. The archaic and cumbersome syntax means makefiles are hard to read, analyse and maintain. The completely static nature of a makefile also means it is ill-suited to rapidly evolving codebases where new files come and go almost by the minute as we refactor. We can do much better than Make.


I've got to disagree. After dealing with npm and grunt in an enterprise environment, MSBuild, Make and CMake start to look very very attractive.


Actually I agree with you there. I just think that's a little like saying after watching your kid's new favourite cartoon for the sixth straight time on Christmas day, the evening family movie you first saw in 1987 starts to look very attractive.


Well yeah, but continuing to use npm might be Stockholm Syndrome at this point!


I'm not sure what your specific gripe is with npm. It's one of the fastest and most capable package managers out there, is easily better than Python's Pip (which doesn't resolve dependencies) [0] and Go's go-get silliness for example.

If your problem is just 'lots of dependencies' that may break from under you, npm has a solution for that: 'npm shrinkwrap'.

[0] https://github.com/pypa/pip/issues/988


> The completely static nature of a makefile also means it is ill-suited to rapidly evolving codebases where new files come and go almost by the minute as we refactor.

Make is so static that there was a Lisp interpreter written in make. Oh wait...

There are several mechanisms in make to have your rules dynamically adapt to your codebase, e.g. $(shell ...), $(wildcard ...), and pattern rules (and that's not all of them).
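A small sketch of what that looks like in practice (paths illustrative):

    # Discover sources at run time rather than listing them statically.
    SRCS := $(wildcard src/*.js)
    OUTS := $(patsubst src/%.js,build/%.js,$(SRCS))

    all: $(OUTS)

    # One pattern rule covers every current and future source file.
    build/%.js: src/%.js
        ./node_modules/.bin/babel $< -o $@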

You just need to put in a little effort to actually read the documentation, not just stop after finishing one of the plethora of tutorials that end after showing variable usage.

> We can do much better than Make.

You mean, "we can do much better than my `make' knowledge". Of course we can.


Please be careful about assuming someone with a different opinion to yours is speaking from ignorance. I've been using Make for at least 20 years on a variety of projects and platforms. I'm well aware of what it can do.

Its syntax still looks like the love child of Perl and Haskell, so it is still unnecessarily difficult to write correct makefiles and to read, understand and modify existing makefiles.


But that's a totally different charge from "make is static by nature, so you need to adjust it every time you add something".

`make' may have awful syntax, though I find it much more pleasant for describing a build process than anything else on the market, most of which uses general-purpose languages and ends up terrible to read.


OK, let's look at a practical example, something that frustrates me about many current web-related tools.

Tools like Sass and Watchify can monitor for file changes and automatically regenerate SCSS or Browserify outputs when known source files are modified. They also employ caching techniques to make that regeneration efficient, instead of rebuilding everything each time. This functionality is very helpful, and it is available with a simple shell command that works on any major platform.

However, at the moment such tools typically don't notice when new source files are added and automatically start tracking them as well; this generally remains true even if the new file is imported by an existing known file. Similarly, such tools may break if a file that was being watched gets deleted.

In practice, this means you get automatic and efficient rebuilds as long as you are only changing known files, but you probably have to restart a watch process manually if you add or delete files. Fixing this would be high on my wish list for a modern web build tool, given what we already have.

How would you set Make up to solve this problem?


First of all, I would avoid wildcards in compilation rules, using $^ instead, so any leftovers wouldn't hurt. In the case of the compilation tool being too stupid, I would build a list of expected files out of the known/present sources ($(wildcard ...), $(foreach ...), $(patsubst ...)) and then call `rm' before calling the dumb tool:

  rm -f $(filter-out $(EXPECTED),$(FOUND))


I appreciate the reply, but unless I've completely missed your point somehow, you didn't actually answer my question there. What I'm looking for is something akin to

    watchify src/index.js -o dist/index.js -t ...
or

    sass --watch src/index.scss:dist/index.css
that handles new or removed files within the source tree gracefully, not just modified files that are already known when you run the command.

Whether it's monitoring for changes like these examples, or hot reloading and browser sync'ing, or IDEs that rebuild incrementally in the background so they can give immediate feedback, modern tools that actively monitor for relevant changes and respond to them automatically are noticeably more efficient to work with than traditional tools that run once on demand. Make belongs to a time that has passed, and we could be more productive by retaining the concepts that are as relevant as ever but incorporating them into tools with the more dynamic behaviour that we enjoy in other tools today.


Make is actually really simple; the problem is that people approach it like an imperative programming language.


You and I have very different definitions of "really simple", I think.

The manual for GNU Make contains around 80,000 words. That is roughly the length of an average novel.


GNU make isn't make, but it is still simple: it's composed of targets and dependencies. You don't understand every detail of how, say, a car works, yet you still drive it.


>Every tool in the JS world seems to want to be the center of its own ecosystem where all the other tools are just plugins to it. I was sad to see Babel go this way.

Having multiple tools in a build process that each operate on source code, (re)parse it themselves into an AST, support differing features (ES6, async/await, etc), and have their own sourcemap-handling strategy is wasteful and especially frustrating when half of them have subtle bugs in doing these redundant steps. Babel doesn't have plugin support just because of ego.

> There is no Gulp/Grunt/Broccoli/Webpack/Browserify in sight, and that makes me happy.

Webpack and Browserify aren't Make competitors. They're tools for adding a module system to javascript, so you don't have to write your code in one file (or among several files that are hard-coded to be concatenated in a specific order and all share the same variable scope).


Apart from Make, I actually like Rake too. It's very Make-like and works similarly, but it's in Ruby, which isn't a bad thing.


Unfortunately, that won't give you the level of HMR integration that webpack with its dev server instance will give you. Which is, to say the least, very nice when working on the front end, especially for GUI changes several steps into a process...

Just a couple of months ago, I was working on the final screen of an 8-step process that had several seconds of delay (not React). Needless to say, it wasn't fun at all and probably took me several times as much effort. Yeah, getting an understanding of webpack, babel and the setup/configuration for a project can take a full week or more, but once you have it, the investment is more than made up for.


In the Python world also, Alex Clark recently blogged about exactly this: http://blog.aclark.net/2016/02/21/updated-django-website/

He shared his makefile: https://github.com/aclark4life/python-project


Do you manually invoke "make" every time you edit a file on the server? Node.js has tools to automate this (and make it fast).


Most UNIXes already have a command line tool for this. It's called entr: http://entrproject.org


I'm surprised no one has mentioned editors yet. We are an emacs shop here, so we all just use an after-save-hook, embedded as a comment in the source, to automatically trigger a rebuild whenever you save the file. No need for a separate tool to watch files.


I have a make task that uses chokidar to watch changes to my JS and CSS. When one changes, chokidar then calls another make task to rebuild things.
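If it helps anyone, with chokidar-cli the glue is a one-liner (globs and target name illustrative):

    ./node_modules/.bin/chokidar 'src/**/*.js' 'src/**/*.css' -c 'make build'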


I also built a Makefile-centric web development build workflow.

It was OK until a developer on my team complained because he only used Windows.


I don't want to be mean but he wasn't a good cultural fit if he complained about that and should've been let go. Successful companies and products are built by like-minded people. You can't have someone on your team who can't build software because he uses Windows and complains about it. He could've used MinGW or Cygwin, or set up a Linux VM. Make is god.


I can't tell if this is sarcasm or not, but I appreciate it either way.


It was not sarcasm.


I don't want to be mean but he wasn't a good cultural fit if he complained about that and should've been let go.

That seems a very rash assumption with so little information.

Maybe the developer in question needs other software to do their job and some of that software only runs on a certain OS.

Maybe the advantages of having the right person on the team and able to use their preferred dev tools are a big overall win for the project.

Maybe running into portability issues with such a fundamental tool is a warning sign that the build process isn't as robust as it could be.

Maybe the whole thing can be fixed with a ten minute discussion and two-line edit to the makefile, and then you have the option of bringing other developers who prefer the same platform onto the team later as well.

Successful companies and products are built by like-minded people.

If you'd said something like "compatible", I might have agreed with you, but I see little evidence that complete enforced uniformity is necessarily a win in a field like software development. This sounds a lot like the kinds of managers who want to reduce everyone on a development team to interchangeable commodities, ignoring the inevitable reality that everyone's background and ability to contribute will be different. It also sounds a lot like the kinds of development teams that can spend weeks going down a dead-end path because no-one had enough varied experience that they anticipated the problem and thought to question the direction the project was going.


> Maybe the developer in question needs other software to do their job and some of that software only runs on a certain OS.

This can't be overstated. How about, say, accessibility software? There's a reason that visually impaired developers very often use Windows: the accessibility tools are really, really good.

But, naturally, it's just not understanding the beauty of Unix. Ick.


Yes, if we're talking about web development here, there are loads of practical examples. Testing and debugging with browsers only available on certain platforms obviously requires that platform, and sometimes an emulation or VM version isn't close enough. Graphics software is frequently platform-specific, so if you have a front-end developer who also deals with logos, icons, animations, or any number of other graphical assets, they probably need a compatible system. Plenty of conference calling and desktop sharing tools are platform-specific, and you need to be using the same as whatever your clients or external collaborators use. The list goes on, but certainly for the three big areas I just mentioned Linux platforms are often far behind both Windows and OS X in what is available.


FWIW, of the dozen or so blind programmers that I know, the majority use Windows, but a few use OS X, and one uses GNU/Linux on the desktop (Fedora, running GNOME and the Orca screen reader). So all three OS's are usable for a blind programmer. A bigger problem would be insisting that everyone use the same IDE, if that IDE isn't accessible (like, say, the JetBrains family AFAIK).


I didn't say the others don't have usable tools. But the investment in Windows's accessibility tooling is significant, and it shows.


> wasn't a good cultural fit

And I'd fire you if you used barfy corporate speak like this.

> should've been let go

How positively Stalinist. Just so we are clear: a guy complained and you'd fire him for that, rather than maybe reasoning with him first? Don't complain when the company is razed to the ground because nobody talks to anybody lest they be taken by the NKVD.


Of course I'd fire him. These folks are building Linux-stack software and the guy complains he can't use make on Windows. It's not only that he's not a cultural fit; he's also not a technical fit. He'll never understand the beauty of Unix tools. Fire early, fire often and build a team of like-minded people around you. Nothing wrong with using corporate language either.


I agree with you most of the time, but this is just too much.

Yes, you should have a chat with the guy but maybe it's as simple as installing MSYS2 on a Windows box or making sure he can run a VM for some tasks.


They aren't building Linux stack software. Re-read what you replied to:

> I also built a Makefile-centric web development build workflow.


> He'll never understand the beauty of Unix tools

The beauty of Unix tools was that the model fit well with the memory-constrained machines of the time. After that came a mountain of hacks that just happen to still work and are used because nobody bothered to come up with a statically typed equivalent. It's far from beautiful; it just sorta works when it's not aimed at your foot.

> Fire early, fire often and build a team of like-minded people around you.

It poisons the whole thing. Respect people and people will respect you, and maybe they won't screw you over, because you haven't given them a good reason to. Egomaniacally firing people for their opinions will create infighting; it will cause people to start scheming and collecting dirt on coworkers to use as a deflection for when the boss has one of his outbursts again. Supposedly most botched products at Microsoft are the result of infighting between teams caused by stack ranking, which is effectively the same thing as what you are proposing.

> Nothing wrong with using corporate language either.

It isn't when it's not doublespeak. "Culture" is a nicer word for circlejerk. Finally, Unix circlejerking is fine as long as you aren't a jerk about it.


Why not, if being too old or of the wrong race is a poor cultural fit.

Using Windows for development of apps ultimately to be deployed on Linux isn't any more sensible than using Linux when you are ultimately building Windows apps. Visual Studio is there for a reason. So is make.


Make runs under Cygwin, I believe.


Even when you have make in windows, there are often unix-isms that tend to go into other scripts as part of a build process.

My own process on Windows is usually to have a Linux VM (autostarted via Hyper-V) that I SSH a few terminals into (ConEmu ftw), and work in Linux... I share via smb/cifs from Linux so I can use a GUI editor... everything else is in the console, or the browser.

Once you get used to it, it works out much better... I've also got most of the *nix tools in Windows, so some stuff I can just do there. My laptop and work-issued laptops are rMBPs, and I also have a few Linux systems/servers at home...

Frankly, I don't care too much what OS I'm on... that said, it doesn't take *that* much effort to ensure what you are doing can run across the big 3 platforms.


Use CMake, PSake, ninja or hell, even MSBuild on Windows, especially if you're going to be sharing stuff with other Windows developers. It's gotten easier with MSYS2, but it's still a real pain to get most stuff working under Windows.


Docker.


Docker requires a Linux host to run.


https://docs.docker.com/engine/installation/windows/

The OS X equivalent is a pretty reasonable experience, and Windows is supposedly a similar experience now. The only significant issue is that mounting a volume from the host gets tricky. But for many development workflows, that won't matter.


If the developer needs Linux, why not just use a Linux VM?

Keep in mind this hacky "docker-machine" solution is also just a Linux VM, but even more complicated.


Is there anything you could share from this? My next project will probably use CMake or ninja as a build system, mainly because it needs to be able to work with Visual Studio.

Premake also looks interesting.


I just use npm scripts, which call Browserify. A very simple setup which I've used for many projects.
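i.e. a package.json along these lines (entry points illustrative):

    {
      "scripts": {
        "build": "browserify src/index.js -o dist/bundle.js",
        "watch": "watchify src/index.js -o dist/bundle.js -v"
      }
    }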


> One thing seems increasingly clear to me: this way of building software is not sustainable. What can we do?

Waiting 530 days before upgrading and then only spending a day to upgrade what is effectively three major versions actually seems very reasonable to me.


Yeah, he says that a single day lost isn't really a big deal, although he knows it's going to happen again.

The point of the article is that (to take his example) in Java world, as you get used to the ecosystem, things get easier and easier. It becomes comfortable.

Au contraire, in the web/JavaScript world things keep changing, but they're still just as hard.


Except React has been getting simpler with each release. First there was React.createClass. Then came ES6 classes. Now components can just be pure functions.
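Roughly, the progression looks like this (sketch):

    // 1. Original style:
    var Hello1 = React.createClass({
      render: function() {
        return <h1>Hello {this.props.name}</h1>;
      }
    });

    // 2. ES6 class (React 0.13+):
    class Hello2 extends React.Component {
      render() {
        return <h1>Hello {this.props.name}</h1>;
      }
    }

    // 3. Stateless function component (React 0.14+):
    var Hello3 = (props) => <h1>Hello {props.name}</h1>;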

Also, the author is using a Scala toolchain to build the JS frontend code. The standard toolchains (with the exception of Google Closure and Facebook's Flow) are all built around node.js.

There are even a number of code migrations available for React: https://github.com/reactjs/react-codemod -- but of course even those can't help you if you use non-standard tooling and let a web frontend codebase collect dust as browsers (and your dependencies) march forward.

Projects are never finished. If you cease working on a project, it just becomes unmaintained code.

EDIT: As someone seems to have flagged my account, and I'm now apparently limited to two replies per hour or something ridiculous like that (yay for passive-aggressive moderation tools?), my reply comes as an edit:

> That's a pretty shallow definition of easier.

It's a library that lets you describe DOM subtrees based on application state. How much simpler could it possibly be?

If you mean the toolchain, the keynote at React.js Conf made it clear that improving that part is a major goal for 2016 but none of that will help you if you want to solve everything with your existing Scala tools.


> Except React has been getting simpler with each release. First there was React.createClass. Then came ES6 classes. Now components can just be pure functions.

That's a pretty shallow definition of easier.


Agreed - it seems like the largest problem they faced was using Scala build tools - so it's funny the blame is placed on Javascript.


>If you mean the toolchain, the keynote at React.js Conf made it clear that improving that part is a major goal for 2016

Then people will just complain there's yet another build tool, because progress or any change is bad!


If your only method of improving something is to rewrite it or add another layer, then something has gone horribly wrong and people have a right to complain about it. I sincerely hope the improvements are not a rewrite or an additional layer on an already dangerously complex system.


React has actually done a great job of providing clear upgrade paths, assuming you don't try to jump several major versions at a time.

They generally provide deprecation warnings at least one major version before actually breaking/removing an existing API. They have also started providing codemod scripts to help migrate a large codebase (though I've found find/replace sufficient for a lot of them).

My bigger problem has been relying on 3rd party components that hold me back due to some incompatibility they don't address for months.


The biggest pain I've experienced is updating react-router. I've run into cryptic failures each time, and googling usually turns up answers for the previous (or even next/beta) version.


Agreed. It got so bad that at work I ended up becoming part of the problem and wrote my own isomorphic router for React[0].

[0] https://github.com/studionone/react-isorouter


Babel5/6 too!


The API churn on react-router really tests my patience.


Not only that, but also the loss of features like named URLs...


Ditto


> My bigger problem has been relying on 3rd party components that hold me back due to some incompatibility they don't address for months.

Welcome to dependency hell :) It happens with all frameworks. The only solution is to use vanilla JS components.


It seems to happen more in the Node world. I can pull down what I need with NuGet for C# and not have too many problems.

Also, reducing the number of external dependencies your project has is always a good idea.


I use ClojureScript to interact with React. Luckily, with a small community we aren't overrun by build tools! I've recently taken up boot[1], and it's amazing. I'm able to quickly get up a live-reloadable environment wherein I can actually _understand_ what is going on.

    cd my_project
    boot dev
'boot dev' is a task that's defined in the project's build.boot file. It compiles the cljs files to a main.js file, and serves it from a specified directory.

Any edit to a cljs file triggers a recompile (of that namespace only - it takes about 0.1 seconds). If the recompile is unsuccessful, a buzz sound plays. If the recompile is successful, boot will play a nice subtle ding sound and reload the main namespace. With immutable data structures everywhere, arranging the app's state to allow reloading is simple.

How simple? There's a fantastic tutorial about how to set up everything I've just described at [2].

This template will spit out all the code needed for the live-reload-cljs thing [3]. Once you install boot (with 'brew install boot-clj'; it's on apt-get too), you can run it with

    boot -d seancorfield/boot-new new -t tenzing -a +reagent -n boot-cljs-project
    cd boot-cljs-project
    boot dev
    ;; then edit the cljs file.
[1] http://boot-clj.com

[2] https://github.com/magomimmo/modern-cljs/blob/master/doc/sec...

[3] https://github.com/martinklepsch/tenzing


I no longer develop application code. Instead I am a professional fighter of JavaScript build systems.

Actually not strictly true. I spend about one day per week writing application code and the rest of my time trying to work out why some npm module that I need to use won't compile.


This is exaggerated a bit but I definitely do spend more time than I should fighting npm and various dependencies.


This is yet another reason to avoid using frameworks you don't really need. Just because someone uses React, and there's buzz around this framework, doesn't mean you've to use it.

Speaking from my experience, frameworks are one of the biggest anti-patterns in software development. They're hard to learn, they limit your creativity, they go out of business, you have to upgrade them for no good reason and make your code compatible with the latest version, you have to search for help and ask others, they bloat your software and create unnecessary dependencies, and you probably need only 10% of what a framework offers. They just don't make sense.

You can easily implement your own abstractions for your own application and be done with it. Your abstractions won't need upgrades, you'll be in control, and you won't have to waste time searching for help. You should be fighting complexity, not embracing it. I don't use frameworks, and I encourage you not to use them either.


React is a library, not a framework, and the core library itself is actually very small. In fact, part of the recent changes (splitting React into React and ReactDOM) has been in order to keep the core React library as small as possible.

React itself doesn't do very much at all, but when you're using React, you're using almost everything React has to offer, and then can choose what you want to add on top of that.

Take a look at the top-level API [1] and the component API [2]. This is React. It's small and compact - there's not a lot to it.

[1] https://facebook.github.io/react/docs/top-level-api.html

[2] https://facebook.github.io/react/docs/component-api.html


> I don't use frameworks, and I encourage you not to use them either.

Instead you expect every new team member to wade through your homegrown framework-equivalent, which means no transferable skills, less real-world testing, likely inferior documentation and no online community to turn to for help if you ever leave. Yeah, nice one.


Dude, you're talking about JavaScript. Documentation is most likely to be superior for shop-made tools, and the community will be within arm's reach.

And do you really think seeing someone else's code means no transferable skills? Sure, you can't put it in CV, but on the other hand, you'd see more than merely using some random one-function module.


> you probably need only 10% of what a framework offers.

Clearly you have not used React. You can't just use 10% and get any value out of it. It's typically 80% or more.


Though the effect you describe is partly because React isn't trying to be an all-things-to-all-people framework. It's a library, with a specific role and much tighter scope, and it's a small fraction of the size and complexity of something like Angular.


While it is possible to avoid using frameworks, it is not practical to avoid using powerful and expressive patterns. React implements several patterns. At least two of them are: 1) a unidirectional flow of state management, and 2) server/client-side isomorphism (meaning that a server-side solution can render out a good number of client-side views).

There are other libraries (not frameworks) that increasingly do similar things. And some of them actually offer API- and/or source-code-level compatibility amongst themselves, further indicating that these libraries are offering patterns.

For example Inferno JS with inferno-component https://github.com/trueadm/inferno


React is not really a framework in the same sense Angular, Django, or Rails are. It does not dictate how you should architect your app.


It does not dictate how you should architect your app.

This is a little optimistic in practice, because while React is often described as being just the V in MVC, it also has significant scaling issues out of the box. Useful strategies for keeping the performance acceptable can have profound implications for the wider architecture of your application.

For example, it seems many in the React community are moving towards using immutable data for the underlying model and pure render components with React. That means you can write efficient shouldComponentUpdate checks. However, it also means you have to design your version of whatever the "M" and "C" become similar to how you'd structure a functional program, which is not at all natural or idiomatic in JS.


Slightly OT, but why are you calling unmountComponentAtNode before render? That seems like a bad idea. ReactDOM.render will update in-place, rather than tearing the whole thing down and instantiating new versions of everything.

https://facebook.github.io/react/docs/top-level-api.html#rea...
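In other words, the intended pattern is just to call render again (sketch; names illustrative):

    // Each call diffs against the already-mounted tree and patches
    // the real DOM in place; no unmount needed first.
    function rerender(state) {
      ReactDOM.render(<App state={state} />, document.getElementById('root'));
    }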


I don't think the issue here is that it requires work to upgrade across multiple major versions. Rather, it is that "two major versions back" is only about 1.5 years ago, and that major building blocks got abandoned in the meantime.

If you have projects that do not see any updates for a few years (fairly common, in my experience) and then see urgent requests come in (in this example, a Chrome update broke the app, but it might also be a feature request), that can get annoying fast. Normally, those requests shouldn't require upgrading technology to new versions, but there may be security fixes that didn't get backported, or it may no longer be feasible to find people who know how to use that 'ancient' version.

The lesson learned should be that one should not blindly use the latest, shiniest technologies, but take the longevity of the product into account. The problem for web-based stuff is that there is not really a stable product that one can count on to be around mostly unchanged in a few years' time and that is best in class, or at least close to it. Evolution is just too rapid.


Note that in the first code example you can still use JSX in place of a manual call to React.createElement (which is exactly what JSX compiles to: https://babeljs.io/repl/#?experimental=false&evaluate=false&... )


This is one thing that I find great about working with ClojureScript. My team has been using the Reagent library (https://reagent-project.github.io/) that uses React internally. Its API has stayed stable through all the versions of React, and all I had to do to get the benefits of the latest React versions was to update the dependency version in my project.


TL;DR: sbt made it difficult.


Yeah, I stopped reading the second I saw "SBT." I know there are lots of Scala programmers who use it, but none of the ones I know do. (Having worked at Twitter for 4.5 years, that's a not-insignificant value of n.) I've heard nothing but bad things about it.


SBT is fine, but using it to build a complex JS project that's using immature dependencies is just asking for trouble. Webpack alone is tricky enough...


The only React-related problems the article describes are API changes (which are generally announced via deprecation messages if you don't skip several releases when upgrading) and invalid DOM nesting (which is just React becoming less messy).

Everything else seems to be the result of forcing a square peg in a round hole, i.e. using non-standard tooling. As much as I can understand the desire to have one set of tooling to rule them all, if I were using Ruby/Python/PHP/Java for my backend I wouldn't expect to be able to integrate my backend tooling with the Gulp/Grunt/WebPack I might use to build the frontend. Expecting to be able to do the same the other way around is just as arrogant, especially if you then blame the ecosystem you're trying to avoid for the incompatibilities you're experiencing.


Another version: "Doctor, it hurts when I do this"


Yeah, if you use a custom/unofficial build tool to compile code, and then the format changes, then your custom build tool is going to require updating. Blaming the React or javascript ecosystem for that seems silly when there are officially endorsed and supported tools that are kept up to date that you specifically aren't using.


Yeah, imagine a problem with the JavaScript build tool chain. That rarely happens, right? But I was unfamiliar with SBT - it's the Scala build tool[1].

[1] http://www.scala-sbt.org/


I think it's the default/recommended build tool for Play framework projects.


I wonder why they can't just use Node to transpile the React code.


Yowza, that's a big upgrade. We did 0.12 to 0.13 to 0.14, and it wasn't too bad since React had deprecation warnings that are pretty clear.


The comments about the JS build process with these new tools echo my own experience. When you see the dependencies being pulled down and things getting included, my first reaction is: what am I shipping? It's good to raise that question in meetings too and see the developers sink back in their seats. It's a change-management no-go for me when you're pulling in GitHub projects at each build and they could change.


Why is the major version still zero?

AFAIK React is used in production, and I cannot find any alpha/beta labelling on their website either.


They are about to jump from 0.14.7 to 15.0.0 for that reason.

More here: http://facebook.github.io/react/blog/2016/02/19/new-versioni...


React is now React 15, so everybody should expect easier-to-upgrade versions moving forward.


I don't know; if they apply semver but just go from major version to major version, that's no change in policy. React 15 -> React 16 makes no BC guarantee.


The way the announcement was presented at React.js Conf this year suggests that the version shift also represents a change in stability. Personally I think that from now on React will treat major releases more like Node does -- but of course many people would still consider that pace anxiety-inducing.


True true, I am giving them the benefit of the doubt though.


Why? What did they change to make it easier to upgrade? (Obviously adding a 15 to the name doesn't change anything)


If you don't skip versions, you get deprecation warnings for most changes before the breaking change lands in a subsequent version... jumping 3 versions was the bigger issue, the second being that there were 3 versions in 1.5 years... but that is the JS ecosystem in general right now.

Beyond this, TFA is fighting with build tools that are a bit of a mismatch, which is a separate problem... That it only took a day isn't such a big deal really, and is surprisingly good... the upgrade could have been done a couple of times over the course of that 1.5 years, instead of the code being left to rot.


I read the comment as sarcasm.


I upgraded from 0.13 to 0.14, along with redux and react-redux, for a fairly large project. I had expected it to be difficult, and did run into a few issues, but the major ones were deprecation warnings. In the end it was far simpler than I assumed.


> [stacktrace] Can you spot the problem? Yeah, me neither.

I did spot the problem. But then I'm in the middle of making sbt behave civilly with TypeScript and Angular 2.

Of course I could just use webpack or some other JavaScript build tool. But I like to have integrated incremental compilation for the frontend and backend. And since our backend is Scala, I'm working on my weekends on a TypeScript sbt plugin. I guess I should get out more.


I love the trolly title :)

BTW, the real title is 374, not 347.


React is version 0?

Does that mean they don't follow semantic versioning, or that it isn't production-ready?


They were on major version 0 to give them flexibility in evolving the API, but considering it's been used in production at Facebook for some time and that the API is relatively stable, they are moving from 0.14 to 15.0 in the next release.


I moved away from React when I noticed how difficult it was to use raw elements (without the "Component" stuff). Also, I never knew when updates were taking place in the real DOM.


Because using raw elements is supposed to be the exception, not the rule... also, `componentDidMount` and `componentDidUpdate` combined with refs are pretty straightforward[1].
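Roughly like this (a sketch; `externalLib` is a hypothetical DOM-manipulating library):

    class Widget extends React.Component {
      componentDidMount() {
        // The ref gives you the real DOM node once it exists.
        this.instance = externalLib.attach(this.refs.container);
      }
      componentDidUpdate() {
        this.instance.update(this.props.data);
      }
      componentWillUnmount() {
        this.instance.destroy();
      }
      render() {
        return <div ref="container" />;
      }
    }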

I'm integrating PDF.js into a React app, and that has been anything but fun (I still need to do more for performance concerns), but this is abnormal for React apps... More often than not, if you need to use refs a lot and are doing a lot of DOM manipulation directly, odds are "you're doing it wrong"... I'm not saying there are never times, but there are usually better and less cumbersome patterns.

One of the things I really like about React+Redux is that it's closer to 95/5 than 80/20, and when you find yourself fighting the flow, usually the answer is to rethink the problem.

[1] https://facebook.github.io/react/docs/component-specs.html


How could you not know?


Because componentDidMount only works for components, not elements.





