
I think developers need to learn to build to last. We have medical software that hasn’t been updated for two decades that runs just fine.

That’s often unobtainable with modern software development because we rely so much on things that change too often, but it doesn’t have to be that way.

It’s a paradigm shift of course, but I think our business really needs to take maintainability more seriously than it does. This goes for proprietary software as much as Open Source, but with Open Source there is the added layer of governance.

I work in the public sector and we operate quite a few Open Source projects with co-governance shared between different municipalities, and the biggest cost is keeping up with updates to “includes”. It’s so expensive that our strategy has become to actively avoid tech stacks that change too often.




While I don't disagree that we should strive for maintainability, things like medical software, airplane software, or similar highly tested mission-critical pieces are specifically built to last that long. Nobody is going to pay us to build a web shop to last for 20 years; that's just not a necessity when getting it out quickly is so much more important from a business perspective than making it last forever.

> That’s often unobtainable with modern software development because we rely so much on things that change too often, but it doesn’t have to be that way

The reason we rely on things that change often is that we want to leverage them to get products out faster. There are many different layers of that (as every tech stack is essentially a product by someone), so we have lots of updates to deal with. The flip side of slow-moving projects is that bugs might not get fixed and helpful new features might not come in, meaning you have to build them yourself.

As a community we know, and have known for decades, how to build mission-critical software, but we often actively decide not to because it isn't that important compared to other factors.


So interestingly - the web shops you’re talking about do want to maintain their client data, and do expect it to be available “forever,” somewhere. The payment processors absolutely do at a minimum. Some of those layers are highly hardened.

While the particular Etsy clone or t-shirt of the day, or customized shower curtain site will certainly come and go, it’d be an entirely different problem if visa, PayPal, stripe, swipe, or whatever payment processor packed it up and went home at random.


We need the foundations/infrastructure to be built to last. People need to identify which kind of software they're making and treat the infrastructure as unchanging. Changes in the basement need to be carefully considered with a default stance of rejecting them unless justified by reasoning that has a time horizon of many years.


Tell that to the number of shops setting up factory IoT based on Node.js...


I have to be honest, I do believe Node.js will be around for a long time. The improvements over the past few years have been vast thanks to ever-improving standards, and all the major cloud companies are heavily invested. It has the world's largest public package registry as well (not that sheer quantity means you can always find a high-quality library).

I've only recently switched after years of scepticism but for the sort of stuff I do it's more than good enough. It has its warts but so does every language that's stuck around.


I don't necessarily think that the language or the core thing won't stick around. But unless you force people to really decide which packages they require, you end up with an unaudited mess of packages (basically with every package; or is anyone really creating portable, stable apps of a relevant size based on the core Node environment ...?)
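For what it's worth, some of that can be reined in with boring discipline rather than new tooling: pin exact versions, install from the lockfile, and actually look at the tree now and then. A minimal sketch of one possible npm routine (assuming a stock npm setup, nothing project-specific):

  # record exact versions instead of semver ranges when adding deps
  npm config set save-exact true

  # install strictly from package-lock.json in CI and on servers
  npm ci

  # see what you actually depend on, transitively
  npm ls --all

  # check the tree against known advisories
  npm audit

It doesn't solve the "who has audited all of this" problem, but it at least makes the size of the tree visible and keeps it from drifting on its own.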


Yeah for sure the language definitely makes it easy to build something that won't last. I misunderstood your comment so sorry about that.


I agree with you in principle. But it should be noted that any software that interacts with the world outside of itself can't be considered to be in good working order if it hasn't been audited and updated to resist security vulnerabilities.

I'd argue that medical software shouldn't be connected to networks because security is hard, and most people get it so wrong. If that's part of the design, then the goal you're talking about is attainable. But in many cases, software isn't useful for its purpose if it can't access a network, and so the idea of just leaving it alone for decades at a time is an actively bad goal.


You’re absolutely right, but we also operate Django applications that haven’t needed anything but the occasional security update over a lifespan longer than the existence of React.js.

I like React by the way, it’s just an example. But we’ve certainly had to spend a lot of dev time on JS frameworks in general.


Most “runs fine after 20 years” software is really “a security nightmare that people are afraid to touch.” Great designs and forward thinking are helpful, but “code and walk away” just isn’t the world we live in.

The new paradigm has to be “plan to evolve with the ecosystem.” There are just too many moving parts to treat software as static.


None of our old software that was built to last has security issues.

I know it’s harder to build with security in mind in the modern connected world, but as an example, we have a Django app that has needed nothing but security updates and runs perfectly fine: a web app that doesn’t need much development time post-implementation. So it’s not like it’s impossible either.
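If it helps make that concrete, the low-churn recipe is roughly to sit on one LTS series and only take its patch releases. A rough sketch (the 4.2 series here is just an illustrative LTS, not necessarily what we run):

  # requirements.txt
  Django>=4.2,<4.3        # one LTS series, patch releases only

  # routine maintenance:
  pip install --upgrade -r requirements.txt
  pip list --outdated

The churn tends to show up when you jump to a new series, not while you sit on one.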

Don’t get me wrong, we’ve been as guilty of “wow this new tech is cool” as anyone else, which is where the lessons come from.



