
>I've been in teams where pull requests that significantly cleaned up the code base were literally rejected, because people were paranoid that any change at all could break something in unforeseen ways. People just bunkered up into a "if it ain't broke, don't touch it" mentality, which meant that the tech debt problem never ever improved.

Was this in a project with or without unit tests? Usually you can win over people like this by writing a test harness around the affected area, getting that across their desks, and then pushing for change.
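
For example, here's a minimal characterization-test sketch of that "harness first, refactor second" approach, assuming pytest and a hypothetical legacy function; the point is only to pin down current behavior before touching anything:

  # test_legacy_pricing.py -- hypothetical module and function names.
  # A characterization test: it asserts whatever the code does *today*,
  # so any later cleanup that changes behavior fails loudly.
  import pytest

  from legacy_pricing import calculate_invoice_total  # assumed legacy code


  @pytest.mark.parametrize(
      "line_items, expected",
      [
          ([], 0.0),                        # current behavior for empty input
          ([("widget", 2, 9.99)], 19.98),   # quantity * unit price
          ([("widget", 1, 9.99), ("gizmo", 3, 1.00)], 12.99),
      ],
  )
  def test_invoice_total_matches_current_behavior(line_items, expected):
      # These expected values are captured from the existing implementation,
      # not from a spec; they are the safety net for the refactoring.
      assert calculate_invoice_total(line_items) == pytest.approx(expected)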

If even that is rejected you take it upstairs and issue an ultimatum, or straight up quit and pat yourself on the back for a job well done, regardless of which one you picked.




It's usually not simple to add unit tests to a project that wasn't built with them.

Sometimes the unit tests are the technical debt...


There are other ways to test legacy code than unit testing. If no testing is present at all, unit testing is likely not the correct approach, and functional/integration testing (using PhantomJS, for example) might be more suitable.


None of the options being discussed are "simple". There isn't a simple way out of technical debt. If there were, it wouldn't be debt.

Unit tests rarely contribute much to the technical debt, because in the worst case I've generally seen, you can just throw them away. I've seen that: unit tests so bad they weren't usable for much. I've yet to see a code base destroyed by trying to make it testable, whereas I've seen a ton of codebases that were made quite powerful and flexible, yet also fairly pleasant to use, because they were built to be testable.

Also, in the end I'm far more interested in "automated" testing than in anybody's pedantic definitions of what "unit" or "integration" or whatever testing is. That's not the point. The point is that I ought to be able to set up an automated build server and run useful, meaningful tests on it.


Tests are great as long as there's a strict discipline (you usually need to enforce this with tooling) of never committing a change that causes any tests to fail.

Then anyone who writes a test has to make it pass, and anyone who makes a change that breaks a test has to fix it. No rotting tests that passed long ago but have been failing for months and that everyone ignores because whatever.
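
As a sketch of what "enforce it with tooling" can look like, here's a hypothetical pre-push hook (Python, assuming pytest is the test runner) that refuses the push while any test fails; most teams do the same thing in CI as a required check instead:

  #!/usr/bin/env python3
  # .git/hooks/pre-push -- hypothetical hook; make it executable to activate.
  # Runs the test suite and blocks the push if anything fails.
  import subprocess
  import sys

  result = subprocess.run(["pytest", "-q"])  # assumes pytest is the test runner
  if result.returncode != 0:
      print("Push rejected: the test suite is failing. Fix or revert first.")
      sys.exit(1)
  sys.exit(0)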


I once encountered old code where many of the functions had an argument "bool isUnitTest". Counting the number of things that have to be wrong with the workplace in order for this to occur and to be acceptable is left as an exercise for the reader.

Worst codebase I have ever seen. The incumbents had zero interest in paying down technical debt. Unit tests were just another thing that could be used to game metrics and justify budgets.

Metaphorically speaking, the group issued predatory high-interest payday loans to keep the customer paying increasing amounts of interest on a growing amount of technical debt, forever.
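
To make that isUnitTest anti-pattern concrete, here's a hypothetical before/after sketch (names invented): the flag version branches on "am I being tested?", while the usual fix is to inject the dependency so tests can substitute it.

  # Hypothetical names throughout; only the shape matters, not the domain.

  # Before: production code knows it is being tested (the anti-pattern).
  def send_invoice_flagged(order_id, email, is_unit_test=False):
      if is_unit_test:
          return "skipped"  # test-only branch baked into production code
      return real_smtp_send(email, f"Invoice for {order_id}")  # assumed helper


  # After: the collaborator is injected, so a test can substitute a fake.
  def send_invoice(order_id, email, mailer):
      return mailer.send(email, f"Invoice for {order_id}")


  class FakeMailer:
      """Test double that records messages instead of sending them."""
      def __init__(self):
          self.sent = []

      def send(self, to, body):
          self.sent.append((to, body))
          return "ok"


  def test_send_invoice_records_message():
      mailer = FakeMailer()
      assert send_invoice(42, "a@example.com", mailer) == "ok"
      assert mailer.sent == [("a@example.com", "Invoice for 42")]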


I found that in some of our code the other day; even that wasn't enough to stop the tests from randomly failing.


I have been in such a project (nuclear waste recycling). All the refactoring had been postponed until the Y2K project, and all the tests had to be performed for that project. It should have been boring (very little of the code involved dates), but because of this huge refactoring, it was interesting. We removed almost all the technical debt.


This approach will fail.

Unit tests are by their very nature tests that are tightly coupled to your code.

Tight coupling is usually the biggest component of your technical debt.

You want tests that are more loosely coupled - integration tests.
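
A small, hypothetical Python sketch of that coupling difference (the module and function names are invented): the first test is welded to the implementation, the second only to observable behavior.

  from unittest import mock

  import shopping_cart  # assumed module under test


  # Tightly coupled: pinned to *how* the cart computes its total, so any
  # internal refactor (renaming _apply_discount, inlining it, ...) breaks it.
  def test_total_calls_internal_discount_helper():
      with mock.patch.object(shopping_cart, "_apply_discount") as helper:
          helper.return_value = 90.0
          shopping_cart.total([("book", 100.0)], coupon="SAVE10")
          helper.assert_called_once_with(100.0, "SAVE10")


  # Loosely coupled: pinned only to behavior at the boundary, so the
  # internals can be rewritten freely as long as the answer holds.
  def test_total_applies_coupon():
      assert shopping_cart.total([("book", 100.0)], coupon="SAVE10") == 90.0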


Often really awful code is covered by really awful tests.


Also without mutation testing you can't have any real confidence in the quality of your tests to begin with, if they exist at all.
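
For anyone unfamiliar with the idea: a mutation testing tool makes small deliberate changes ("mutants") to the code and reruns the suite; a test that still passes hasn't really pinned the behavior down. A hand-rolled sketch of one mutant, with invented names (real tools such as mutmut or Cosmic Ray automate this for Python):

  # Original code under test.
  def is_adult(age):
      return age >= 18


  # One possible mutant: the tool flips >= to >.
  def is_adult_mutant(age):
      return age > 18


  # A weak test: it passes against both versions, so the mutant "survives"
  # and mutation testing flags the gap.
  def test_is_adult_weak():
      assert is_adult(30)
      assert not is_adult(5)


  # A stronger test: it checks the boundary, so it "kills" the mutant.
  def test_is_adult_boundary():
      assert is_adult(18)
      assert not is_adult(17)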



