I disagree that the "Clean X" books are a waste of time. They lay a nice groundwork for understanding what to aim for when writing code, particularly when you're early in your career.
When I was starting as a professional coder years ago, I had an intuitive sense of what good code was, but I had no idea how much actual thought had been put into it by other people. Reading those books was a good step in seriously starting to think about the subject and look at code differently as a craft ("it's not just me, this code smells!" or "hey that's a neat idea, better keep this in mind").
Definitely would recommend to someone starting out their career.
Edit: getting downvoted for a reasonable, justified opinion. Classy.
Don’t know about the rest of the series, but Clean Code isn’t merely a waste of time, it’s worse — it’s actually a net negative, and lies at the root of a number of problems related to incidental complexity.
Not GP but: Personally, I find that book's advice highly subjective and rooted in aesthetics rather than pragmatism or experimentation. It encourages an excessive number of very small methods and very small classes, and brushes off the problems this causes.
Not about the book, but: its influence is malignant. Even Uncle Bob mentioned in a recent interview that he will break the "10 lines per method" rule if need be. But practitioners influenced by the book lack his experience, and are often very strict. I even remember a specific Ruby linter that capped methods at 5 or 6 lines max, IIRC. Working in such a fragmented codebase is pure madness. This comment from another user reminded me of some of those codebases: https://news.ycombinator.com/item?id=42486032
EDIT: After living in the "Clean Code world" for half a decade I can categorically say that it produces code that is not only slower to run (as argued by Casey Muratori [1]) but also slower to understand, due to all the jumping around. The amount of coupling between incestuous classes and methods born out of "breaking up the code" makes it incredibly difficult to refactor.
I think people get hung up on the small classes/methods and ignore all the rest. One important lesson is that aesthetics do matter and you have to pay attention to writing maintainable code. These are important lessons for a beginning developer. If you think otherwise, you've never worked on a code base with 300-line functions and variables named temp, a and myVar.
Regarding short functions: yes, having them too short will absolutely cause problems, and you should not treat this as an absolute rule. But when writing code it's very useful to keep in mind in order to keep things simple: when you see your functions doing 3 independent things, maybe it's time to break it into 3 sub-functions.
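A minimal sketch of what I mean, with hypothetical names (any Java-like language works the same way):

    // One routine doing three independent things, split into one small
    // helper per concern so each can be read (and changed) in isolation.
    class ReportJob {
        void run(java.util.List<String> rows) {
            java.util.List<String> cleaned = normalize(rows); // 1. cleanup
            String body = render(cleaned);                    // 2. formatting
            deliver(body);                                    // 3. output
        }

        private java.util.List<String> normalize(java.util.List<String> rows) {
            return rows.stream().map(String::trim).toList();
        }

        private String render(java.util.List<String> rows) {
            return String.join("\n", rows);
        }

        private void deliver(String body) {
            System.out.println(body);
        }
    }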
Edit: I see some criticism concerning too small classes, class variables being used as de facto global variables and shitty inheritance. Fully agree that these are plain bad practices stemming from the OOP craze.
Sure, but nobody is saying that aesthetics don't matter. Quite the opposite. People have been saying this for decades, and even government agencies have code-style guidelines. Also, the idea that big procedures are problematic is as old as procedural programming itself.
The problem is that, when it comes to aesthetics, one of the two more-or-less-novel ideas of the book (and the one that is followed religiously by practitioners) is downright problematic when followed to the letter.
> when you see your functions doing 3 independent things, maybe it's time to break it into 3 sub-functions
That's true, and I agree! But separation of concerns doesn't have much to do with 10-lines-per-method. The "One Level of Abstraction per Function" section, for example, provides a vastly better heuristic for good function-size than the number of lines, but unfortunately it's a very small part of the book.
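To illustrate (hypothetical example, not from the book): this method is well under any line limit, yet it mixes a character-level loop with a high-level step, which is exactly what the abstraction heuristic catches and a line count doesn't:

    class Report {
        String build(java.util.List<String> rows) {
            StringBuilder sb = new StringBuilder();
            for (String row : rows) {              // low-level string fiddling...
                sb.append(row.strip()).append('\n');
            }
            notifyListeners();                     // ...right next to a high-level step
            return sb.toString();
        }

        private void notifyListeners() { /* elided */ }
    }

Extracting the loop into a renderRows(rows) helper puts both statements at the same level; the line count barely changes.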
> I see some criticism concerning [...] class variables being used as de facto global variables
The criticism is actually about the book recommending transforming local variables into instance/object variables... here's the quote: https://news.ycombinator.com/item?id=42489167
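For anyone who hasn't read that part, the transformation being criticized looks roughly like this (hypothetical names):

    // A value that used to be a local threaded through parameters is
    // hoisted into a field so the helper methods can take no arguments.
    class PriceFormatter {
        private String result; // de facto global within the class

        String format(double price) {
            result = "";
            appendSymbol();
            appendAmount(price);
            return result; // correct only if the calls above ran in order
        }

        private void appendSymbol()         { result += "$"; }
        private void appendAmount(double p) { result += String.format("%.2f", p); }
    }

The signatures get shorter, but now every method can read and write shared mutable state, and the call order becomes an invisible invariant.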
If the 3 things are related such that they will only ever be called in order one after the other (and they are not really complex) it’s better to just do all the work together.
But this line of thinking is exactly what's wrong with Clean Code. Just seeing your function doing three independent things is not a signal that you should begin refactoring.
I've worked on code bases with functions that were longer than 300 lines with shorter variable names. Whether this is a problem is completely dependent on the context. If the function is 300 lines of highly repetitive business logic where the variable name "x" is used because the author was too lazy to type out a longer, more informative variable name, then maybe it's possible to improve the function by doing some refactoring.
On the other hand, if the function is an implementation of a complicated numerical optimization algorithm, there is little duplicated logic, the logic is all highly specific to the optimization algorithm, and the variable name "x" refers to the current iterate, then blindly applying Clean Code dogma will likely make the code harder to understand and less efficient.
I think the trick here is to cultivate an appreciation for when it's worth refactoring. I see some common patterns in how inexperienced developers approach these two examples.
In the first example, the junior developer is usually a little unmoored and doesn't have the confidence to find something useful to do. They see something repetitive in a function and decide to refactor it. If this function has a good interface (in the sense of the book: it's a black box, and understanding the implementation isn't required), refactoring may be harmful. They run the risk of broadening and weakening the interface by introducing a new function. Maybe they accidentally change the ABI. If you've only changed the implementation, and no one spends any time looking at the details of this function because it has a good interface... what's been gained?
In the second example, the junior developer is usually panicked and confused by a Big Complicated Function that's too hard for them to understand. They conflate their lack of understanding with the length and complexity of the function. This can easily be a sign of their lack of expertise. A person with appropriate domain knowledge may have no trouble whatsoever reading the 300 line function if it's written using the appropriate idioms etc. But if they refactor it, it now becomes harder to understand for the expert working on it because 1) it's changed and 2) it may no longer be as idiomatic as it once was.
One of the biggest issues with the book is that it is a Java-centric book that aspires to be a general-purpose programming book. Because it never commits to being either, it sucks equally at both. In much the same way, it's a "business logic"-centric book that aspires to be general purpose, so it sucks at both (and it especially sucks as advice for writing mostly-technical/algorithmic code). This is epitomized by how HashMap.java from OpenJDK[0] breaks almost every single bit of advice the book gives, and yet is one of the cleanest pieces of code I've ever read.
One fundamental misunderstanding, both in the book and in some of his talks, is that he equates polymorphism with inheritance. I'll forgive him never coming across ad hoc polymorphism as present in Haskell, but the book was published in 2008, and Java had generics in 2004. Even if he didn't have the terminology to express the difference between subtype polymorphism and parametric polymorphism, four years is plenty of time to gain an intuitive understanding of how generics are a form of polymorphism.
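For the record, a hedged one-liner of what I mean by generics being polymorphism without inheritance:

    // Parametric polymorphism: firstOrNull behaves uniformly for any T.
    // No class hierarchy, no virtual dispatch, still polymorphic.
    class Poly {
        static <T> T firstOrNull(java.util.List<T> xs) {
            return xs.isEmpty() ? null : xs.get(0);
        }

        public static void main(String[] args) {
            System.out.println(firstOrNull(java.util.List.of(1, 2, 3)));  // 1
            System.out.println(firstOrNull(java.util.List.of("a", "b"))); // a
        }
    }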
His advice around preferring polymorphism (and, therefore, inheritance, and, therefore, a proliferation of classes) over switch statements and enums was probably wrong-headed at the time, and today it's just plain wrong. ADTs and pattern matching have clearly won that fight, and even Java has them now.
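In modern Java (21+) the ADT version is short enough to make the point by itself; a sketch with made-up types:

    // A sealed hierarchy plus pattern matching: exhaustive handling in
    // one switch, no subclass-per-behavior, and the compiler complains
    // if a new Shape variant isn't covered.
    sealed interface Shape permits Circle, Rect {}
    record Circle(double r) implements Shape {}
    record Rect(double w, double h) implements Shape {}

    class Area {
        static double of(Shape s) {
            return switch (s) {
                case Circle c -> Math.PI * c.r() * c.r();
                case Rect r   -> r.w() * r.h();
            };
        }
    }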
Speaking of proliferation of classes, the book pays lip service to the idea of avoiding side-effects, but then the concrete advice consistently advocates turning stateless functions into stateful objects for the sake of avoiding imagined problems.
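The shape of that transformation, with hypothetical names:

    // A pure, stateless function...
    class Tax {
        static double withTax(double net, double rate) {
            return net * (1 + rate);
        }
    }

    // ...recast as an object whose fields are just hidden parameters.
    class TaxCalculator {
        private double net, rate, result;

        void setNet(double net)   { this.net = net; }
        void setRate(double rate) { this.rate = rate; }
        void calculate()          { result = net * (1 + rate); }
        double getResult()        { return result; } // meaningless before calculate()
    }

Nothing about the problem required state; it was introduced by the refactoring.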
One particular bugbear of mine is that I've had literally dozens of discussions over the years caused by his advice that comments are always failures to express yourself in code. Many people accept that as fact from reading it first hand; in others you can clearly trace the brain rot back to the book through a series of intermediaries. The effect is programmers who don't understand that high-level strategy comments ("I'm implementing algorithm X") are incredibly information dense: one single line informs how I should interpret the whole function.
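For a concrete taste of that density, here's a hypothetical example where the single strategy comment does most of the explanatory work:

    class Majority {
        // Boyer-Moore majority vote: one pass, O(1) extra space. The
        // result is only guaranteed correct if a majority element exists.
        static int majority(int[] xs) {
            int candidate = 0, count = 0;
            for (int x : xs) {
                if (count == 0) candidate = x;
                count += (x == candidate) ? 1 : -1;
            }
            return candidate;
        }
    }

Strip the comment and the code is a puzzle; no amount of renaming makes the algorithm's precondition visible.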
Honestly, the list goes on. There are a few nuggets of wisdom buried in all the nonsense, but it's just plain hard to tell people "read this chapter, but not that one, and ignore these sections of the chapter you should read". Might as well advise juniors not to read the book at all, and to visit it only once they've had time to learn enough to cut through the bullshit themselves. (At which point it's merely of dubious value instead of an outright negative.)
I think you are totally right. The Clean X books are not a waste of time. I meant that in the sense of "start here, don't delay this". I would recommend: read aPoSD, then the Clean X series, then aPoSD again ;)
There tend to be two camps with the Uncle Bob franchise as I see it:
Those who fall for the way he sells it, as the 'one true path', or are told to accept it as being so.
Those who view it as an opinionated lens, with some sensible defaults, but mostly as one lens to think through.
It is probably better to go back to the earlier SOLID idea.
If you view the SRP as trying to segment code so that only one group or person needs to modify it, avoiding cross-team coupling, it works well.
If you use it as a hard rule and, worse, listen to your linter and mix it in with a literal interpretation of DRY, things go sideways fast.
He did try to clarify this later, but long after it had done its damage.
But the reality is that selling his books as the 'one true path' works.
It is the same reason Scrum and SAFe are popular. People prefer hard rules to a pile of competing priorities.
Clean architecture is just ports and adapters or onion architecture repackaged.
Both of which are excellent default approaches, if they work for the actual problem at hand.
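A minimal sketch of that shape, with hypothetical names: the core owns a port (an interface) and adapters plug in at the edges:

    interface OrderStore {                        // port, owned by the core
        void save(String orderId);
    }

    class CheckoutService {                       // core logic, framework-free
        private final OrderStore store;
        CheckoutService(OrderStore store) { this.store = store; }
        void checkout(String orderId) { store.save(orderId); }
    }

    class InMemoryOrderStore implements OrderStore { // adapter at the edge
        private final java.util.List<String> saved = new java.util.ArrayList<>();
        public void save(String orderId) { saved.add(orderId); }
    }

Swapping the adapter for a database-backed one doesn't touch the core, which is the whole appeal, when the problem actually has that shape.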
IMHO it is like James Shore's 'The Art of Agile Development', which is a hard sell compared to the security-blanket feel of Scrum.
Both work if you are the type of person with a horses-for-courses mentality, but lots of people hate Agile because their organization bought into the false concreteness of Scrum.
Most STEM curriculums follow this pattern too, teaching something as a received truth, then adding nuance later.
So it isn't just a programming thing.
I do sometimes recommend Uncle Bob books to junior people, but always encourage them to learn why the suggestions are made, and for them to explore where they go sideways or are inappropriate.
His books do work well as audiobooks while driving, IMHO.
Even if I know some people will downvote me for saying that.
(Sorry if your org enforced these oversimplified ideals as governance.)