> Could it be that best practices are designed to make sure mediocre programmers working together produce decent code?
Yes, it is, but the issue is that the industry should move away from the idea that software can be built on an assembly line. It is better to have a few highly qualified people capable of writing complex software than a thousand mediocre programmers who use GoF patterns everywhere.
I'm a security analyst, not a developer, so please forgive my ignorance, but how do you become one of the highly qualified coders without first spending some time being a mediocre developer?
You don't. There's no way around having to spend some time in the trenches --- but we can at least minimize the amount of time developers spend in the sophomoric architecture-astronaut intermediate phase by not glorifying excess complexity. Ultimately, though, there's not a good way to teach good taste except to provide good examples.
"Best" practices may be a misnomer, but I don't believe it's possible to execute large projects without some kind of standardization. It is inevitable that in some cases that standardization will hinder the optimal strategy, but it will still have a net positive impact. Perhaps if we started calling it "standard practices" people would stop acting like it's a groundbreaking revelation to point out that these practices are not always ideal.
I think it's more that the best practices were adopted for specific reasons, but they are not understood or transmitted in a way that makes those reasons very clear. That is, a 'best practice' tends to solve a specific kind of problem under a certain set of constraints.
Nobody remembers what the original constraints are or if they even apply in their current situation, even if they are actually trying to solve the same problem, which they might not be.
This spirals as well: I've spent quite a lot of time recently helping people at the early stages of learning programming, and it's taken me a while to stop getting frustrated with their misunderstandings. Sometimes it's just that something basic isn't clicking for them, but a lot of the time it's down to me trying to explain things through a prism of accumulated patterns that are treated as best practice/common sense, yet, viewed objectively from a step back, are opaque and sometimes nonsensical outside of very specific scenarios. There's a tendency to massively overcomplicate, but you forget very quickly how complicated things have become, and then you build further patterns to deal with the complexity, ad infinitum. The sketch below shows what I mean.
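To make that concrete, here's a minimal sketch (hypothetical names, Python for brevity) contrasting a direct solution with the pattern-heavy version a beginner is often told is "how it's done". Both do the same one-line lookup; neither is the "right" way, the point is only how much machinery can accrete around a trivial task.

```python
# Direct version: one function, no ceremony.
DEFAULTS = {"timeout": 30}

def get_timeout(config):
    return config.get("timeout", DEFAULTS["timeout"])

print(get_timeout({"timeout": 10}))  # 10


# Pattern-heavy version of the same lookup: an abstract provider,
# a concrete provider, and a factory, all to call dict.get once.
from abc import ABC, abstractmethod

class ConfigProvider(ABC):
    @abstractmethod
    def get(self, key, default=None): ...

class DictConfigProvider(ConfigProvider):
    def __init__(self, data):
        self._data = data

    def get(self, key, default=None):
        return self._data.get(key, default)

class ConfigProviderFactory:
    @staticmethod
    def create(data):
        return DictConfigProvider(data)

print(ConfigProviderFactory.create({"timeout": 10}).get("timeout", 30))  # 10
```

Each layer is defensible in isolation (swap in a file-backed provider later, mock it in tests), but explaining any of it to someone still learning what a function is mostly teaches them that complexity is normal.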
Yep, the biggest issue we have with software is that every developer knows the Single Responsibility Principle, and no developer knows why. Everyone knows to decouple and increase cohesion, but few actually know what cohesion is. A rough sketch of the difference follows.
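For anyone who hasn't seen it spelled out: cohesion is roughly "do the things inside this module belong together?". A minimal sketch (entirely hypothetical class names) of low vs. higher cohesion:

```python
# Low cohesion: unrelated responsibilities grouped because they share a noun.
class ReportManager:
    def fetch_sales_data(self): ...
    def render_pdf(self, data): ...
    def email_to_boss(self, pdf): ...
    def rotate_log_files(self): ...  # nothing to do with reports at all


# Higher cohesion: each class has one reason to change,
# which is the actual point of the Single Responsibility Principle.
class SalesRepository:
    def fetch_sales_data(self): ...

class PdfRenderer:
    def render(self, data): ...

class ReportMailer:
    def send(self, pdf, recipient): ...
```

The second version isn't automatically better; it's better only when those pieces really do change for different reasons. Knowing that "why" is the part that doesn't get transmitted.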
That's it. "Best practices" is, essentially, coding bureaucracy. That's not a pejorative; bureaucracy is quite necessary.
I have a rough idea of "why OO?" but in practice, it can be pretty hard on things like certain kinds of scaling, projects that require a goodly amount of serialization/configurability and the like.
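As an illustration of the serialization/configurability friction (a hypothetical example, not any particular framework's API): once state is hidden behind behavior, persisting it means either breaking encapsulation or writing conversion code that exists only for serialization, whereas plain data serializes for free.

```python
import json

# Encapsulated object: state is private by convention, so serializing it
# requires bespoke to_dict/from_dict code that mirrors the hidden fields.
class Account:
    def __init__(self, owner, balance):
        self._owner = owner
        self._balance = balance

    def deposit(self, amount):
        self._balance += amount

    def to_dict(self):  # exists purely so the object can be serialized
        return {"owner": self._owner, "balance": self._balance}

    @classmethod
    def from_dict(cls, d):
        return cls(d["owner"], d["balance"])


# Plain data: serialization is free; behavior lives in functions elsewhere.
account = {"owner": "alice", "balance": 100}
print(json.dumps(account))
print(json.dumps(Account("alice", 100).to_dict()))
```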
There is a spectrum from "it just works" to "I've applied every best practice". Bad programmers who spend huge amounts of time on best practices will end up wasting a lot of time over-optimizing, but it may have the benefit of reducing risk where they lack deep understanding, or at least shining a light on that risk, provided they apply the practices correctly.
You're ignoring how "best practices" frequently add negative value. Design style isn't a trade-off between proper design and expedience. It's a matter of experience, taste, and parsimony.
Could it be that best practices are designed to make sure mediocre programmers working together produce decent code?
After all, actual novice programmers write code similar to that of the best programmers, except that it doesn't work.