I agree. But also at the root of it is that a problem can't be escalated such that a thinking human with actual power to act gets involved.
The fallacy here is the belief that the filter is perfect. Or really, that any process can be perfect. Even if one could be perfect at a specific moment in time, time marches on and things change.
I'm all for automation, but it has to be recognized that the thing will always break, and likely in a way you don't expect. Even in ways you __couldn't__ expect. So you have to design with that failure in mind. A lot of these "Falsehoods Programmers Believe About <X>" lists could be summarized as "Programmers Believe They Can Accurately Predict All Reasonable Situations". I added "reasonable" on purpose. The world is just complex and we can only see a very limited amount of it. The best way to be accurate is to know that you're biased, even if you can't tell in which way you're biased.
To be more specific, I don't think there's an inherent lack of resources; rather, it's about resource allocation. There's also a big difference between a startup and a big tech giant (or even a non-tech-focused company). In the latter case, there's no doubt that the resources exist, so it has to be allocation. And this is definitely true w.r.t. the above examples.