The example is hilariously terrible. Firstly, this is the currently required code:
from datetime import datetime

def set_deadline(deadline):
    if deadline <= datetime.now():
        raise ValueError("Date must be in the future")

set_deadline(datetime(2024, 3, 12))
set_deadline(datetime(2024, 3, 18))
There simply is no trade-off to be made at this point. Perhaps there will be eventually, but right now, there is one function needed in two places. Turning two functions that already could be one into a class is absurd.
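For context, the class-based refactor being criticized looks roughly like this (a reconstruction; the DeadlineSetter name and the entity_type field are my approximation of the article's example, not a verbatim quote):

from datetime import datetime

# Rough sketch of the class-based refactor the comment objects to;
# names are approximate, not quoted from the article.
class DeadlineSetter:
    def __init__(self, entity_type):
        self.entity_type = entity_type

    def set_deadline(self, deadline):
        if deadline <= datetime.now():
            raise ValueError("Date must be in the future")

task = DeadlineSetter("task")
task.set_deadline(datetime(2024, 3, 12))

payment = DeadlineSetter("payment")
payment.set_deadline(datetime(2024, 3, 18))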
Now, as far as teaching best practices goes, I also dislike this post because it doesn't explicitly explain the pros and cons of refactoring vs. not refactoring in any detail. There is no guidance whatsoever (e.g., Martin Fowler's Rule of Three). This is Google we're talking about, and newer developers could easily be led astray by nonsense like this. Addressing the two extremes, and getting into how solving this problem requires some nuance and practical experience, would be much more productive.
Almost all programming tutorials, and even books to a certain extent, suffer from the problem of terrible examples. Properly motivating most design patterns requires context of a sufficiently complex codebase that tutorials and books simply do not have the space to get into. This particular case is especially bad, probably because they had the goal of having the whole article fit on one page. ("You can download a printer-friendly version to display in your office.")
> There is no guidance whatsoever (e.g., Martin Fowler's Rule of Three).
That is completely unfair imo. Although not properly motivated, the advice is all there. "When designing abstractions, do not prematurely couple behaviors that may evolve separately in the longer term." "When in doubt, keep behaviors separate until enough common patterns emerge over time that justify the coupling."
Simplified maxims like the "Rule of Three" do more harm than good. "Don't couple unrelated concerns" is a much higher programming virtue than DRY.
> Properly motivating most design patterns requires context of a sufficiently complex codebase
As someone who's made a best-selling technical course, I strongly disagree.
It's 100% laziness and/or disregard for the reader.
The reason examples are as bad as they are is that people rush to get something published rather than put themselves in the audience's position and make sure it's concise and makes sense.
It's not like webpage space is expensive. There's plenty of room to walk through a good example, it just requires a little effort.
It is not the webpage space. It is people's limited attention spans and ability to focus. A complex example is needed to properly motivate certain concepts, but too complex an example also contains too many other details, so the reader gets bogged down in them and distracted from the main concept being discussed.
At least that is my hypothesis for why almost all programming books and tutorials have terrible examples. I am happy to be proven wrong.
Coming back to the article, I looked at some of the previous articles from the same series, and to me it feels like a very conscious decision to only include 3-4 line code examples.
> It's not like webpage space is expensive. There's plenty of room to walk through a good example, it just requires a little effort.
Right at the top of the page:
> A version of this post originally appeared in Google bathrooms worldwide as a Google Testing on the Toilet episode. You can download a printer-friendly version to display in your office.
What does sales have to do with what you're claiming? Please share the course and/or examples of it being done well without requiring that excessive context, so that there's something to support your claim.
Not related to the topic at hand, but who buys these courses? Going off the chapter titles it looks like it’s all basic ‘read the documentation’ kind of stuff (to me). I could imagine it being useful to beginners, but not anyone with a moderate amount of experience (they’d just go to the Neo4j documentation).
On the other hand, what beginner starts with Neo4j and Cypher? Are there really enough of them to justify a whole course? Apparently there are; it just feels weird to me.
You're right in that if you go through the docs you can find all the info you might need.
It's really catered to beginners, people who have next to no knowledge of graph databases or Neo4j and want to get up to speed in just a few hours.
I imagine some people may not even be super technical, but may want to learn just the basics of querying a DB at work to get some basic info out of it.
Apart from lessons there are also exercises for people to practice what they just learnt, and I do my best to point out gotchas and keep it mildly entertaining with a gentle progression in difficulty.
I was going to say you were talking nonsense, but then realized I'd replaced the original post in my mind with this much nicer post that someone else linked in this thread:
Your example, deduplicating the two functions into one, illustrates an interesting point, although I'd prefer still having the two specialized functions there:
from datetime import datetime

def set_deadline(deadline):
    if deadline <= datetime.now():
        raise ValueError("Date must be in the future")

def set_task_deadline(task_deadline):
    set_deadline(task_deadline)

def set_payment_deadline(payment_deadline):
    set_deadline(payment_deadline)

set_task_deadline(datetime(2024, 3, 12))
set_payment_deadline(datetime(2024, 3, 18))
You lose absolutely nothing. If you later want to handle the two cases differently, most IDEs allow you to inline the set_deadline method with a single keystroke.
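For instance, inlining into just the task path (roughly what the IDE's "inline" refactoring would mechanically produce, shown here as a sketch) immediately re-specializes that call site:

from datetime import datetime

# set_deadline's body, inlined into the task wrapper only;
# this call site can now evolve independently of payments.
def set_task_deadline(task_deadline):
    if task_deadline <= datetime.now():
        raise ValueError("Date must be in the future")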
So the argument from the article...
> Applying DRY principles too rigidly leads to premature abstractions that make future changes more complex than necessary.
...does not apply to this example.
There clearly are kinds of DRY code that are less easy to reverse. Maybe we should strive for DRY code that can be easily transformed into WET (Write Everything Twice) code.
(Although I haven't worked with LISPs, macros seem to provide a means of abstraction that can be easily undone without risk: just macro-expand them)
In my experience, it can be much harder to transform WET code into DRY code because you need to resolve all those little inconsistencies between once-perfect copies.
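A hypothetical illustration of those inconsistencies: two copies that started identical but picked up small, unmerged changes over time:

from datetime import datetime

# Hypothetical diverged copies: which differences are intentional?
def set_task_deadline(task_deadline):
    if task_deadline <= datetime.now():
        raise ValueError("Date must be in the future")

def set_payment_deadline(payment_deadline):
    # A same-day allowance was later added here, but only here.
    if payment_deadline.date() < datetime.now().date():
        raise ValueError("Date must not be in the past")

Deduplicating these now means first deciding whether "<" vs "<=" and the differing messages are bug fixes or requirements.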
I can only assume the Google example would be part of a script/CLI program that is meant to crash with an error on a bad parameter or something similar. Perhaps the point is to catch the exception for control flow?
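If so, the caller might look something like this (a sketch; the CLI framing is my assumption, not from the article):

import sys
from datetime import datetime

def set_deadline(deadline):
    if deadline <= datetime.now():
        raise ValueError("Date must be in the future")

# Treat the exception as control flow: exit with a readable error.
try:
    set_deadline(datetime(2024, 3, 12))
except ValueError as e:
    sys.exit(f"invalid deadline: {e}")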
My personal goal is to get things done in as few lines of code as possible, without cramming a bunch onto one line. Instead of coming up with fancy names for things, I try to call each thing by the simplest name that describes what it's currently doing, which can be difficult and is subjective.
If we wanted to define a function which crashes like the example, I would probably write this:
from datetime import datetime

def throw_past_datetime(dt):
    if dt <= datetime.now():
        raise ValueError("Date must be in the future")
If the point is not to crash/throw for control flow reasons, I'd write this in non-cli/script code instead of defining a function:
from datetime import datetime

dt = datetime(2024, 5, 29)
if dt < datetime.now():
    # Handle past date gracefully?
    pass
If it needs to do more in the future, I can change it then.
> You lose absolutely nothing. If you later want to handle the two cases differently, most IDEs allow you to inline the set_deadline method with a single keystroke.
The problem with unintentional coupling isn't that you can't undo it. It's that someday someone from another team is going to change the method to add behaviour they need for their own use case, one that differs from yours, and you won't even notice until there is a regression.
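A hypothetical version of that failure mode: another team relaxes the shared check for their own use case, and every other caller silently inherits it:

from datetime import datetime, timedelta

# Hypothetical: a grace period added for late payments; task callers
# get the looser check too, without anyone on the task team noticing.
def set_deadline(deadline):
    if deadline <= datetime.now() - timedelta(days=3):
        raise ValueError("Date must be in the future")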
In this case (which shouldn't happen, because it requires that you merged things that don't belong together; see accidental duplication), at least the person changing the method has all the information at hand and doesn't have to keep a potentially complex graph of copied code in their head.