I didn't get that from the article at all. I actually like what the article is doing a lot, and I wish people would take the same approach more often. When arguments about high-level concepts like inheritance go poorly, it's usually because everyone involved is talking about something slightly different. Maybe a supporter of inheritance likes the way it lets them think about their program (ontological inheritance) while a detractor doesn't like how it forces each subclass to carry along the baggage of its superclass (implementation inheritance). These people are not going to have a fruitful discussion without understanding that they're talking about different things.
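To make the distinction concrete, here's a rough Scala sketch (the Shape/CachedShape names are invented for illustration, not taken from the article):

```scala
// Ontological inheritance: a pure is-a statement, no code reuse involved.
trait Shape {
  def area: Double
}

// Implementation inheritance: this class exists to share code, and every
// subclass inherits its caching machinery whether it wants it or not.
abstract class CachedShape extends Shape {
  private var cached: Option[Double] = None
  protected def computeArea: Double
  final def area: Double = {
    if (cached.isEmpty) cached = Some(computeArea)
    cached.get
  }
}

// Square mostly just wants to say "I am a Shape", but by extending
// CachedShape it also carries the cache field and the final area method --
// the "baggage" the detractor is complaining about.
class Square(side: Double) extends CachedShape {
  protected def computeArea: Double = side * side
}
```

A supporter and a detractor can look at the same hierarchy and be arguing about completely different lines of it.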
The concept of "types" in programming languages is another great example -- there are syntactic type declarations, memory-level types to specify which bytes mean what, things like typeclasses or interfaces which give you runtime polymorphism, the "type" you have in your head when you're thinking about what kind of data your code needs to handle... before you even start to have a discussion, you need some idea of what you're talking about.
The problem with balancing implementation vs ontological inheritance while providing strong type safety is that your language needs to either support defining contravariant types (and you still likely end up in a mess -- see Scala's eternal discussion about "total" type safety), or you disallow ontological inheritance completely (like Golang, and therefore cannot support many modern programming features; whether for better or worse doesn't matter here). I'm not sure there is a language that truly does the opposite by design, though: strongly typed and disallowing implementation inheritance, while still providing ontological inheritance. Such a language might be the DDD modeller's heaven? :-)
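For anyone who hasn't followed the Scala side of this, a minimal sketch of the variance annotations the parent is alluding to (Animal/Dog/Sink are invented names, not real APIs):

```scala
// A Sink only consumes values, so it is declared contravariant in A.
trait Sink[-A] { def put(a: A): Unit }

class Animal
class Dog extends Animal

object VarianceSketch {
  // Anything that needs a Sink[Dog] will only ever feed it Dogs...
  def feedDogs(sink: Sink[Dog]): Unit = sink.put(new Dog)

  // ...so a Sink[Animal] is an acceptable substitute: contravariance makes
  // Sink[Animal] a subtype of Sink[Dog].
  val animalSink: Sink[Animal] = (a: Animal) => ()
  feedDogs(animalSink) // compiles; with an invariant Sink[A] it would not
}
```

Getting annotations like that right across a whole library surface is where the "total" type-safety arguments tend to start.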
> or you disallow ontological inheritance completely (like Golang, and therefore cannot support many modern programming features
Sincere question: what features does this preclude? I pretty much ignore ontological inheritance in any programming language because it never seems useful. Am I missing something?
The concept of "types" in programming languages is another great example -- there are syntactic type declarations, memory-level types to specify which bytes mean what, things like typeclasses or interfaces which give you runtime polymorphism, the "type" you have in your head when you're thinking about what kind of data your code needs to handle... before you even start to have a discussion, you need some idea of what you're talking about.