That is true, although one thing EC has to be careful about is that replaying a key's mutation history out of order can yield different final values.
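A minimal sketch of that hazard (illustrative operations, not any particular store's API): two replicas apply the same two mutations to a key, but in different orders. Because the operations don't commute, the replicas diverge.

```python
# Two replicas apply the same mutation log to a key in different
# orders. "set" and "add" don't commute, so the results differ.
def apply(value, op):
    kind, arg = op
    if kind == "set":
        return arg
    if kind == "add":
        return value + arg
    raise ValueError(kind)

ops = [("set", 10), ("add", 5)]

replica_a = 0
for op in ops:                 # in order: set to 10, then add 5
    replica_a = apply(replica_a, op)

replica_b = 0
for op in reversed(ops):       # out of order: add 5, then set to 10
    replica_b = apply(replica_b, op)

print(replica_a, replica_b)    # 15 10 -- the replicas disagree
```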
If you partition and rejoin, reassembling a consistent state can be hard even when you know all the timestamps.
But the basic thesis that EC is not appropriate for all data models is certainly valid. I certainly wouldn't want my bank to use such a model for reconciling transactions; they screw up enough as it is.
Right; you need some notion of a total order or commutativity in your update functions. f(A,B) = f(B,A). "Last writer wins" is one example of commutativity, but that isn't often what you want.
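Here's what "last writer wins" looks like as a commutative merge function (a hedged sketch; the tie-breaking rule is my own assumption, added so the result is deterministic): each write carries a timestamp, and merging picks the later one. Since `max` doesn't care about argument order, f(A,B) = f(B,A) holds and replicas converge no matter how writes arrive.

```python
# Last-writer-wins merge over (timestamp, value) pairs.
# max() on tuples compares timestamps first, then values, so ties
# break deterministically and the function is commutative.
def lww_merge(a, b):
    return max(a, b)

w1 = (100, "draft")
w2 = (200, "final")

# Same result in either order -- the commutativity that makes
# convergence possible.
assert lww_merge(w1, w2) == lww_merge(w2, w1) == (200, "final")
```

Of course, as noted above, LWW silently discards the losing write, which is often exactly the wrong behavior for things like counters or collaborative edits.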
What you want is something like a "Commutative Replicated Data Type" [1], where you define a commutative function specific to your application. Libraries like StateBox allow you to build CRDTs [2]. In fact, your example of document editing was one of the areas where these ideas first came up.
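One of the simplest CRDTs is a grow-only counter (G-Counter), sketched below for illustration (this is not the StateBox API): each replica increments only its own slot, and merge takes the element-wise max, which is commutative, associative, and idempotent, so replicas converge regardless of merge order.

```python
# Grow-only counter (G-Counter) CRDT sketch.
class GCounter:
    def __init__(self, replica_id):
        self.replica_id = replica_id
        self.counts = {}          # replica_id -> local increment count

    def increment(self):
        self.counts[self.replica_id] = self.counts.get(self.replica_id, 0) + 1

    def merge(self, other):
        # Element-wise max: commutative, associative, idempotent.
        for rid, n in other.counts.items():
            self.counts[rid] = max(self.counts.get(rid, 0), n)

    def value(self):
        return sum(self.counts.values())

a, b = GCounter("a"), GCounter("b")
a.increment(); a.increment()      # replica a sees 2 increments
b.increment()                     # replica b sees 1
a.merge(b); b.merge(a)            # merge in either direction
assert a.value() == b.value() == 3
```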
There's also a theorem saying that if your program is "logically monotonic"--that is, if your data only "grows" in size, and facts never change from True to False or vice versa--then your program will work under eventual consistency without modification [3].
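The monotonicity idea in miniature (an illustrative sketch, not from [3]): if replica state is a grow-only set of facts, merging by union is order-insensitive, so replicas converge under EC without any coordination.

```python
# A logically monotonic program: state is a grow-only set of facts,
# and merge is set union, which only adds facts and never retracts one.
def merge(*replica_states):
    out = set()
    for s in replica_states:
        out |= s                  # union only grows
    return out

r1 = {"alice follows bob"}
r2 = {"bob follows carol"}
r3 = {"alice follows bob", "carol follows dan"}

# Any merge order yields the same final set of facts.
assert merge(r1, r2, r3) == merge(r3, r1, r2) == merge(r2, r3, r1)
```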
Finally, bank accounts have to employ eventual consistency. Banks demand availability from their ATMs and give up consistency to handle partition tolerance: your ATM will still work even if you sever its network connection. However, banks, unlike a lot of software, have well-defined compensation protocols for handling inconsistency. For example, they'll charge you for negative balances left in your account via overdraft fees.
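A toy sketch of that compensation pattern (the numbers and the flat fee are made up for illustration): the bank accepts withdrawals while partitioned, then reconciles afterwards, compensating for a negative balance with a fee rather than rejecting the writes.

```python
# Compensation instead of coordination: accept all withdrawals
# (availability), then repair inconsistency at reconciliation time.
OVERDRAFT_FEE = 35   # illustrative flat fee

def reconcile(opening_balance, withdrawals):
    balance = opening_balance - sum(withdrawals)
    if balance < 0:
        balance -= OVERDRAFT_FEE   # compensate, don't reject
    return balance

# Two partitioned ATMs each allowed a $60 withdrawal against a $100
# balance; reconciliation leaves the account $20 overdrawn plus the fee.
assert reconcile(100, [60, 60]) == -55
```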