
The present OT is very much an epicycle theory. There is always that one last issue to fix. In fact, several, if not most, classic OT papers were later found to be inconsistent, e.g. some combinations of concurrent edits make copies diverge (source: P. Molli). And those problems are extremely hard to find and understand. Apparently, the original authors of OT had a local network in mind: universal connectivity, instant message propagation, and so on. But the internet is full of "relativistic" effects: different clients have different views of the state, messages propagate at finite and varying speeds, etc. After reading the article, try to estimate what happens if, say, one client syncs hourly/daily/monthly (as in git), or if we don't have a central server (again, as in git), or if the stream of edits is too dense... as if we really had a hundred monkeys typing. Maybe Google engineers managed to get the probability of OT screw-ups lower than the probability of all other screw-ups, but that does not make the theory any better.
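To make the divergence claim concrete, here is a minimal sketch in Python of the classic failure mode. The naive transform below is a hypothetical textbook strawman, not anything Google or any real OT system ships: two clients concurrently insert at the same position, each transforms the other's operation, and the replicas end up with different documents.

    # Minimal sketch: a naive OT transform that breaks convergence
    # (the TP1 property) on concurrent inserts at the same position.

    from dataclasses import dataclass

    @dataclass
    class Insert:
        pos: int
        text: str

    def apply_op(doc: str, op: Insert) -> str:
        """Apply an insert operation to a document string."""
        return doc[:op.pos] + op.text + doc[op.pos:]

    def transform(op: Insert, against: Insert) -> Insert:
        """Naive transform: shift op right if a concurrent insert
        landed at or before its position. The '<=' tie-break is the
        bug: each site shifts the *other* site's op, so the two
        sites interleave the inserts in opposite orders."""
        if against.pos <= op.pos:
            return Insert(op.pos + len(against.text), op.text)
        return op

    doc = "abc"
    op_a = Insert(1, "X")  # site A inserts "X" at position 1
    op_b = Insert(1, "Y")  # site B concurrently inserts "Y" there too

    # Site A: applies its own op, then B's op transformed against A's.
    site_a = apply_op(apply_op(doc, op_a), transform(op_b, op_a))
    # Site B: applies its own op, then A's op transformed against B's.
    site_b = apply_op(apply_op(doc, op_b), transform(op_a, op_b))

    print(site_a, site_b)  # "aXYbc" vs "aYXbc" -- the copies diverged
    assert site_a != site_b

Real systems patch this particular case with a site-ID tie-break, but proving that a transform function satisfies the convergence properties (TP1/TP2) in *all* cases is exactly where several of the classic papers went wrong.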

(...and the way they dealt with XML is a whole separate story. OT is pretty much a complexity explosion by itself, but they also multiplied it by the complexity of XML; they ended up with 15 kinds of mutation operations and so on. The fact that they invented "annotations" to sidestep XML nesting speaks for itself.)

Conclusion: OT is one big mess.
