
I had a hard time understanding the fundamental idea. Could someone provide a TL;DR?

My impression is that the versioning scheme of ICE [http://www.zeroc.com], I mean the protocol, is the most effective one. The authors learned the versioning and interoperability problem the hard way as former implementors of CORBA.

In this model a version identifier is composed of two numbers: the version and the release. When the versions differ, code or protocols are incompatible; code or a protocol supporting one version is not required to support other versions. The release number is incremented at each change, and code or a protocol at release x MUST also support all releases smaller than x, but only within the same version. This is a very simple and clear rule.

It allows incremental development and evolution of code and protocol while ensuring interoperability. This requires coordinated version numbering and is not compatible with a purely bazaar-style development.
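A minimal sketch of the rule in Python (the is_compatible function and the (version, release) tuples are only an illustration of the rule as stated above, not ICE's actual API):

    # Hypothetical illustration of the version/release rule described above.
    # A peer offers (version, release); a consumer requires (version, release).
    def is_compatible(offered, required):
        offered_version, offered_release = offered
        required_version, required_release = required
        # Different versions are always incompatible.
        if offered_version != required_version:
            return False
        # Within one version, release x must also support every release <= x.
        return offered_release >= required_release

    assert is_compatible((2, 5), (2, 3))      # same version, newer release: OK
    assert not is_compatible((2, 2), (2, 3))  # offered release is too old
    assert not is_compatible((3, 0), (2, 9))  # different version: incompatible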




Fundamentally, a software package implements multiple contracts, that is, protocols and APIs. Each contract has versions. It's like your laptop implements USB3.0, Bluetooth 4.1, m-SATA, etc. You care a lot about the versions of the contracts your laptop (or apps) implement. You care less about the version of the laptop itself.

So the idea is to formalize this in the form of a document that lists the contracts, their versions, and how far the software implements each one. This can all be tested from the outside.
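Roughly what such a declaration could look like, sketched here in Python (the contract names, versions, and the "coverage" field are made up for illustration; nothing here is the actual ZeroMQ document format):

    # Hypothetical contract declaration for one software package.
    declared_contracts = {
        "wire-protocol": {"version": "3.1", "coverage": "full"},
        "admin-api":     {"version": "1.4", "coverage": "partial"},
        "auth":          {"version": "2.0", "coverage": "full"},
    }

    def satisfies(declared, contract, wanted_version):
        """External check: does the package claim this contract at this version?"""
        entry = declared.get(contract)
        return entry is not None and entry["version"] == wanted_version

    print(satisfies(declared_contracts, "wire-protocol", "3.1"))  # True
    print(satisfies(declared_contracts, "auth", "1.0"))           # False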

It's part of a more general vision of making software implement contracts rather than provide features.


What is the problem you are trying to solve? It looks like it is zmq-specific. If you decompose a library into different sub-libraries, each with its own version, it may seem that you allow each one to evolve independently, but you are really making things more complicated for the library's users. You may create dependency hell: https://en.wikipedia.org/wiki/DLL_Hell


The problem is that even small libraries end up with multiple public contracts that evolve independently. Trying to identify these with a single version number doesn't work. Allowing major version numbers to define interoperability is even worse. We don't see dependency hell in practice.


I understand what you mean, but for this to be manageable for the end user it should be automated. On Debian or Ubuntu, when you install a package, all the dependencies are defined, and apt-get downloads all related packages, taking the dependencies into account. This could be generalized to smaller library units.

But I do like Ubuntu's snapshot upgrades, because upgrading is a risky process that takes time. Geeks probably prefer the Debian way, with rolling upgrades: if something doesn't work, they can fix it.

For users who are not tech-savvy and don't want to fill their heads with the versions and dependencies of all their packages, the Ubuntu upgrade model is much simpler: the OS and all its packages share one global version number.

I would say the choice depends on the audience. It's not all black and white.


But I assume some versioning of the implementation is also required, no? After all, your laptop may claim to implement USB3.0 but have a bug that is particular to that laptop version.

BTW, this form of versioning is common in the Java world, where APIs are standardised in the form of JSRs. An application server implements multiple JSRs, so every application server version lists the particular JSR (spec) versions it implements.
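As a rough illustration of that pattern (the spec names and versions below are examples only, not any real server's compatibility matrix, and exact version matching is a simplification):

    # Hypothetical example: an application lists the spec versions it needs,
    # and we check them against what one server version declares it implements.
    server_implements = {"Servlet": "3.1", "WebSocket": "1.1", "JSP": "2.3"}
    app_requires      = {"Servlet": "3.1", "WebSocket": "1.1"}

    missing = {spec: ver for spec, ver in app_requires.items()
               if server_implements.get(spec) != ver}
    print("deployable" if not missing else "missing specs: %s" % missing)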


So yes, you'll want (a) good tests and (b) patches for, e.g., the USB3.0 implementation. (I realize that's not how it works for hardware; it's an analogy.) What you don't want is a new release that may change X other things just because you hit a bug in one specific contract.

The JSR reference is interesting; I wasn't aware of it. How well does it work, and what are the problems with it? I assume it gets complex in some cases.


It works OK, and it doesn't get too complicated because spec versions are updated only once every couple of years. So far, the standardized-specs (JSR) approach has worked quite well. Every standard (and there are a lot of them [1]) that is ever completed (many aren't) usually has at least three implementations, so there's often plenty of choice.

Here's an example from Jetty: http://www.eclipse.org/jetty/documentation/current/what-jett...

And here's Tomcat: http://tomcat.apache.org/whichversion.html

(Both are very popular web servers)

[1]: https://jcp.org/en/jsr/all


Isn't this more or less [SemVer](http://semver.org/)? A bit more streamlined than the official SemVer spec, but it's basically a guideline anyway.



