The biggest problem facing humanity today isn't the singularity or global warming. It is that we don't know how to grow successful things without making them suck. For want of a better name, I tend to call this gravity-like tendency 'the tragedy of the commons'. But I think I'm interpreting it more broadly than is conventional[1]. Some examples that aren't generally associated with the idea:
a) The patent system. It has decayed because it's an externality to be dumped on: it's in each individual's interest to overload it and game it[2].
b) The cat signal[3]. We have a tendency to flatten nuanced subjects (here apathy) into flat memes (here lolcats) and fight them with a single weapon (here blackouts[4]) that grows ineffective with use. Blackouts are an externality; people will eventually grow sick of them. Everyone knows this. So the game-theoretically optimal strategy is to exploit them as quickly as possible, before they're milked dry.
c) Code becomes an externality as the number of collaborators grows. Think of the last time you saw someone bolt an argument onto a function. The number of arguments isn't 'important'. It doesn't affect anyone's performance metrics. Everyone has a vision for the codebase, and needs to mould it somehow to further his ambitions. The codebase becomes a mute vehicle for everyone's ambitions, and as such an externality to be dumped on. Bolting arguments onto functions is merely the smallest sin that can be committed[5] (see the sketch below).
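To make the bolting concrete, here's a minimal sketch in Python. Everything in it is invented for illustration; each redefinition stands in for a later revision of the same function, one contributor's ambition at a time:

```python
# A composite sketch of "argument bolting". All names are invented;
# only the shape of the decay matters. Each redefinition below stands
# for a later revision of the same function.

def send_report(data):                                  # v1: clean
    ...

def send_report(data, as_html=False):                   # v2: one ambition
    ...

def send_report(data, as_html=False, retries=3,         # v3: everyone's
                cc_admins=False,                        # ambitions, dumped
                legacy_date_format=False):              # onto one signature
    ...
```

No single addition is worth fighting over, which is exactly why the signature never stops growing.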
There are elements of regulatory capture here[6]. I think the two together serve to explain the fall of Rome, the rise of bureaucracies and the decay of the reddit frontpage.
But the problem is solvable. I think the solution will come from software. But not the kind of software we tend to write today. Something is deeply broken about how we write software, and I've been trying to tease out what it is. My current hypothesis is that the enemy is abstraction. We tend to prematurely freeze interfaces. Every time you freeze an interface in your code you give up ownership over its future evolution. You can't take things out, so it's going to get polluted over time, get things bolted on. It immediately turns into an ugly stepchild of your otherwise beautiful code. And so we neglect interfaces, treat them as externalities, bolt features on, make simple interfaces ugly and complex. Interfaces are like walls separating jurisdictions. Neither jurisdiction cares about the wall.
Every now and again someone says, "this interface is too complex; I'm going to simplify it", and creates a new simple interface. But they then commit the Great Error: they freeze their interface in its turn, and the cycle repeats. All we've done is add another layer of crap atop all our layers[7]. As we add layers on top, the bloat compounds. I think if you truly looked at the tower of abstractions in your codebase today, and took out everything that didn't serve your use today, you'd end up with a codebase two or three orders of magnitude smaller.
I've been exploring replacing backwards compatibility with unit tests. Imagine a world where code sharing took place but with no guarantees for the future. This version here makes certain guarantees, but if you upgrade or do a git pull, all bets are off. It might delete a function you rely heavily on. Or it may make it work differently one time in a thousand[8] and cause subtle, insidious bugs. The only way for you to guard against this is by having a comprehensive suite of unit tests. You have to be certain that if the tests pass the software is safe for public consumption. It's a very different discipline; the wall can now move, and both sides watch its evolution closely using the barbed wire of tests. But it could lead to far smaller codebases, and our lives as software developers would be less circumscribed. Every interface we have to support is a line drawn through the state space constraining our movement, the kinds of software we could write. It's a wall limiting our jurisdiction. Fewer interfaces imply fewer walls, greater freedom of movement across mental terrain, better software.
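Here's a minimal sketch of what that discipline might look like in Python, assuming a hypothetical dependency `textutils` with a `slugify` function; the consumer, not the author, pins down the behavior it relies on:

```python
# Consumer-side "contract" tests for a vendored dependency we pull
# updates for. textutils and slugify are hypothetical; the point is
# that we pin exactly the behavior we rely on, instead of trusting
# upstream's backwards-compatibility promises.

import unittest
from textutils import slugify  # hypothetical dependency


class SlugifyContract(unittest.TestCase):
    def test_lowercases_and_hyphenates(self):
        self.assertEqual(slugify("Hello World"), "hello-world")

    def test_strips_punctuation(self):
        self.assertEqual(slugify("a, b & c!"), "a-b-c")

    def test_stable_across_calls(self):
        # Guards against the "works differently one time in a
        # thousand" failure mode: hammer it, demand identical output.
        results = {slugify("Some Title") for _ in range(1000)}
        self.assertEqual(results, {"some-title"})


if __name__ == "__main__":
    unittest.main()
```

After every upgrade of the dependency you run this suite: green means your integrated stack still works; red means the wall moved and it's your job to notice.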
And if our software is better, less hamstrung, more responsive, we may be able to automate more. We may be able to think about all the dysfunctions in the real world, constantly seek new externalities and create sensors to monitor their health.[9]
---

Related (if you squint a little): http://www2.macleans.ca/2012/06/11/artisan-chocolate-and-soc...

[1] The traditional motifs (http://en.wikipedia.org/wiki/Tragedy_of_the_commons) are both more concrete and more libertarian. I don't think the tragedy can be solved by eliminating all commons.

[2] http://akkartik.name/blog/2010-12-19-18-19-59-soc

[3] http://internetdefenseleague.org

[4] Another example is our overuse of antibiotics.

[5] No judgement; I'm as guilty of this as anyone.

[6] http://en.wikipedia.org/wiki/Regulatory_capture

[7] Think of the xkcd on standards: http://xkcd.com/927

[8] http://news.ycombinator.com/item?id=4295681

[9] But we have to guard against tying everything to a single metric. The last thing I want is a manager reading this and deciding to flatly limit the number of arguments to a function by diktat. The moment you create a metric you encourage people to stop exercising judgement and simply game the metric. Metrics to fight externalities can themselves become externalities.

---
I'm reading your thesis as "stable APIs are bad because it's hard to get rid of cruft, and if we just test enough, stability won't matter". As a net-consumer of APIs, I'm not sure I can get behind that.
Of course, an API can be expected to change rapidly early in its life. Such APIs are usually not considered ready for serious production use, and breaking changes aren't a big deal. Later on, people expect more stability out of APIs, and tend to stop using (or, if that's not possible, post strongly-worded rants to HN about) APIs that remove features unexpectedly.
There are extreme cases, to be sure. Microsoft is famous for keeping new versions of Windows bug-for-bug compatible with the past. Apple, on the other hand, regularly deprecates and removes features that see little use. In light of Steve Yegge's recent post about liberal and conservative attitudes in software, I would call Microsoft's position on this issue very conservative and Apple's centrist.
I think what the world you're envisioning would actually look like is massive fragmentation, with people maintaining many more forks of libraries that have since removed some important functionality.
That's not to say the problem you describe doesn't exist or that there aren't solutions. An appropriate deprecation policy is certainly one part. Another component could be wider use of dynamic binding to help keep certain code out of APIs and in clients. A third could be more effort to keep APIs simple in the sense that Rich Hickey uses the word.
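One way to read the dynamic-binding suggestion, sketched in Python (the function and its `parse` parameter are invented for illustration): the API accepts behavior from the client instead of freezing a growing menu of options into its own signature.

```python
# Instead of the library accreting a keyword argument per consumer
# need, it does the one thing it owns and delegates interpretation
# to a client-supplied function. fetch_page is invented here.

from urllib.request import urlopen


def fetch_page(url, parse=lambda raw: raw):
    """Fetch url and hand the raw bytes to the caller's parse
    function, so new needs become new call sites in clients rather
    than new arguments bolted onto this signature."""
    with urlopen(url) as response:
        return parse(response.read())


# Each client binds its own behavior at the call site:
# text = fetch_page("http://example.com",
#                   parse=lambda raw: raw.decode("utf-8"))
```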
Yeah, I should clarify: I'm a lot less certain of the solution than of the problem.
I was implicitly assuming some preconditions: a) you're a programmer trying to solve a problem; b) you have access to the source code of your dependencies. It might be a web API, but you must be able to set up your own server to service the API. If you can't fork a project my points are irrelevant.
If you can in fact fork, my idea is to explicitly deemphasize ecosystem health and fragmentation in favor of just keeping your integrated stack clean. I think it's an approach worth trying out.
OSS people love to say, "you have the source, you can change it to do what you want." On the one hand, that ignores that it takes more than just the sources to accomplish something under real-world constraints (time, resources). On the other hand, when you do take the steps to change it to do what you want, there's a negativity associated with 'forking a project'.
I want to raise a counter-point that encourages people to fork projects rather than trying to work around issues with hacks atop black-box dependencies. This already happens in the real world: Ubuntu patches packages rather than waiting for upstream to accept the changes. I think there's benefit to more people trying this in all parts of the ecosystem.
The ecosystem would be better off if projects were partly chosen based on how encouraging they are of forking. Sometimes the appropriate response to a patch may be, "thanks, this is great, but it's a little outside our ambit, so why don't you fork the project?" And baking this choice into the workflow would encourage simpler architectures that are easier for others to understand and start forking.