I absolutely agree with this, and I'm not looking forward to the near future of the internet, but it's inevitable. We will hit AOL 2.0 -- well, a few of them. We'll have the Apple internet, the Facebook internet, the Google internet, etc., but it won't last.
The reign of AOL was killed by stagnation and outside innovation, by people seeing that there was more and better outside of AOL, and the second round will probably die a similar death. People will start to see cool new things happening outside their silo, or get fed up with it, and the silos will eventually fall. These trends seem cyclical: we go from mainframes and silos to personal computers and an open network, then back to the same mainframes (this time called the cloud) and the same silos. It's not gonna be a fun time for those of us who don't like the confines of the new internet, but the handful of us who care won't stop the inevitable.
Me, I still run an IRC server because I can, because there are a million clients to choose from, everyone can have their own interface, and the protocol is so simple a decent programmer can hash out their own client in a day. Nothing has come close to it yet (XMPP was great in theory but too complex for its own good). If you want a pretty UI and emojis and images, there's a client that can do it; if you just want text, there are plenty of those too.
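To give a sense of how simple the protocol is, here's a minimal client sketch in Python -- plain text over a socket, one command per line. The server, nick, and channel are placeholders, and a real client would want TLS and error handling:

    import socket

    # Connect, register a nick, join a channel, and answer server PINGs.
    HOST, PORT = "irc.example.net", 6667
    sock = socket.create_connection((HOST, PORT))

    def send(line):
        # IRC is line-oriented: each command is one CRLF-terminated line.
        sock.sendall((line + "\r\n").encode("utf-8"))

    send("NICK demo_user")
    send("USER demo_user 0 * :Demo User")
    send("JOIN #demo")

    buf = b""
    while True:
        buf += sock.recv(4096)
        while b"\r\n" in buf:
            raw, buf = buf.split(b"\r\n", 1)
            msg = raw.decode("utf-8", errors="replace")
            if msg.startswith("PING"):
                send("PONG" + msg[4:])  # reply or the server disconnects us
            print(msg)

That's essentially the whole core of the protocol; everything else is layered on top of lines like these.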
I'll be over here riding out the inevitable with my own file sync, git server, chat server, and web server, making my internet how I want it to be, and hoping that in time people will notice that you don't have to just accept your preferred silo's interpretation of it.
None of my friends use IRC, 95% have never even heard of it. Most of my friends are on Facebook Messenger.
For more public conversations, a lot of that is now taking place on Twitter. Twitter gets promoted by the media, with hashtags at the start of TV programmes. I never see any reference to IRC or other protocols.
I'm sure people will get bored with these current silos and move on but I doubt they'll move to something more open. The financial incentive just isn't there to develop the protocols and clients for an open system, whereas there is clearly big money to be made by creating the next big silo.
Oh, I don't disagree with you. I'm not delusional enough to think that people are going to move to IRC, or that it'll become mainstream; it's just a bit saddening. But the thing is, it doesn't really matter, because IRC is a protocol. I never have to worry about support getting dropped because someone wants to monetize it differently. The protocol will never get shut down all of a sudden because it's not popular enough. I can keep running my server as long as I want, independently of any company's decisions. I got a few friends set up on it years ago, and new people join from time to time. It's versatile enough that I just run Bitlbee and have access to a bunch of other chat services over IRC, all in one client.
I think people will get tired of not being able to communicate between their silos, and if they get locked in enough, they won't want to leave their own service for another one. It happened with AIM/MSN/Yahoo Messenger: for a little while everything was moving to XMPP, and your AIM account could talk to people on GTalk, etc. It was nice while it lasted.
I think something like Slack, but aimed at groups of friends rather than development teams.
I play chess and I ride a bike, and I have clubs for both of those things. They currently use Facebook to coordinate, and it works well enough, but I think Slack would be considerably better.
IRC is awesome, but it only really handles the talking-to-people side of things well. Sharing files (well), setting up events and calendars -- the stuff Slack does is a huge multiplier.
It's not just connecting that's the problem: how do you implement your own server for Slack, without using Slack?
Because if that's off the table, the rest of the discussion doesn't matter. Now, of course not everyone will write their own IRC server, or port one to a new architecture -- but without free and open source implementations, any alternative is just another dead end.
Considering Slack is doing so much right, I wish they'd realize this, and either open up Slack or publish a new free/open Slack server implementation, along with protocols for federation etc.
As I understand it, many people are getting weary of having to choose between IRC -- which has a number of issues, not least its (lack of a) security architecture (even with TLS bolted on) and its "extensions" being limited largely to bots -- and XMPP, which is over-engineered.
I suppose the natural next step is something that is to IRC/XMPP as JMAP[1] is to IMAP -- a lightweight protocol, probably JSON-based (or Cap'n Proto-based), that tries to combine the strengths of IRC/SILC and XMPP without being complex.
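Purely as a sketch of what I mean, the wire format for such a protocol could be as simple as one JSON object per event -- all the field names here are made up, not from any existing spec:

    import json

    # One self-describing frame per event; trivially extensible with new keys.
    frame = {
        "type": "message",
        "channel": "#demo",
        "from": "alice",
        "body": "hello world",
        "ts": "2015-06-21T12:00:00Z",
    }
    print(json.dumps(frame))

Compare that with parsing IRC's positional parameters or XMPP's XML stanzas: a JSON frame is something any language can produce and consume in a couple of lines.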
If I were running Slack, I'd be worrying about whether it's easier to "own" the new protocol, or to port Slack to support it after someone else releases a Free alternative to Slack.
In the mid-90s, email was offered as a selling point by ISPs. ("Sign up with us and get 1/10/unlimited email accounts!")
It's also one of the easiest online protocols to understand, with an obvious benefit. So ISPs sold it to the public and businesses. And because all the ISPs were separate but needed to interoperate, there was no proprietary format.
AOL and CompuServe soon worked out that access to external email was a thing, so they built gateways which broke their users out of the walled gardens.
ISPs didn't do the same for IRC. It was always more of a nerd toy, which meant the public didn't know about it. AOL and CompuServe (etc) had proprietary chat, and there were a lot of chatalikes like ICQ, and eventually things like MS Messenger.
Without ISP promotion none of them made it to the mainstream.
I don't think Facebook and Apple can kill the web entirely. There's always an "AOL niche" for a simplified social Internet. FB lives there at the moment, but it's a vulnerable slot, and unless they break out into something new FB will be dead in a decade or so - if only because teens will grow up using something else, and will stay with that something else as they get older.
> Twitter gets promoted by the media, with hashtags at the start of TV programmes. I never see any reference to IRC or other protocols.
Sure, and once upon a time the media companies instead advertised their AOL keywords.
As with AOL in its heyday, though, the problem is fundamentally one of content: the silos are good at aggregating it, but terrible as platforms for producing it. And this generation of silos has shown no indication of being able to break that pattern: for back-and-forth chat or for short blurbs they work well, but otherwise people turn to sharing non-silo'd URLs.
The next-generation "bellhead" crowd -- with more security, tariffs on different transactions, and control of the network -- are the folks with big $$$ behind them today.
Thanks for the article, which suggests the debate is centuries old:
"Bellheads use phrases like "rate control" or "traffic shaping" when talking about ATM. Traffic is something to be tamed, sorted, and organized. Even though hundreds of different connections end up multiplexed on a single fiber-optic pipe, every ATM connection is treated as if it has its own, narrower pipe..
Netheads, on the other hand, cheerfully admit they have no idea what traffic will look like next month. It is easier, they say, to have the packets fight it out among themselves than to try to force some kind of order on them. Traffic is unpredictable, spontaneous, liable to jump up in strange patterns..
This philosophical divide should sound familiar. It's the difference between 19th-century scientists who believed in a clockwork universe, mechanistic and predictable, and those contemporary scientists who see everything as an ecosystem, riddled with complexity. It's the difference between central and distributed control, between mainframes and workstations."
> I'll be over here riding out the inevitable with my own file sync, git server, chat server, and web server, making my internet how I want it to be
To me, this is exactly why we're not headed for an internet-solely-comprised-of-walled-gardens scenario. There will be plenty for sure, but the pathways will still be open and we netizens can connect whatever we like with them. This is why I am still excited about it, even after 20 years.
Well, it would revert to the old-style "wild west" of the Internet, where people who didn't understand the services -- mass amounts of lay people -- would simply stop using it or stay in sandboxed app/web environments.
Heh, this made me think of it as a sort of "Internet gentrification". A new hip service/site starts getting into the mainstream, the lay people move in, it becomes adapted for the lay people, and what made it hip in the first place moves out and on to other things. Doesn't fit perfectly, but I think it's an interesting way to look at it.
This is something I've noticed for a long time. Many communities have this occurrence.
Usenet, 4chan, Reddit, World of Warcraft, the "internet" itself, etc.
Any networked service can fall victim to this pattern. The original community can quickly get displaced and drowned in the wake of rushing users, and the powers that be will respond to the will of the majority, thus betraying its original intent.
> The reign of AOL was killed by stagnation and outside innovation, by people seeing that there was more and better outside of AOL, and the second round will probably die a similar death.
But now things are very different. Instead of just one AOL silo, we have many different ones. And we're communicating about them and interacting in ways that we weren't before.
The barriers to entry are much lower, and our standards/expectations are higher/different than before.
AOL as a singular silo only lasted so long. Really, there were a few back then too: AIM (and ICQ), MSN, and Yahoo were probably the biggest, each had a good chunk of users who refused to switch to the others, and each one fell.
I also don't think we've hit AOL 2.0 quite yet, so we still have some time to see how it all works out.
The problem with IRC is that it now lacks features that a wider userbase expects, and that prevents its adoption.
Outside of our tech circle, there aren't many people who wouldn't be turned off by having to register an account by typing custom commands, or by chatting without avatar support.
It may seem silly, but I think many people are turned off by this.
Yeah, but that can easily be implemented in a client: have the client issue the commands to NickServ. Same with avatars: just put a URL to the avatar in your VERSION reply or something. All the features of e.g. Slack could be implemented client-side on top of IRC. I even run Slack via their IRC gateway. I handle file/image uploads with my tiny daemon running on http://paste.click, and it's easier (for me) with keybindings than dragging a file onto the screen, although it wouldn't be difficult to implement that either.
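For instance, a client could hide registration behind an ordinary sign-up form. A sketch, building on a send() function that writes raw IRC lines; the exact NickServ syntax varies between services packages (Anope, Atheme, ...):

    # Hypothetical client-side sugar: the user clicks "Sign up" / "Log in",
    # and the client speaks to NickServ for them.
    def register_account(send, password, email):
        send("PRIVMSG NickServ :REGISTER {} {}".format(password, email))

    def identify(send, password):
        # Run automatically on connect, like any app's saved login.
        send("PRIVMSG NickServ :IDENTIFY {}".format(password))

The user never sees a custom command; the client presents it like any other chat app's account flow.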
And it's not silly at all, I understand it, I just wish it wasn't the case. I don't like it but I get it and I don't expect that irc will ever hit the mainstream. Maybe something similar will in the future.
That can indeed be done, although I fear that many clients and servers would implement these additional features differently, thus losing all the good that comes from IRC being an open standard.
I don't know if it's been attempted before... but I think it would be interesting to see something open and community-centered like IRC, but with all these little features people now expect.
It might well be that the best idea going forward is actually to draw up a couple of RFCs covering NickServ and other bots (e.g. channel loggers and nickname registration: what are they called, how do they work -- e.g. /invite ChanLog -- /who #mychannel -> ChanLog in list -> indicates messages are archived for this channel -- that kind of thing), mandate TLS-only connections -- and then have clients implement on top of that.
Perhaps some RFCs dealing with SRV records (where is the web UI with channel logs?) -- or maybe even RFCs on how to mirror some functions (NickServ, logs) via REST APIs.
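The SRV part could look something like this -- note the _irclogs label is entirely made up here; pinning down a name like that is exactly what the RFC would be for (the sketch uses the third-party dnspython library):

    import dns.resolver  # pip install dnspython

    # Hypothetical convention: an SRV record advertising the web UI
    # that hosts channel logs for this network.
    answers = dns.resolver.resolve("_irclogs._tcp.irc.example.net", "SRV")
    for rr in answers:
        host = rr.target.to_text().rstrip(".")
        print("channel logs at https://{}:{}/".format(host, rr.port))

A client could then offer a "view logs" button without any per-network configuration.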
It's not critical at all, but it's something people have come to expect from an instant messaging program.
If our 'open' alternatives don't get those little details right, then it'll be harder for people to transition from whatever closed, proprietary service they're using to something that values their freedom.
> It's not critical at all, but it's something people have come to expect from an instant messaging program.
I still don't know what "it" is though.
> If our 'open' alternatives don't get those little details right, then it'll be harder for people to transition from whatever closed, proprietary service they're using to something that values their freedom.
I also doubt a feature-perfect clone would attract that many users. There needs to be some advantage to using the system, otherwise network effects prevail. (Yes, that's a bit oversimplified, but I think it's valid nonetheless.)
This is exactly why so many of us like IRC. It (mostly) separates the wheat from the chaff. Chances are you're not going to take the effort to get onto IRC to spend 30 seconds spouting a worthless opinion.
At the time, how much was really happening outside of AOL when it started to decline? My family and nearly everyone I knew had AOL in the beginning, but what caused everyone to start moving away from AOL was always-connected high-speed internet. The keyword content at the time was still more interactive than a lot of websites, but the websites were good enough.
> The rein of aol was killed by stagnation and outside innovation, people seeing that there's more and better outside of aol, and the second round will probably die a similar death.
AOL was also a US-only thing, so when the Internet started reaching other countries, they had no adoption.
Heck, living in South America, I first heard about what AOL even was less than 8 years ago!
> AOL was also a US-only thing, so when the Internet started reaching other countries, they had no adoption.
AOL was quite popular in Germany in the 90s and the beginning of the 00s. What killed AOL there was the emerging availability of fast internet connections (DSL, and later cable).
The Internet has been raised on the shoulders of giants - people who basically devoted their lives to it.
The current wave is driven by those trying to grab a piece of the Web and keep others at bay; the game is ruled by investors. This bunch only understands walled gardens; there's no hope of talking them into open thinking. At all.
We've already discussed this more than once: people themselves do not care about any particular approach, people just want something that works.
As long as commercial services do the job people will use them.
Like with any freedom movement, there should be an underlying philosophy that will live long enough for others to finally catch on.
But going at it head-on like the author suggests will not help. If you start a new service, the last thing you want to spend your resources on is an RFC. And you won't ever get more resources than the investors of AOL 2.0 have. Unfortunately.
> The current wave is driven by those trying to grab a piece of the Web and keep others at bay; the game is ruled by investors. This bunch only understands walled gardens; there's no hope of talking them into open thinking. At all.
Sadly, my experience has been that this is true. Walled gardens allow the companies that control them to capture all of the value created within them, which is far more attractive to a VC firm than a company that's looking to create an open standard/platform that others will be able to use.
It's the reason why, long-term, VC firms (like Y Combinator) and the US startup culture have to fail. All of them focus only on short-term, exponential growth. That is not sustainable.
The only question is if humanity will die due to the environmental effects of exponential growth first, or if these companies default first.
Most YC startups are simply providing their own small service at the edge, without trying to take over their users' lives and dominate the net: http://yclist.com/
Beware of judging phenomena by their news coverage in the mass media; it's often biased towards outliers.
The unfortunate (or maybe fortunate) part is that the data shows that occasionally, short-term exponential growth creates massive companies (e.g. Google and Facebook) with relatively minimal marketing spend. I remember when Facebook first became available at my university. Seemingly overnight, all of the LiveJournals died and Facebook absolutely took control.
Technically, it would not be too difficult to use Facebook's API and some scraping to connect it to gnusocial, for example. But it is explicitly against the ToS (of the API, and of Facebook itself) to do so, so you can't.
I would be really, really happy if lawmakers would realize that Facebook (just one example of many) has now become infrastructure, and has to be regulated as such. What to do? Make every service with more than, say, a million users required to offer federation. There must be an API, and everybody must be able to get their data in and out free of charge, and there should be no limitations (except for fighting spam).
This way, Facebook would lose the network effect that makes it impossible for a competitor to be successful. They would still have their users, rich features, mature codebase, and infrastructure as advantages, so this would not drive them out of business. But they would be forced to compete with others on user-friendliness, features, non-invasiveness, and so on.
Here's an idea. The next time you invent a nifty protocol for social, don't make it proprietary. Don't release the source under copyleft either. Make a license such as: "You may use this protocol / implementation free of charge in any way you like (source closed or open) under one condition: you have to offer federation."
(* Federation means that users with accounts on different services can communicate freely, like it is the case with email. Imagine you could use your Twitter account and write to a friend's Facebook wall. Or you could even host your own gnusocial instance and use it to take part in other social networks.)
> Make every service with more than, say, a million users required to offer federation. There must be an API, and everybody must be able to get their data in and out free of charge, and there should be no limitations (except for fighting spam).
I totally agree. Setting a minimum number of users would also help immensely with policing the system and with startups getting on their feet.
> "You may use this protocol / implementation free of charge in any way you like (source closed or open) under one condition: you have to offer federation."
I'm not working on a protocol, but I wonder if such a license already exists. I would contribute financially if someone were to hire a bunch of lawyers to write one.
I'm not certain if forcing businesses to comply with custom-tailored laws is more ethical than businesses forcing customers to comply with restrictions. Neither of these outcomes is an ideal solution.
Getting lawmakers involved is a very poor move, and only opens the door for more foul-play and market manipulation. At this point in time, why are we not content with no solution to this problem? Things take time to resolve - and we certainly don't need to be jumping the gun by getting lawmakers involved.
I would love the crap out of whoever managed to swing that. I'd never thought of it that way, actually, but you've hit the nail on the head as far as the intersection of network effect and therefore counting as infrastructure goes.
> when the last user of it finally gives up and moves to gmail so they can continue to communicate with their contacts or maybe they give up entirely
Between the title and the quoted, I think you have used up your hyperbole quota for one day.
I guess I don't see the issue. The internet is a large piece of infrastructure on which citizens and companies can publish (almost) anything they want. Lamenting that for-profit ventures have tried to wall off their parts is curious, as I am not sure how that impacts me or my decisions. I don't facebook, I don't tweet, and I could not care less that these things exist. (Not entirely true, but my objections to them would be off-topic.) The world is a very large place, and there will always be people who hang out in the more distant corners of the net, you can go be with like-minded people and talk about the good ole days of (insert bygone era here).
Now if you are lamenting this because you want a piece of the action and the big kids are being bullies, then I suppose my answer isn't going to comfort you any. But for just simple usage, I think this is a tempest in a teapot.
The issue is that anyone with a web browser (even one provided by the walled-garden companies) can use the HTTP protocol to see hypertext documents. But Facebook users aren't using a public protocol in Facebook Messenger, so they can't easily communicate with Twitter users, who also don't use a public protocol.
It's not the end of the world, but allowing more people to connect more easily is a good thing. The larger issue is that people are less likely to connect once they find their comfortable niche. You don't use Facebook or Twitter, but someone who does, and who would be interested in your thoughts, is less likely to stumble across you.
I think there are practical concerns too that push away from openness and towards silos - it isn't just about money.
Open protocols can be slow to improve because of the very nature of their openness - they have dependencies and need to remain interoperable.
Closed protocols can adapt faster because one entity controls the entire thing. In the case of chat, XMPP was in the lead for a while, but the mobile situation was terrible: connections were constantly dropped, and even though there were clients that supported multiple accounts (Meebo), it was a pretty bad software experience. The current messaging products are not open, but they are better.
This is a shame, I think, because it'd be a lot better if the open products were actually better, not just better because they are open.
Maybe the trick is to hack on a better open protocol, but even then it could similarly be outpaced. Maybe the path to this is defining an open 'social' protocol and fixing the internet identity issue at the same time. Maybe keybase.io can do this?
You don't care that Gmail seems to randomly mark some not-from-Gmail mail as spam? Which leverages Google/Gmail's market share to push people away from all other email services?
The key (IMO) to a distributed and decentralised Internet is net equality for upload:download speeds. There used to be an argument that we download more than we upload -- to phrase it another way, that we consume more content than we publish. The fact is, we individually produce more content now than we consume, much more than under the traditional media companies of the previous generation, as evidenced by the content on Facebook, Google/YouTube, Twitter, WordPress, et cetera. But the Internet of 2015 is an AOL-2.0-like one, and we need to nudge it in the direction of a distributed one again.
The solutions are within our grasp because they already exist. The software and the communication protocols exist, and are open and tested. A small change at the root, like net equality, would have an avalanche effect on how we communicate and store information. The Internet of 5 years from now could be vastly different (and better, in my opinion) than the one we have today. It could fulfil the 1990s predictions of people like Bill Gates and Marc Andreessen, who now sing a different tune, but who used to preach the power of a distributed Internet.
But what happens when my home internet connection goes out, or I don't have an always-on desktop machine?
That's also an inequality in upload/download, and it's far more important than just a bit of imbalance in total bandwidth. Especially when most of that download bandwidth is watching popular (and probably commercial) videos.
I'm not sure which side I'd put the typical "no servers" TOS, since if enough people ignored that it would probably go away.
> But what happens when my home internet connection goes out, or I don't have an always-on desktop machine?
The content can get distributed and hosted right after you publish it, ensuring access even if your machine goes offline. See Freenet, BitTorrent, etc.
There are extremely good reasons to cap consumer upload bandwidth. For example, the majority of DDoSes start from consumer machines, not servers: without capping, enormous DDoSes become trivial even without a botnet (and part of the reason enormous DDoSes are becoming more common is the relaxation of upload bandwidth restrictions -- let's not beat around the bush).
What an incredibly crappy reason. That's the net analogy of "but what about the terrorists". If such DDoSes are that much of a problem, the correct response is to have better mitigation and/or detection processes in place, not to limit normal use of the net.
By that reasoning we should disallow broadband entirely.
That's a band-aid on a broken limb. The real problem is that we have naive routing protocols that trust every packet received. The various BGP fiascoes have shown that we need to rewrite the routing protocols with an assumption of distrust. Until that happens, DDoS will always be the pain in the butt it's always been.
Are you familiar with DDoS attacks? Current mitigation strategies are literally "have more bandwidth than they do" and null routing. Given that null routing effectively means the attackers have accomplished their goal of bringing down a site, the only way to mitigate is to not allow DDoSers to acquire enormous amounts of bandwidth.
It's true that a lot of the biggest issues would be mitigated if ISPs would consistently perform packet ingress filtering, fix open DNS resolvers, etc. But to act like this is an easy infrastructure problem is to ignore the technical challenges inherent in the problem.
> That document would then be sent out to various parties that might have an interest in using this protocol who then would supply their feedback and as a result of that a more complete version of the original document would be released. And so on until the resulting protocol was considered mature enough for implementation.
Is that so? The early IETF process was, "Rough consensus and running code."
Protocols were implemented first, then revised and described in RFCs later, weren't they?
I blame firewall admins for this. If an app was using port 1234, it was simpler to just block that port. After a while this made apps unusable, so app creators tunneled everything through port 80.
And I blame lazy software developers who had security holes in their apps that endangered enterprise networks. I'm an app developer and we shouldn't be placing blame on someone trying to do their job and protect their internal network. We need to own up and start making security a priority over yet another feature someone wants in their app. Yes we were quite clever to tunnel things through port 80, but we're just delaying the inevitable.
I partially agree, but isn't the port blocking mostly about reducing the attack surface? If I were managing infrastructure (I never have), just trusting a browser and its security model sounds better than following a bunch of white-listed apps for zero-days.
In some ways, it may be worse than the OP suggests, as we apparently race towards mobile apps. Seems about the right time for PointCast+. It's the proverbial technology pendulum. It seems obvious that we'll continue our momentum, maybe even abandon the generic browser, and revert to the (now mobile-based) fat-client past.
Much of the article is devoted to concerns about proprietary protocols, and it closes with a plea to keep protocols open, with published specifications. I think this is a reasonable concern and a noble plea, but also an (unfortunately) unreasonable request. Companies are developing their protocols to support their needs, not yours.
It's reasonable to want access to the data companies are collecting. It's valuable data. But these companies are not in the data services business. And if they were, it would be reasonable for them to charge a handsome price for the access you are looking for. Imagine the black eye a company would get for trying that approach.
Perhaps there's a view that says users freely give the data, so they should be able to freely get it back. But there's a counter-argument that users are "compensated" for contributing their data in the form of free access to the services it provides. Given the amount of time people spend on social apps, it's reasonable to think they value the "free" service highly.
IIRC there was a start-up that attempted to build an alternative social network based on open standards... It didn't go very far. Perhaps people need to learn to vote with their actions. Maybe we need to try that again and convince more of the influential technical folks to jump ship. Maybe through competition the non-open established players can be shown the value of openness.
I suppose personally serving all of one's own data is the only reliable way for users to "freely give their data". As you say, one can't expect service providers to be their remote harddrive so they can get their data back. Service providers are in the business of presenting one's data through the lens and filters of their service.
Personally serving one's own data such that a service can lens and filter it would require an entirely new protocol. It would need to have the data properly secured so the parts of you that are an open book (cat pics, blog posts) are served freely, and secrets (bitcoin wallet, personal notes) are not.
For apps that shouldn't be centralized, processing related to your shared data would need to be distributed such that a service could be formed entirely from peer-to-peer user nodes. In a perfect world, no central organization could hoover up all the data and do their analytics on it/share with the repressive governments.
For availability, you'd want your data to replicate to many trusted peers so that you could always get at it. As if the world were one giant Cassandra ring, you'd always want to eventually reach consistent states.
It seems like there are too many hard computer science and networking problems for this to be a community effort (at least any time soon...)
I agree that user data is driving the "free" service of social media. But I don't see your point when you say an open social media implementation failed because it was open. You imply that Facebook succeeded since it was a closed protocol implementation.
Why did Facebook succeed? Would it have succeeded if it had been open? Perhaps market needs and the competitive scene have more to do with success than the nature of the protocol. What percentage of users even care about protocols?
As for the "failed because it was open" comment. Sorry, I didn't mean to suggest that it failed because it was open. Merely that it was open and failed (apparently maybe not so much failed).
I had thought it failed for the obvious reason: not attaining the critical mass that a social network needs. One of its primary propositions was an open data policy. Apparently this was not sufficient to launch it into success. Again, there's still something there, so maybe it's still rumbling on.
We're in a world of APIs and de facto protocols now. The bulk of the interesting communication at this point is about data. REST over HTTP provides a decent mechanism for interacting with arbitrary nouns. Anything significantly more structured would, I imagine, essentially be an exercise of modeling a particular domain of data. With rapid innovation, it becomes difficult to codify a particular data model ahead of time, so we end up in a world of RESTful APIs, some well-documented and some not-so-well-documented, and, when one emerges as the winner, it becomes the de facto protocol.
That said, the walled garden approach of all platforms today (especially Apple) certainly endangers future innovation. Though, I see that as a content delivery problem, not a protocol problem.
>we end up in a world of RESTful APIs, some well-documented and some not-so-well-documented, and, when one emerges as the winner, it becomes the de facto protocol.
Can you give an example of a proprietary RESTful API that then became a de facto standard protocol that was supported by a wide variety of software (both clients and servers)?
If I want to use twitter-the-thing, I have to use twitter.com-the-service. No other service offers decent Twitter. There are gnusocial.de and quitter.se, but their Twitter is not as feature-rich as twitter.com's, and twitter.com also refuses to federate with their accounts. That means you have no access to twitter.com's biggest asset, their user base, which is pretty rude if you ask me.
Just like I use Google and several other services that are simply too big now to get around. I used Twitter since long before they closed their offering, and I'm still pissed off about it. Using an open policy to attract developers and then closing up as soon as you've achieved critical mass is a very, very nasty strategy.
"Since then there has been a lot of movement on the web application front but none of those has resulted in an open protocol for more than one vendor (or open source projects) to implement."
I know of several RFCs that are in the process of being created/approved that specifically detail protocols over REST/JSON. Probably the most impressive is the System for Cross-domain Identity Management (SCIM) [1].
In general I agree with the sentiment of the OP: that we should have more standardized protocols if HTTP is going to be the new TCP. I should also mention that HTML is a data specification for pages transported via HTTP, so there's definitely precedent.
The Death and Life of Great American Cities by Jane Jacobs argues strongly that this pattern is a problem of segregation in almost all communities. The best superficial designs are static and proprietary. The best long-lived designs are open and engage with inevitable chaos at scale. High-end segregated suburbs eventually die, as people move to open, less segregated environments that provide more serendipitous opportunity.
It seems to be natural in complex networks that proprietary systems that don't serve the entire population function better, due to like-mindedness, conservation of resources by the wealthy, and bias.
Apple is like a private community for those who can afford it. Systems and resources can be allocated centrally and designed for a limited number of static requirements. It's great, until requirements change.
I would argue that Google should continue to support the open environment it is built on. Its software is inherently linkable (you can link to documents, G+, YouTube videos), it is mostly open to non-logged-in experiences, and it is free for anyone.
It comes at great cost to support those ideas, sometimes in the form of money and sometimes resources, but I think it's absolutely worthwhile.
Abstracting problems is not always the right solution, but for some reason (I think it's mostly a social effect) we seem to have an overwhelming desire to abstract away problems.
"XMPP support is slowly but surely being removed (just imagine a phone system where every number you call to may require a different telephone)"
So what happens in these kinds of cases is that somebody invents an abstraction (a metaprotocol) that cross-connects all of these. And then somebody comes up with yet another protocol that doesn't fit into the metaprotocol (or some slow-moving standards body can't fit the new one in), so somebody else comes up with a metametaprotocol that bundles the metaprotocol and the new one in... and it's abstractions all the way up.
Until somebody realizes that the tower of abstraction is introducing 300ms of lag into things and we all pine away for the good old days of just XMPP or whatever.
There's nothing that prevents technically oriented people from setting up FTP, NNTP, SMTP, MTP, etc. servers, except that these things tend to rely on some kind of software running on the user's OS, and these days people pretty much just run a browser and not much else.
The answer, then, is to move the functions those protocols supported into the browser (the way most browsers support FTP), but it's just not worth trying to shove telnet or whatever into the browser, so people just replicate whatever telnet is supposed to do in a web app that's accessed over HTTP.
More troubling, I think, is that non-browser server-to-server communication is now just HTTP. Sure, it's pretty simple, but for server-to-server communication there's almost always a better protocol to use; people just can't be bothered to come up with one.
The whole problem goes back to the early days of the web when everybody was focusing on making it easy to consume the content rather than making it easy to produce the content. To this date there is no application equivalent to Mosaic 1.0 that does not require either some 3rd party service or the capabilities of a systems administrator.
How hard could that be? And is it still possible (with the TOS of many providers now stating flat out that you can't run any servers on your line).
I was just thinking this morning: way back in the dial-up internet days, I used to go around helping people set up web pages on the free 5MB of space the company I worked for offered. A surprising number of our customers, even elderly ones, went out of their way to learn basic HTML and put something up there.
Over time we got GeoCities and various CMSs and such to fill that role, but the function of what most of those people were putting up is just their Facebook wall these days, only it's a bit more transitory.
I remember one old gentleman, a WW2 vet, who spent hours every day building a page about his dog so that he could share it with his friends and family. It was a bit odd, but my FB wall is full of similar kinds of things.
So I guess it also comes down to "what's the end goal the person is trying to achieve?" If it's just to keep people updated on their dog, FB more or less fills that function. Most people actually weren't interested in building random webpages; they were interested in communicating something to people.
A friend of mine said: people have an unlimited capacity for wanting to communicate. And I guess Facebook is the end result of following through on that desire. Which answers what will replace Facebook: something that makes it even easier to continuously communicate. Definitely not something federated, but something even more siloed.
Something that's frustrating with FB is that it is so temporary and continuous. This is fine for most day-to-day updates (news, gossip, etc.), but it's not great for long-term reference material. You might notice that people don't really write long FB posts; part of it is likely that it just isn't worth it, as it'll be off everybody's walls in the next 24 hours.
If FB added some kind of "archive this post" so people could save particularly great posts or conversations, it would fill this gap. But I have a feeling somebody else will get to it first.
I think you (finally!) nailed why Twitter is successful. Updates are short by nature; real content requires more space and has a much longer "best before" date, and so fits neither Facebook nor Twitter.
That is what Google+ should have aimed at, rather than doing the same thing all over again. The content would eventually achieve critical mass all by itself.
The massive asymmetry between producers and consumers is a repeated pattern across many domains regardless of technology.
-- of Wikipedia readers, less than 1% write articles
-- in web forums, most are lurkers and less than 1% write posts
-- movies: millions watch them, less than 1% produce their own
-- books: millions of readers, relatively few authors
-- cars: most drive them, very few build/rebuild them (as a hobby)
-- and so on.
Another example is blogging. Today, for a novice, setting up a blog and sharing one's thoughts is more friction-free than it was in the 1990s GeoCities days. However, the vast majority are not interested. It seems like an imbalance we'll always have.
So it's not like we conspired to have everyone be dumb consumers. Perhaps the internet evolved that way because it is a reflection of what we are.
I agree that it would be helpful if we had more open protocols but I disagree with the conclusion that not having them inevitably leads to AOL 2.0.
First, I think it's helpful to clarify what jacquesm is saying. Here is my interpretation, in the concrete terms of a URI[1]:
URI: <scheme>://<location>:<port>
As of 2015, the scheme is almost always "http" or "https" and the port is almost always "80" or "443".
Because of many factors (I'd say mostly social dynamics[2], not technical), companies end up layering their proprietary/opaque protocols on top of "http".
I believe the essay asserts that the web would be more "free" and "open" if we pursued more "schemes"[3].
For example, if you're starting a new company that lets people crowdshare cooking recipes, the 1980s mindset might have been to submit a new RFC so everyone could then do:
cooking://johndoe.com
(and default port for "cooking://" would be 867 or whatever)
Instead, we now have a situation with a cooking REST API or a cooking iPhone/Android app that sends proprietary, undocumented bytes over "http". Yes, "http" is open in an academic sense, but for practical purposes the cooking data is "closed", because the bytes over it are proprietary. Related to this is that the bytes go to a central entity with self-interested economic motives, instead of a peer-to-peer situation like "ftp://" or SMTP email.
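(Amusingly, the generic URI machinery already copes with made-up schemes just fine; what's missing is an RFC giving "cooking://" shared meaning and a default port. A quick illustration in Python, with a hypothetical recipe path:

    from urllib.parse import urlsplit

    parts = urlsplit("cooking://johndoe.com/recipes/42")
    print(parts.scheme)  # 'cooking'
    print(parts.netloc)  # 'johndoe.com'
    print(parts.path)    # '/recipes/42'

The plumbing is there; the shared, documented semantics are what nobody bothers to publish.)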
I don't have time at the moment to fully explain my disagreement, but I don't believe this leads to AOL 2.0.
Instead, what happens is that the proprietary protocols simply become more inefficient by "tunneling" or "layering" in or on top of "http". We collectively waste lots of HTML/MIME overhead bytes to send opaque data. We also expend a lot of security effort on cat-and-mouse "deep packet inspection" of "http" because of this. Lots of CPU cycles are burned to pay for these inefficiencies.
As for pushing for the ideal of more schemes with public and transparent RFCs, this is a goal that's couched in technical terms and it hides what we're really asking of each other: we are asking economic actors to forego their economic interests. That is a very tough sell.
For example, in the USA, there's Intuit Quicken and they have a proprietary transport for downloading financial transactions. I loathe Intuit and their business practices but I do understand why their protocol is opaque instead of being an RFC. They are the ones that did all the legwork of convincing the banks to open up their mainframes to facilitate data downloads. The government didn't grease the wheels; Intuit did. Now they want to be rewarded economically for it. Can't blame them. That's why it's not an open RFC.
Contrast Intuit's situation with the public RFCs for "ftp://" and SMTP. The people creating those protocols were economically secure. They had stable jobs at universities or government institutions.
Today, the opaque protocols are made by actors in a different economic landscape. If a YC2015 company takes $130k from Y Combinator and several million $$$ from VC rounds, defining a transparent RFC that harms their economic interests doesn't make sense. It goes against human nature. The essays pushing for a more open web are not addressing these hard-to-resolve economic interests.
[2] Corporate firewall and network appliance policies of restricting ports, users' home firewalls blocking ports by default, etc. New software can't expect to reverse the conservatism of Cisco admins. Also look at why residential SMTP mail servers are blocked by others even though SMTP is theoretically "open": that's a social problem caused by spam.
It's shocking that so few people in a forum that is basically dedicated to the business of information technology have such a blind spot to the economics of our industry. Yours is the only mention of the word "economics" out of many dozens of comments.
You'd think by now that we'd have a way to treat digital media as property that can be bought and sold on a marketplace. This price discovery would then make a great entry in to a book of accounts.
The current info-economic system mainly sells advertisements to pay for nothing more than servers and other operating costs. The content itself is worthless, not because it costs nothing to make or because consumers don't value it, but because we've purposefully kept it out of a marketplace.
These VC funded information technology companies only have one business model: create and exploit a private monopoly. Control identity. Authorize the use of any and all interactions with data.
I'm not going to discuss my thoughts how to fix this because my opinions on the matter are incredibly unpopular on HackerNews and frankly, I'm sick of feeling like crap after getting a bunch of silent downvotes after I've spent time writing something thoughtful.
Yes, I also think that HTTP-versus-other-transports is a red herring relative to the standard-versus-proprietary question. Raw protocols can be, and often are, proprietary; HTTP-based protocols can be, and often are, standard.
I think it's more a matter of correlation: the HTTP-ization of the net coincided with the rise of commercial endeavours taking over the position of universities and other research institutions in the development of these protocols, but it didn't cause it.
I've been saying the same thing, but I feel this goes too far into an alarmist view.
When Facebook, Google, and Apple hold all the cards, the creators -- the developers -- will move back out and create the 'old' Internet again. Then the people will follow to the new creations. Eventually we'll hit AOL 3.0 and the cycle will continue.
There are many reasons for this, but a big one only hinted at in this article is the firewall. Firewalls and NAT effectively break the Internet. They require centralization -- two devices cannot communicate without it. We've adopted a topology that forces centralization, and now we're surprised at the result.
I think the author makes a lot of good points, but misses a key point. We're not shuffling off into walled gardens because Google and Facebook want to "own" their users -- they do, of course, but users aren't just being herded like cattle by Facebook. Users are getting into walled gardens because walled gardens offer users a lot of advantages right now.
1) Internet users are becoming less technical as the number of Internet users increases, and they want things to just work. They don't want to know whether their mail server is POP or IMAP, they just want to send and receive e-mail. They don't want to have to try and figure out which IRC server to get on to get a decent ping or to avoid a netsplit, they just want to chat. Zero setup, zero installation, just go to a webapp and all those details are handled for you.
2) Internet users are becoming increasingly mobile, and most of our pre-HTTP Internet protocols scale poorly over mobile. Mobile devices are power and bandwidth constrained in ways that protocols didn't envision. Chat applications not written with mobile in mind are giant battery hogs.
3) The open web is a dumpster fire. SEO makes a lot of Google search results difficult if not impossible to wade through (try searching for information on a particular printer or lawnmower and see how many results you can find about anything other than someone who wants to sell you one). An article with pictures that measures under half a megabyte comes with four megabytes of ads and trackers. If you peek behind the spam filters, something like 90% of the e-mail anyone gets is best described as junk. Some of it's malicious junk at that. Most open forums quickly degenerate into a showcase of the worst humanity has to offer -- go check the comments on a newspaper or TV news website if you don't believe me.
4) Nobody can make any money except through advertising, and even that's becoming problematic. The problem with things like subscriptions is that they reduce the value of the hyperlink towards zero. And you can say, "for just the price of a cup of coffee you can keep this website open," but having to buy 50 to 100 cups of coffee every month to support all the websites you like to visit at some point during a month quickly becomes untenable. And so advertising comes to dominate the web in increasingly perverse ways (which explains about half of point three). Walled gardens are one way to offer a respite from the worst abuses of advertising -- Facebook doesn't need to send you 5mb of JavaScript to track you, they can do a much better job of tracking you with a lot less overhead than third-party trackers can.
If you want to fight walled gardens, you need to find ways to solve these problems that don't involve walled gardens, or to offer ordinary users (users who don't share this community's conviction that open is inherently better) benefits that a walled garden doesn't or can't.
In fact, the walled internet gardens remind me more than a little of the walled communities in Snow Crash. In that story, the USA has been 'Balkanized' along commercial lines, marginalizing the federal government in favor of security and stability.
Folks use the part of the internet that works for them. It's the largest crowdsourced community-building experiment of all time.
I remember AOL. It was a dial-up network with its own custom GUI that set up a TCP/IP stack, at a time when getting on the Internet with WINSOCK programs was difficult because you had to configure a SLIP or PPP profile with other ISPs.
AOL made it easy for anyone to get on the Internet; they mailed out floppy disks offering a free month, for example. The hard part was canceling the service and getting them to stop billing you once you stopped using it.
The AOL software grew and moved onto CD-ROMs. At least with floppies you could reformat them and use them for other stuff; the CD-ROMs became drink coasters, or people made wind mobiles out of them.
The software grew further when AOL merged with Time Warner; by then it was a web browser, media surfer, and email client rolled together.
When broadband began to take over, there was a $20/month option to add your AOL account to it and use it over broadband.
If you wanted AOL 2.0 today, you'd have to have a phone company or cable company bundle their broadband service with software that is part web browser, chat client, email reader, news reader, media surfer, stock ticker, and other stuff rolled into one app.
If Google or Apple wanted to do this, they would have to lay down fiber optics to provide broadband Internet access and then bundle their AOL 2.0 software with it. That is not a cheap thing to do. The dial-up modem had the advantage that you could plug it into any phone line and get Internet, but it was slow. Today you need a high-speed line and a DSL, cable, or fiber modem, and the company has to send someone to install the line and the modem to hook you up. That's not so easy to arrange these days: you have to wait for someone to be available and hope there are no technical delays.
AOL was killed by broadband Internet and Web 2.0 sites replacing what AOL software did.
Oddly enough, you still find the elderly using AOL dial-up for $35/month on really old Windows PCs. They don't want to lose their aol.com addresses.
I think one solution would be for internet providers to upgrade their offerings. They started by offering you an email box, then stopped there. Some offer storage space, and a few offer a blog platform, but they've all surrendered to Facebook.
Why wouldn't internet providers offer your own personal profile page and news feed? Instead of it being stored and owned by a company whose respect for private data and business model seem contradictory, let someone you already pay handle it.
Then we would need protocols again, because internet providers are numerous and don't need to dominate the world to be profitable.
Note: I can think of many other services that would be good candidates for an ISP to offer: YouTube, LinkedIn, photo sharing, etc.
I would not be excited about my ISP offering these extra services. I just want them to be dumb pipes: deliver my data without messing with it in any way. That should be their only business concern, and all innovation efforts should be spent on improving their network.
Because being a simple provider of infrastructure removes any added value from your product and opens you up to a price race to the bottom. Of course there will not be a slew of new ISPs, given the cost of the infrastructure, but it is never a good feeling to know that people can ditch your service and go elsewhere without any cost.
This is what happened to the mobile market and the PC market; it is currently (somewhat) happening in small-scale public transportation, and (at least in Europe) it has happened to food producers.
But there are several ISPs out there. Granted, it is a very hard market to get into, but nevertheless there is (most of the time) more than one. Personally, I'd like Internet infrastructure to be handled more like roads (i.e. the infrastructure is technically owned by the government but built by and rented to private companies).
The downside is that you're then locked in to your ISP. People used to be trapped with their ISP just to keep their email address. It was terrible. If they host your blog on their own domain name, you're again trapped with them.
I see what he's saying, and he makes some good points.
But I think he's too pessimistic. There are still some bright spots of open protocols on the Internet, even if they're now predominantly used through web browsers and/or HTTP.
I'll name two that are more recent than the last RFC referenced in the article, from 2009:
JMAP - the JSON Mail Access Protocol, here to save us from the dark ages of IMAP. http://jmap.io/
WebRTC - cross-browser, cross-platform voice and video chat without being locked in to a single provider like Skype. http://www.webrtc.org/
Heads up: this post is great, but it really needs an editorial once-over. For example:
"NNTP has been 'mostly dead' for years (though it still has some use the real replacement usenet for discussion purposes appears to be Reddit and mailinglists)" and so on.
Good post! Protocol handlers (aka chrome://settings/handlers) are really, really badly done or utilised. If they were done better, maybe we would see more interoperability between the silos.
Sandstorm[0] and projects like it are the future of the internet and the digitally connected world. Data silos, privacy problems, inability to interoperate, and data ownership will all collude to bring down the wall. At the same time, virtual compute providers and software like Sandstorm will become ubiquitous and intuitively easy to use. I, for one, am optimistic long-term.
ISTM this is just the end-to-end principle taken one step farther. Instead of intelligence residing in the endpoint node, now it resides in an app running on the endpoint node.
That isn't to say that the "aol2" phenomenon doesn't pose the risks to users described in TFA. I only observe that the same factors that encourage end-to-end (flexibility, reliability, loose coupling, etc.) probably encourage "aol2" as well.
No, it's a moving of the "ends" and insertion of a middleman. If I write some text and publish it to people, rather than sending it from my computer to theirs I send it to an intermediary, who reformats it and presents it to other users of that intermediary.
TFA is really talking about two things. I responded to the "layered on top of HTTP" point, while you seem to be talking about the "silos of end user data" point. Perhaps the first point contributes to the second, but I'm not convinced. Users could choose http apps that don't silo.
It strongly tends to, because having your own computer (especially a phone) set up to answer inbound HTTP is both administratively difficult (fiddling with NAT and dyndns, etc) and a security risk.
This is Schneier's argument about "security feudalism". And to some extent touched on by the article - people build on top of well-understood transport layers in order to reduce risks, and novel protocols are firewalled in case of novel risks.
I had a similar feeling the moment I started seeing browsers hide the protocol in the URL bar as much as possible. The protocol is important and I didn't like them munging the protocol and address together.
Calendaring is a great example. At least it's possible to get a unified view of my work Exchange calendar and my Google personal calendar, as well as manage both through the web, Siri, or any number of apps on every platform.
That's not possible without an open application protocol. I can't get a similar unified view of my Twitter and Instagram follows, for example.
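This is worth making concrete: because calendars speak an open format (iCalendar, RFC 5545, usually exported as .ics and often served over CalDAV), any client can merge feeds from competing providers. A rough sketch, with placeholder URLs standing in for real exported feeds:

    from urllib.request import urlopen

    # Two feeds from two unrelated providers, one unified view.
    FEEDS = [
        "https://calendar.example.com/work.ics",
        "https://calendar.example.org/personal.ics",
    ]

    events = []
    for url in FEEDS:
        text = urlopen(url).read().decode("utf-8", errors="replace")
        for line in text.splitlines():
            # Naive parse; real clients use a proper iCalendar library.
            if line.startswith("SUMMARY:"):
                events.append(line[len("SUMMARY:"):])

    print("\n".join(sorted(events)))

Nothing like this is possible for Twitter and Instagram follows, because there is no open format for a third party to merge.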
Or maybe you expect Twitter to write an RFC about its API, which for obvious economic and strategic reasons they don't want to open? WhatsApp, maybe? How would it benefit them to have 3rd-party clients they couldn't control? Or to have to justify or change their protocol decisions in an RFC?
I don't understand; I'm genuinely confused. This article sounds very naïve, very delusional, so probably I am just misunderstanding it.
Remember Twitter and the discussions when they nuked third-party clients?
And yes, messengers not working with other clients is one of the most user-unfriendly things they do, and something many users consider a bug. That really only took off with the rise of mobile messengers; before that, most could be interfaced from different clients.
> And yes, messengers not working with other clients is one of the most user-unfriendly things they do, and something many users consider a bug. That really only took off with the rise of mobile messengers; before that, most could be interfaced from different clients.
I think that's somewhat misleading. MSN Messenger, Yahoo!, etc. didn't use standard protocols; it's just that people would reverse-engineer them. Third-party clients would just stop working from time to time, whenever the providers changed something in the protocol.
They're still around - WhatsApp has had multiple unofficial clients pop up, for example, though the company is much more heavy-handed in taking them down (DMCA takedowns against libraries that use the API, banning users of unofficial apps, etc).
This is the entire point. I wish communications happened over an open protocol instead of over Twitter's protocol, not because it benefits Twitter, but because it benefits me.
Email is open. For some reason people think of email as different. Gmail could've provided the exact same conditions and experience when they launched but only enabled Gmail-to-Gmail communication. It would never have gotten to where it is. Never.
And there's obviously no technical reason instant messaging, social networking, etc. should work any differently, but for some reason they're perceived as different.
It may be naive, but the reality is that it wasn't that uncommon. Before WhatsApp, GTalk did support a standard protocol. Mail providers all use standard protocols. RSS used to be "the future".
I would say that GTalk only supported standard protocols at the start in order to gain traction. Email is probably too widespread and decentralised to be handled differently.