
Abstracting problems is not always the right solution, but for some reason (I think it's mostly a social effect) people seem to have an overwhelming desire to abstract away problems.

"XMPP support is slowly but surely being removed (just imagine a phone system where every number you call to may require a different telephone)"

So what happens in these kinds of cases is that somebody invents an abstraction (a metaprotocol) that cross-connects all of these. And then somebody comes up with yet another protocol that doesn't fit into the metaprotocol (or some slow-moving standards body can't fit the new one in), so somebody else comes up with a metametaprotocol that bundles the metaprotocol and the new one in... and it's abstractions all the way up.

Until somebody realizes that the tower of abstractions is introducing 300ms of lag into things, and we all pine for the good old days of just XMPP or whatever.

There's nothing preventing technically oriented people from setting up ftp, nntp, smtp, mtp, etc. servers, except that these things tend to rely on some kind of client software running on the user's OS, and these days people pretty much just run a browser and not much else.

The answer, then, is to move the functions those protocols supported into the browser (the way most browsers support ftp), but it's just not worth trying to shove telnet or whatever into the browser, so people replicate whatever telnet is supposed to do in a web app that's accessed over http.

More troubling, I think, is that non-browser server-to-server communication is now just http. Sure, it's pretty simple, but for server-to-server communication there's almost always a better protocol to use; people just can't be bothered to come up with one.




The whole problem goes back to the early days of the web, when everybody was focused on making it easy to consume content rather than making it easy to produce it. To this day there is no equivalent of Mosaic 1.0 for producing content that does not require either some 3rd-party service or the capabilities of a systems administrator.

How hard could that be? And is it still possible, with the TOS of many providers now stating flat out that you can't run any servers on your line?


I was just thinking this morning that, way back in the dial-up internet days, I used to go around helping people set up web pages on the free 5MB of space the company I worked for offered. A surprising number of our customers, even elderly ones, went out of their way to learn basic html and put something up there.

Over time we got Geocities and various CMSs and such to fill that role, but these days the function of what most of those people were putting up is served by their Facebook wall, only it's a bit more transitory.

I remember one old gentleman, a WW2 vet, who spent hours every day building a page about his dog so that he could share it with his friends and family. It was a bit odd, but my FB wall is full of similar kinds of things.

So I guess it also comes down to "what's the end goal the person is trying to achieve?" If it's just to keep people updated on their dog, FB more or less fills that function. Most people weren't actually interested in building random web pages; they were interested in communicating something to people.


A friend of mine said that people have an unlimited capacity for wanting to communicate, and I guess Facebook is the end result of following through on that desire. Which answers what will replace Facebook: something that makes it even easier to communicate continuously. Definitely not something federated, but something even more siloed.


Something that's frustrating about FB is that it is so temporary and continuous. That's fine for most day-to-day updates (news, gossip, etc.), but it's not great for long-term reference material. You might notice that people don't really write long FB posts; part of it is likely that it just isn't worth it, as the post will be off everybody's walls within the next 24 hours.

If FB added some kind of "archive this post" so people could save particularly great posts or conversations, it would fill this gap. But I have a feeling somebody else will get to it first.


I think you (finally!) nailed why Twitter is successful. Updates are short by nature; real content requires more space and has a much longer "best before" date, and so fits neither Facebook nor Twitter.

That is what Google+ should have aimed at, rather than doing the same thing all over again. The content would eventually achieve critical mass all by itself.


Maybe the world is ready for Google Wave now.


>How hard could that be?

The technology wouldn't really be the relevant factor.

The massive asymmetry between producers and consumers is a repeated pattern across many domains regardless of technology.

-- Wikipedia: of its readers, less than 1% write articles

-- web forums: most users are lurkers, less than 1% write posts

-- movies: millions watch them, less than 1% produce their own

-- books: millions of readers, relatively few authors

-- cars: most people drive them, very few build/rebuild them (as a hobby)

-- and so on

Another example is blogging. Today there is far less friction for a novice setting up a blog and sharing their thoughts than there was in the 1990s Geocities days. However, the vast majority are not interested. It seems like an imbalance we'll always have.

So it's not like we conspired to have everyone be dumb consumers. Perhaps the internet evolved that way because it is a reflection of what we are.



