> P2P never made it to the mainstream; instead, centralized systems still (or again? even more?) dominate the landscape. So, since P2P failed, should we try Web3? I doubt it: Web3 is even more likely to fail, given its higher complexity and greater number of dependencies.
One major reason decentralization has a hard time is that open standards and public protocols cannot be iterated on rapidly. Or at least, new versions cannot be deployed universally very quickly. So centralized / proprietary channels rise to the top because new ideas can find expression in them more rapidly.
Those "open standards" are often dictated by the whims of giants like Google. That's fine when the interests of buisnesses align with their clients, but a recent one that comes to mind is FLoC
Edit: I think this is why we have seen the emergence of standards like WebRTC, WebGL, etc. that enable "web apps", for better or for worse.
I wouldn't say that P2P never became mainstream. BitTorrent is still huge and a great way to share media with each other. The reason it's not mainstream is that without centralisation, you can't enforce copyright and DRM. If sharing movies with friends were legal, P2P platforms would sprout like mushrooms after rain.
I personally think, perhaps foolishly, that if we decriminalise nonprofit filesharing and at the same time make supporting creators easy and not tied to buying the media, we would have a better system in place. I would love to see some of the smaller creators on Patreon experiment with giving every patron a transferable license to reproduce their work, as long as attribution is given.
I base my assumption on the fact that piracy has been a constant presence for the last 20 years and the entertainment industry doesn't seem to have suffered one bit. In fact, every time they try to kill piracy, all they end up getting is negative PR.
I don't think that's right. Using decentralized protocols doesn't necessarily imply decentralized authority over them. Skype used to be P2P, and the mothership would still push updates to end users to ensure everyone was using the correct protocol. Decentralized protocols do create some technical challenges, but they also alleviate others (scaling, for example).
Even when that's not the case, as with BitTorrent, there's tons of innovation. The DHT protocol, for example, was a game changer.
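For the curious: BitTorrent's Mainline DHT is based on Kademlia, whose core idea is an XOR distance metric over 160-bit IDs. A minimal sketch of that metric below, for illustration only; this is not the wire protocol, and the seed values are made up:

```python
# Minimal sketch of the XOR distance metric at the heart of Kademlia,
# the design BitTorrent's Mainline DHT is based on. Node and info-hash
# IDs are 160-bit values; "closeness" is their bitwise XOR interpreted
# as an integer.
import hashlib

def node_id(seed: bytes) -> int:
    """Derive a 160-bit ID, as DHT nodes do with SHA-1."""
    return int.from_bytes(hashlib.sha1(seed).digest(), "big")

def xor_distance(a: int, b: int) -> int:
    """Kademlia distance: a lower XOR value means 'closer'."""
    return a ^ b

# A node looking up an info-hash queries the peers it knows about,
# keeping only the ones closest to the target at each step.
target = node_id(b"some-torrent-info")  # hypothetical lookup target
peers = [node_id(bytes([i])) for i in range(8)]
closest = sorted(peers, key=lambda p: xor_distance(p, target))[:3]
print([hex(p)[:10] for p in closest])
```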
You’re missing the forest for the trees… Skype is a corporate product run and owned by a single entity that is free to change their tech however they please. Something like email, a federated and distributed communication protocol, is incredibly slow to change because the stakeholders are numerous and the protocol is entrenched.
You point to DHT as a game changer, and yet that protocol was invented in 2005 and took over a decade to hit mainstream success and hasn’t changed much in the past decade. New protocols can be invented at any time but getting them adopted is a Herculean task with little monetary profit and updating said protocol in any innovative way is nigh impossible.
It is my (largely uninformed) opinion that the greatest barrier to an open web is the difficulty of securing a stable address for a device, one that stays the same no matter which gateway it is currently connected through. So much of what third-party web services actually do is simply offer a static address for routing traffic.
... huh. I had not thought of it that way, but you're absolutely right.
Not even a DNS thing. The entire concept of third-party messaging platforms exists because we can't just send data directly to another internet user unless they've done all the requisite setup to reliably receive it.
Yes exactly! My yardstick for the "spiritual" health of the internet is how easily I can have two devices across the internet communicate without a service in the middle. It shouldn't be difficult in theory.
I _can_ send a file from my phone to a friend's phone via FTP, for example, but only if both are on WiFi, have port forwarding turned on on the router, and either have the most up-to-date dynamic IP stored, are using dynamic DNS, or have a static IP allocated from their ISP, which is less common, at least here in the UK.
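To make that "requisite setup" concrete, here's a minimal sketch of a direct transfer over raw TCP, assuming the receiver has already forwarded the port on their router and the sender can reach them at a known address. The dynamic-DNS hostname and filenames are hypothetical placeholders:

```python
# Minimal sketch of a direct peer-to-peer file transfer over TCP.
# Assumes the receiver has forwarded port 9000 through NAT and the
# sender knows the receiver's current public address (e.g. via a
# dynamic DNS name). Without that setup, the connection simply fails.
import socket

def receive(port: int = 9000) -> None:
    """Run on the receiving device; accepts one incoming transfer."""
    with socket.create_server(("", port)) as srv:
        conn, _addr = srv.accept()
        with conn, open("received.bin", "wb") as f:
            while chunk := conn.recv(65536):
                f.write(chunk)

def send(host: str, path: str, port: int = 9000) -> None:
    """Run on the sending device, pointed at the receiver's address."""
    with socket.create_connection((host, port)) as s, open(path, "rb") as f:
        while chunk := f.read(65536):
            s.sendall(chunk)

# receive()                                      # on the friend's phone first
# send("my-friend.duckdns.org", "holiday.mp4")   # then on yours
```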
It depends on what decentralization layer you're talking about. Technically, all these centralized / proprietary channels rose from the underlying net-neutral, decentralized Web.
I think higher-layer "decentralized" protocols may allow a lot of people and organizations to compete, but a similar winner-takes-all network effect will likely take hold, given the fundamentally skewed distributions that networks produce.
The marginally better content or service wins out exponentially, since it gains value as more people in the network consume or interact with it. It's why a small percentage of YouTubers, Twitch streamers, bloggers, and app-store apps take home a hugely disproportionate share of revenue on these services. The same would be true for "decentralized" platforms. Maybe more so, since there would be no organization to build, invest in, and support the farm league.
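As a toy illustration of that dynamic, here's a small preferential-attachment simulation. All numbers are invented; the point is the shape of the outcome, not the specific values:

```python
# Toy simulation of the winner-takes-all dynamic described above:
# each new viewer picks a creator with probability proportional to
# that creator's current audience times a small quality factor, so a
# marginal quality edge compounds into a disproportionate share.
import random

creators = 100
quality = [1.0] * creators
quality[0] = 1.05          # one creator is marginally better
audience = [1] * creators  # seed audience so weights start nonzero

for _ in range(100_000):
    weights = [a * q for a, q in zip(audience, quality)]
    winner = random.choices(range(creators), weights=weights)[0]
    audience[winner] += 1

share = max(audience) / sum(audience)
print(f"Top creator's share of all viewers: {share:.0%}")
```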
I still believe the P2P version of Skype was better. Centralized Skype may have more features, but the P2P version was better at the core functionality of video and audio calling. It was also more private and secure, since your calls were not routed through a central server.
The change to centralized servers was made because more and more Skype users (P2P nodes) were running on bandwidth- and processor-constrained mobile devices.
It also just so happened to align with the acquisition by Microsoft and resulting support for “lawful intercept”.