Q: Could/should the internet be converted to a P2P network model? By this I mean individual web pages being seeded/"leeched" as needed, instead of each browser downloading them from a single source. How would this be accomplished? Would it have any advantages or disadvantages?
A: I can't answer the question, but I'd like your help with it.
Firstly, forgive my lack of knowledge on this question. I'm a mechanical engineer by trade, but I've recently seen the tides in my career opportunities shifting more and more toward the tech end of the spectrum. While I'm not too worried about my job right now, I'd be appalled if I became obsolete, which is why I'm currently putting myself through the pain of acquiring computing skills, just in case. So treat this as a teaching exercise: I'm just looking for knowledge, and hopefully someone here can enlighten me and point me in the right direction for more information. This seems like one of the more intelligent/mature places I've come across on the web, so I hope this question is as "deeply interesting" for you as it is for me.
In my line of work, if we have an excellent solution to a problem (here the problem is data transfer, and the solution, at least for large clumps of data, is P2P file sharing), we go to that solution first and see whether it is applicable to the task at hand. Torrents seem to be the fastest/best way to handle large files, so why couldn't they handle websites themselves?
Again, I'm quite ignorant of the actual inner workings of the protocols that run the web and organize its traffic, so you'll have to forgive me if my questions betray that ignorance. Links to articles, research, textbooks, etc., related to my question would be greatly appreciated, and if you believe this question is worthy of discussion, please add to that as well. Thank you.
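For what it's worth, the core trick that makes torrents work for large files is simple enough to sketch in a few lines: the file is split into fixed-size pieces, each piece is hashed, and a downloader who knows the list of hashes can fetch pieces from any untrusted peer and verify each one on arrival. This is a toy illustration of that idea, not real BitTorrent code (the piece size and page content are made up for the demo):

```python
import hashlib

PIECE_SIZE = 4  # tiny for demonstration; real torrents use 256 KiB to several MiB

def piece_hashes(data: bytes, piece_size: int = PIECE_SIZE) -> list[str]:
    """Split data into fixed-size pieces and hash each one.

    A downloader that knows these hashes can fetch pieces from any
    untrusted peer and verify each piece independently on arrival.
    """
    return [
        hashlib.sha1(data[i:i + piece_size]).hexdigest()
        for i in range(0, len(data), piece_size)
    ]

def verify_piece(piece: bytes, expected: str) -> bool:
    """Check a piece received from a peer against its expected hash."""
    return hashlib.sha1(piece).hexdigest() == expected

page = b"<html>hello</html>"
hashes = piece_hashes(page)

# A genuine piece verifies; a forged one does not.
assert verify_piece(page[0:PIECE_SIZE], hashes[0])
assert not verify_piece(b"evil", hashes[0])
```

Because verification is per piece, different pieces can come from different peers simultaneously, which is where the speed of torrents for large files comes from.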
A: Most sites these days use some sort of CDN, which probably gets the data to you more efficiently than a P2P network would: a CDN's edge servers are always online, sit geographically close to users, and have fast, well-provisioned links, whereas peers come and go and typically have slow upload bandwidth.
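There is also a structural mismatch worth seeing concretely. P2P sharing works by content addressing: the "address" of a blob is the hash of its bytes, so anyone holding the bytes can serve them verifiably. A toy sketch (hypothetical code, not any real protocol) shows why that suits immutable files but fits awkwardly with a web page that changes constantly:

```python
import hashlib

# Toy content-addressed store: the address of a page is the hash of
# its bytes, so any peer holding the bytes can serve them, and the
# fetcher can verify them without trusting the peer.
store: dict[str, bytes] = {}

def put(data: bytes) -> str:
    """Store data under its own hash and return that address."""
    addr = hashlib.sha256(data).hexdigest()
    store[addr] = data
    return addr

def get(addr: str) -> bytes:
    """Fetch data and recompute its hash before trusting it."""
    data = store[addr]
    assert hashlib.sha256(data).hexdigest() == addr
    return data

v1 = put(b"<html>headline: sunny</html>")
v2 = put(b"<html>headline: rain</html>")

# Every edit to the page yields a brand-new address...
assert v1 != v2
assert get(v1) == b"<html>headline: sunny</html>"
# ...so a stable, mutable URL like example.com/front-page needs an
# extra name-to-hash mapping layer on top, which is exactly what
# P2P systems bolt on (e.g. trackers and magnet links in BitTorrent).
```

That extra naming layer, plus dynamic and personalized pages that differ per visitor, is a large part of why P2P distribution has stayed a tool for big static files rather than a replacement for the web's client-server model.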