World’s Oldest Surviving Torrent Still Alive After 15 Years (torrentfreak.com)
533 points by lainon on Sept 29, 2018 | 203 comments



It's astounding to me that BitTorrent isn't used more often for media distribution. There are entire industries built around distributing content from huge datacenters, and meanwhile the economics of distributing something through BitTorrent are exactly the opposite of how things traditionally work - content is easier and cheaper to send the more people are interested in it!


It’s mostly a privacy issue. For example, if Netflix used P2P, you’d be able to track which IPs had parts of which movies and start profiling Netflix customers. That would be bad.

The problem with P2P is that you have to make what you have public. That’s fine for a game update where it’s all the same, but for movies there are huge privacy implications.


The issue with using it for Netflix isn't privacy, it's that bittorrent is simply not any good for streaming video content when compared with current technologies already used by streaming broadcasters.

* It's slower to deliver video since you're reliant on upload connections of end users

* It's less reliable (same reason as above)

* You have no guarantee that chunks will arrive in the right order to actually stream data (ie the end of the file might download first)

* You cannot send multiple different video qualities in the same stream and have the client dynamically pick the right bitrate for their connection (this is what current streaming services do and why you often see Netflix / Prime Video / etc switch quality mid-stream without having to restart that stream. I can write more on this if people are interested)

* It's harder to debug network problems if you are experiencing issues with video quality (been in enough stressful emergency meetings with network guys over the years - I can't imagine how much worse it might have been if we couldn't do a full end to end trace of the delivery)

* Time to start playing a new stream is longer (which end users might care about)

* It couldn't support live services where the video data is being generated near real time

The current methods for video delivery are actually really good, bittorrent would be a major step backwards. However for delivering other kinds of assets - such as patches to computer games - protocols conceptually similar to bittorrent are used.


> It's slower to deliver video since you're reliant on upload connections of end users

Doesn't really matter, it is fast enough. If it isn't you back it up from a datacenter.

> It's less reliable (same reason as above)

Shouldn't be any real difference in reliability.

> You cannot send multiple different video qualities in the same stream and have the client dynamically pick the right bitrate for their connection (this is what current streaming services do and why you often see Netflix / Prime video / etc switch quality mid-stream without having to restart that stream. I can write more on this if people are interested)

Sure you can. You only need a small map between play time and file offset for the different streams; the client will then pick whatever it wants.
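
Roughly what I mean, as a sketch (Python; all numbers are made up, and the index could ship as a tiny metadata file alongside the torrent):

  import bisect

  # Hypothetical index: per quality level, (play time in seconds, byte offset).
  index = {
      "480p":  [(0, 0), (15, 1_200_000), (30, 2_500_000)],
      "1080p": [(0, 0), (15, 5_600_000), (30, 11_300_000)],
  }

  def byte_offset(quality, t):
      # Map a playback time to a byte offset in that quality's file.
      times = [s for s, _ in index[quality]]
      return index[quality][bisect.bisect_right(times, t) - 1][1]

  # Switching quality mid-stream is just requesting pieces around
  # byte_offset("480p", t) instead of byte_offset("1080p", t).
  print(byte_offset("1080p", 20))  # -> 5600000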

> You have no guarantee that chunks will arrive in the right order to actually stream data (ie the end of the file might download first)

The client can decide this. There are torrent clients that do this already. Buffer 2 minutes; if some chunk is still missing when 20 seconds remain, pull it from a datacenter instead.
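
As a sketch of that policy (Python; the margin and field names are invented):

  from dataclasses import dataclass

  @dataclass
  class Chunk:
      play_time: float   # stream second at which this chunk is needed
      downloaded: bool

  def source_for(chunk, playhead_s, swarm_eta_s, margin_s=20):
      # Use the swarm while there's slack; inside the margin, hit the CDN.
      if chunk.downloaded:
          return "buffer"
      time_left = chunk.play_time - playhead_s
      if swarm_eta_s < time_left - margin_s:
          return "swarm"  # peers can still deliver it in time
      return "cdn"        # cutting it close, pull from the datacenter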

> Time to start playing a new stream is longer (which end users might care about)

No, you start streaming from the datacenter.

> It couldn't support live services where the video data is being generated near real time

Not sure, but if a minute is acceptable delay (depends on what is being broadcasted) it should be feasible. A live webcam for a tourist resort should be fine, a sports event, maybe not.

Spotify even used to work this way. When a user clicked play (or seeked to a different part of the song) the first (if I remember correctly) 15 seconds was fetched from their CDN. After that it used its own torrent-like system to continue and pre-fetch the next song.


> Doesn't really matter, it is fast enough. If it isn't you back it up from a datacenter.

Sorry but no, it really isn't. You'd have to rely so much on your own data centre that you'd lose the benefit of bittorrent. Plus many services use a commercial CDN, none of which currently support bittorrent. So you'd end up having to build your own infrastructure there, which would be a great deal more expensive.

> Shouldn't be any real difference in reliability.

Home connections might drop off due to power cuts, router failure or any of the other numerous conditions that datacenters battle against. Home connections might get throttled by ISPs.

Where I work we offer 5 "9s" of reliability on some services, good luck asking consumer broadband to offer the same ;)

> The client can decide this. There are torrent clients that do this already. Buffer 2 minutes, if some chunk is missing when 20 seconds remains pull it from a datacenter instead.

2 minutes is an excessively long buffer compared to how long most RTMP segments are (often 15 seconds or below).

> No, you start streaming from the datacenter.

So then why bother with bittorrent at all?

> Not sure, but if a minute is acceptable delay (depends on what is being broadcasted) it should be feasible. A live webcam for a tourist resort should be fine, a sports event, maybe not.

A minute isn't even close to an acceptable delay. Imagine if you're watching a sports match and Twitter is ahead of your video feed; you wouldn't be grateful for the spoilers. This is a real-world problem with video streaming and why they talk about getting the latency down to 5 seconds or less.

> Spotify even used to work this way. When a user clicked play (or seeked to a different part of the song) the first (if I remember correctly) 15 seconds was fetched from their CDN. After that it used its own torrent-like system to continue and pre-fetch the next song.

Spotify is audio only. People love to compare video streaming to audio streaming but they don't realise that HD video is an order of magnitude more complex to stream - in terms of bandwidth, syncing, dropped frames, etc.

It's one of those problems that seems easy to solve from a superficial level but when you start getting to the broadcaster level it's actually a great deal more complex than even Plex and other self-hosted streaming services would lead you to believe. (Disclaimer, I've worked at the broadcaster level)


I just can't address this further. I'm not convinced you have the slightest clue what bittorrent is (how on earth does not having 5 9s matter for a consumer connection? If a node drops, no one will notice).

I'm not saying there won't be problems. A major problem is the asymmetric nature of many consumer connections. Not only is the upload often a fraction of the download (that is the easy part), but the download speed can be greatly sacrificed if upload is utilized. Add to that issues that home users might want to use the connection for other stuff.

The nightmare and confusion surrounding "I can't game because someone is watching torrent-tube" will be real and add to that issues with ISPs that have a fixed quota or people on mobile (or tethering a mobile connection to laptop). Netflix et al would not like to deal with that kind of FUD spreading around.

And all this is before considering local ISP bottlenecks as it isn't what the network was designed for. A vastly superior option is to put a proxy on the ISP network itself (but yes, hard to do for small players).

Those alone are probably a magnitude worse than the technical issues you speak of. Even Spotify got much flak for it, and no, I don't think anyone on earth thinks the bandwidth requirements of audio and HD video are similar.


I actually did cover that point (albeit at a much higher level) and of course I know what bittorrent is. Just because I don't agree with you it doesn't mean I don't understand the problems. Quite the opposite, I happen to work in the video delivery industry so it's my job to understand these problems.

Like I said, it's ever so easy being an armchair critic but try deploying this stuff at scale with SLAs covering 5 "9s" and paying customers then tell me you've got all the kinks in bittorrent solved ;)


Arguably, no, you contrive issues that don't make any sense. Such as presenting the issue that the end of the file might be downloaded first. Or that you can't have different quality versions and dynamically switch.

The end will only be downloaded if the client requests it, which won't be unless the user is watching the credits...


The client matters though. Even just getting that client onto your end users' systems is going to be problematic. And even those bullet points aside, I raised other issues you're yet to reconcile. Plus there are other problems I'd not considered (eg RTMP is often sent over UDP, and bittorrent is not only TCP but has additional CRC checks that you don't really want with video streaming).

But as you're clearly an expert in video delivery you should build a torrent based video delivery platform. You could make a killing. Or alternatively you might discover these opinions you were dismissing might have had a point, and that video delivery is a lot harder than you first assumed.


Like Popcorn Time?


I know it can be done via bittorrent. Anything can be hacked to do anything if you're motivated enough. However the question asked was whether it's good enough for paying customers in a commercial environment (ie Netflix). That's a very different set of expectations.


Popcorn Time, when I looked at it, consistently gave me performance at least as good as Netflix (which tends to downgrade quality for bad reasons - I have very fast fiber and my ISP has good peering - and has a client that can't seem to stop desyncing audio). The only reason not to use it is arguable legality and the fact that content creators deserve money. It's definitely not quality of service.


If you're on a really fast internet connection where your ISP doesn't throttle P2P (many do in the UK) then I wouldn't be surprised that you'd get good results from Popcorn Time. The issue is whether that is consistently true for people on slower internet connections, connections with QoS or other traffic shaping methods against P2P, and those who stream on lower end hardware (old laptops / phones) or embedded devices (Amazon Firestick or smart TVs).

This is where RTMP streams really shine - they offer the performance (at the client level) and flexibility (quick swapping between bitrates within 15 seconds of chunked video) to maximize the video quality per client.

As for why Netflix doesn't look too great on your connection, I can only speculate, but the reasons might be more down to congestion on your local Netflix cache server or your ISP itself than RTMP streams being inferior to BitTorrent.


I have a good provider that gives me what I pay for, yes. I understand that many countries have a borderline third-world internet situation, but that has very little to do with the problem you were presenting. What you're talking about here is essentially that Netflix has the lobbying power to make sure that its services are put above other services in countries vulnerable to that kind of manipulation. I believe that instantly.

I can understand what you're talking about, but I think it's important to appreciate that as an end user I don't really care about what tech words I can tell myself about why things look bad. I care that one thing looks bad and the other does not.


I don't think you understand my point. I'm not just talking about "borderline third-world internet" situations. I'm talking about watching stuff on embedded devices, on public WiFi, on older hardware, or on your cell phone while out and about. I'm also not talking about Netflix's lobbying power.

My point is quite simply that RTMP is already a better protocol for video delivery than BT.

Yes, Popcorn Time exists, and in some situations it is comparable to Netflix, but not in all situations. In fact not even in most situations.

BT wouldn't allow you to deliver to other providers such as YouTube or Twitch (this is something I was working on last week with RTMP streams). BT wouldn't allow you to dynamically inject ads separately from the video feed (meaning you'd need to encode your adverts into the video file, so you couldn't charge different rates for different days or viewing times - which is a deal breaker for most broadcasters). BT wouldn't work for live feeds (so it would be useless for sporting events - which generate a large chunk of broadcasting revenue). And BT is noisier than RTMP at the ISP level (ie it actually costs ISPs more bandwidth, not less - and given how much many ISPs are already complaining about Netflix et al, the last thing you need is to expect those ISPs to offer Netflix free bandwidth in the form of user seeding).

As you can see the point I was making right from the start isn't that you cannot stream video via BT but rather that there are already better protocols for video delivery than BitTorrent. Hence why professional video delivery platforms don't use BT.

I suspect the issue many commenters on here have is that they assume traditional video delivery is still a classic straight stream of data, like the old days and like people are familiar with when downloading via HTTP or FTP. But that's simply not how RTMP works. Modern video feeds are actually chunked just like BT, except that they're fed from a CDN rather than peered from end users.


The client matters, yes; that hadn't even been brought up in the discussion.

Never stated I was an expert in torrents or broadcasting. But anyone who has barely used bittorrent can counter most of your arguments. As I've mentioned there are tons of issues (and as mentioned elsewhere in the thread, privacy alone would make it a non-starter for many). I don't believe in it for large scale, but not because of the technical issues you presented.

Edit: Apologies if I came off too harsh, been a rough couple of days.


Except you only managed to disprove one point; for every other counterargument you gave, I was able to offer problems.

I honestly think privacy is the least of the problems. A bigger issue is just getting content owners on board to begin with

Edit: sorry to read you've had a rough few days. Hope things improve soon


They haven't explicitly 'disproved' all of your points because as mentioned previously most of what you said is nonsense.

(Someone already mentioned this, but seriously - this has been done - it works really well - as a polished commercial product it would probably work even better: https://en.wikipedia.org/wiki/Popcorn_Time)


I've already addressed Popcorn Time several times in this discussion.

The fact that you've acknowledged that it's not as polished as a commercial product should be a big hint that perhaps commercial products choose RTMP over BT for a valid reason. Perhaps even for the reasons you initially dismissed as nonsense. Perhaps you might need to read up a little more about how professional video deployment actually works before assuming you understand video deployment better than all those engineers who do this shit for a living. Just because something conceptually works doesn't mean it's any good when dealing with the expectations of paying customers who might want SLAs of 99.999% uptime, low latency live services (like less than 10 seconds), near zero buffering times etc.

The Dunning–Kruger effect is overwhelming in this thread but trust me when I say video engineers are not stupid people and if their lives could be made easier by using BT then we would definitely be seeing commercial uses of BT for video deployment.

https://en.m.wikipedia.org/wiki/Dunning%E2%80%93Kruger_effec...


I thought I pretty much disproved every point you made. Here are some further comments, but honestly the issues you bring up don't make much sense.

> You'd have to rely so much on your own data centre that you'd lose the benefit of bittorrent.

Not sure if we are talking 15 second gifs or full-length films here. The thread started with Netflix as an example and that's the context I've assumed. The startup cost on a full-length film or even a series episode is barely a fraction in that context, so I really don't get your comment here.

But I would go even further and expect constant backing from your data center to help with distribution, depending on how incentivized people are to keep the client alive after they've seen something. I would also not, both for network performance and common decency+courtesy, try to squeeze everything out of my customers' internet connections.

> Home connections might drop off due to power cuts, router failure or any of the other numerous conditions that datacenters battle against. Home connections might get throttled by ISPs.

None of that applies here; being resilient against exactly that is a huge point of bittorrent, so it just doesn't make any sense to bring it up. A datacenter is centralized so it needs to deal with it. Thousands of clients spread throughout the world will likely even have better reliability than a datacenter. All in all, no reliability lost. Throttling by ISPs is a real issue, and I already highlighted it earlier.

> 2 minutes is an excessively long buffer compared to how long most RTMP segments are (often 15 seconds or below).

I just made that number up, but yes, a torrent solution will need a larger buffer. That really isn't a problem in this context. What is a much bigger problem is the storage needed for seeding, depending on how you want to solve that and how reliant you want to be on your clients (again, your points don't make sense - why complain about the buffer when the cache is orders of magnitude worse?).

> So then why bother with bittorrent at all?

I wouldn't, and I never said I would. But the protocol would work for it; consumer ISPs and other factors might not, for such a large scale operation. And the privacy implications are also bad - it would leak information to your competitors as well. Bandwidth costs clearly aren't that big of a deal either, so the potential gain isn't worth it (talking Netflix scale here). Also, proxies at ISPs are a superior solution.

> A minute isn't even close to acceptable delay[...]

You cannot possibly have read what I wrote.

> A bigger issue is just getting content owners on board to begin with

Content owners are notoriously irrational so I wouldn't argue against that. But torrents could still be encrypted.


I suspect some of the issues you're having understanding my comments come from my assuming you had a working knowledge of RTMP.

Basically the protocol comprises multiple streams at various bitrates. These streams are chunked into blocks of n seconds - often around 15 seconds or less. The client will hot swap between the streams if one chunk takes too long or downloads too quickly. This ensures that the video quality dynamically scales to the best bitrate your bandwidth will handle, and it does so quickly and with minimal buffering. It works for video on demand services but it also works for live feeds, and the whole thing can be served over multicast or UDP for further savings at either the DC or across the wire.
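
The "hot swap" decision itself is simple; a toy version (Python, not any real player's code, bitrates invented):

  BITRATES_KBPS = [400, 1200, 2800, 5000]   # available renditions

  def pick_bitrate(measured_kbps, safety=0.8):
      # Highest rendition the measured bandwidth can sustain, with
      # headroom so one slow chunk doesn't cause a rebuffer.
      fitting = [b for b in BITRATES_KBPS if b <= measured_kbps * safety]
      return fitting[-1] if fitting else BITRATES_KBPS[0]

  # Re-evaluated after every ~15s chunk, so quality tracks bandwidth.
  print(pick_bitrate(3500))  # -> 2800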

Bittorrent is nowhere near optimised enough to compete on that level. The 2-minute buffer you guessed at would easily look terrible in side by side comparisons with RTMP deliveries. Not to mention a whole boat load of hardware encoders (rack mountable gear used in broadcasters) are built for spitting out RTMP streams, yet none (at least that I'm aware of) exist for BT. So you'd have to build your own gear as well as your content delivery network. That alone would cost you far more than any savings P2P deployment might earn you.


That is a good point. I still argue you can do the same but with worse results with a torrent solution.

If you have 2 minutes of buffer but realize you can't keep it up, you can still switch to a lower bitrate; that will allow you to get any missing chunks and maintain the buffer going forward. So you can still adapt (from the network's point of view much quicker than the 2 minutes), but not nearly as fast, and the end result (the quality change) might lag quite a bit.


> Never stated I was an expert in torrents or broadcasting. But anyone who has barely used bittorrent can counter most of your arguments.

The way you discussed this came off as incredibly arrogant, so you shouldn't hold your hands up all of a sudden and claim you're being misrepresented—I personally expected you were a bittorrent developer disgruntled over lack of adoption for non-technical reasons or similar based on the vitriol and self-assuredness.

There is a way to point out that some of those things should not theoretically be a problem without acting like they have no practical merit whatsoever and that anyone who believes so is an idiot. When you cite audio streaming you especially make yourself look like a Dunning-Kruger case, as delivering adaptive HD video involves many complex issues that are completely non-existent or trivial for audio. It's hard enough with full control over your own PoPs, and layering on P2P only makes it more complex. Content rights / privacy aside, you could never rely on this as primary delivery for commercial content, so in effect you'd be layering a bittorrent system on top of a traditional CDN setup as a cost saving measure, but you'd have to be damn sure QoS didn't degrade in the process, which is also non-trivial. Just because it's theoretically possible doesn't mean it's a good idea.


> There is a way to point out that some of those things should not theoretically be a problem without acting like they have no practical merit whatsoever and that anyone who believes so is an idiot.

I felt the original message had the same tone and thought to answer with the same. I shouldn't have. And I went further. And I just couldn't stop. I shouldn't be commenting when I'm this frustrated (unrelated to this thread).

Again, I'm sorry.


Nah you're fine. I'm getting serious "ideological/political argument" vibes off the other commenter.

I recognize the other commenter's tactics as the same "throw a bunch of dubious claims at the wall, force your opponent to spend way more time refuting each one than it took you to make them, and as soon as they look like they're climbing out move the goalposts again" thing that I see in political arguments with half drunk friends and strangers on political subreddits :P

I'm sure I've seen a great webcomic that describes this somewhere but I can't find it...


Your only point was Popcorn Time, and that stacks up poorly against a good RTMP stream.

The problem we have here is that a bunch of hackers have seen it can be done with BT (which nobody is saying it can't be) and then assumed that BT is suitable for someone like Netflix. If it were that clear cut then we would already be using it. But the fact remains BT doesn't actually stack up that well against already established technologies we use for video streaming.

I'm not saying this for the sake of an "ideological/political argument", this is a well researched point I've discovered doing my professional day job.

It's ever so easy to hack something together that works for a bunch of non-paying customers who just want to pirate something. It's a whole other ball game to build a production service with advertisers (where applicable) and other partners and sell that to paying customers.

What you're doing is comparing a kit car to a bus and saying they're the same because they both transport people. Then assuming the kit car can transport 40 people at a time for 12 hours a day, every day, because you've made the previous logical assumption of equivalence.


To your first point on speed: check out Popcorn Time. Even though this example is likely streaming copyrighted material, it showcases that BitTorrent is indeed fast enough to stream reliably at FHD quality. Bandwidth, even from a mildly seeded torrent, can easily push down hundreds of megabits per second. As others have noted, place a few well-connected, datacenter-hosted seeds and you have both speed and distributed sources (reliability). Content caching boxes already handle the majority of Netflix traffic at the DC/head end of your ISP today. So Netflix could easily add BT to that caching layer and bake it into their native apps (Android/iOS) as a first test bed.


If you're placing cache servers at the ISP then they will charge Netflix for the seeding bandwidth, which would work out more expensive than running traditional RTMP streams because the BT protocol has much more overhead (TCP vs UDP, CRC checks, tracker syncing, etc). And that's without factoring in the likelihood that the cost of bandwidth per GB from a CDN would be much lower than from an ISP.

So what would Netflix gain in doing so? They'd need to build new clients, convince content owners that P2P is secure[1], convince ISPs to seed content, pay for new infrastructure, etc. And the final product would be more expensive and _at best_ equivalent to what they already have - though realistically it will be worse on all but the top end viewing platforms (ie laptops and desktops). And for added bonus this infrastructure "upgrade" then removes the possibility of Netflix running live services and dynamic ad injection, etc.

So it's more expensive, less future proof, less performant on low end internet connections and wouldn't work on any of the existing smart TVs or streaming HDMI sticks.

Like I've said in other comments, BT is fine for hackers and pirates - for people who want to run something for free, willing to run it on a laptop or even willing to be inconvenienced a little for the sake of running something different. But it simply couldn't compete once you start scaling it out as a commercial platform with all of the customer expectations from non-techies and the unseen requirements / complications that happen with professional video delivery platforms behind the scenes. Plus for all of this talk about how good BT could be, people seem to miss the key point I made right from the start: that existing delivery platforms actually aren't that bad.

[1] I actually have no qualms with the security side of P2P in terms of protecting copyrighted content. But content owners are notoriously slow to adapt to change and hard to please. Getting them on side might prove an even tougher problem than engineering BT to compete with RMTP at the commercial scale.


The RTMP license is pretty interesting, as it among other things prohibits circumvention of "technological measures for the protection of audio, video and/or data content".

Also things like ad injection are not features for the end user.

RTMP is a proprietary protocol developed by Adobe, so it's a good fit when you want to screw your customer with DRM and other stuff.


I don't disagree with your points, however it's content owners who decide whether content should be DRMed or not, not the end user. Thus you'd find videos delivered via any other protocol (incl. BitTorrent, if that ever became a commercial platform) would also be subject to DRM.

Similar point for ad injection. You might consider it anti-user, however in many cases ads are what pays the bills, and thus in traditional "over the air" broadcasting it's actually more important to avoid downtime during ad spots than during a programme airing, because the cost of compensation is greater.

Ultimately in professional video delivery you'd have more than one type of customer: obviously you have the consumers watching the content, but there are also the people paying for ad slots. Plus in many cases you wouldn't be the content owner, so you'd have a third customer: the person paying for your services to deliver the video.

So the features you might consider to be anti-user (and to an extent I don't disagree with you) are still there for paying customers. Or to put things a different way, video engineers would prefer not to add the complication of DRM and ad injection into their infrastructure if they didn't have to.

As for RTMP being Adobe owned, there are other video delivery protocols around, but RTMP tends to be the preferred platform for B to C (broadcaster to consumer). Or at least that is the case in the places I've worked. Eg it wouldn't surprise me if Apple / iTunes did their own thing.


> Plus many services use a commercial CDN, none of which currently support bittorrent.

Actually S3 can deliver blobs through bittorrent: https://docs.aws.amazon.com/AmazonS3/latest/dev/S3Torrent.ht...
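
For example, with boto3 (bucket and key are placeholders; this only works for objects under 5GB):

  import boto3

  s3 = boto3.client("s3")
  # GetObjectTorrent returns a .torrent for the object; S3 itself then
  # acts as a permanent seeder for the swarm.
  resp = s3.get_object_torrent(Bucket="my-bucket", Key="videos/episode1.mp4")
  with open("episode1.torrent", "wb") as f:
      f.write(resp["Body"].read())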


Interesting. I wasn't aware of that. Thank you.

The 5GB limit would mean you couldn't use it for movies but TV shows should be fine


Your argument is completely valid in the current context, which is the only one that matters. Just FYI, in other contexts these points are more nuanced:

> It's slower to deliver video since you're reliant on upload connections of end users.

This has been scaling really well for small unfunded projects that need to deliver big binaries, such as open source distributions.

> It's less reliable (same reason as above)

If you have large corporate or state level actors trying to censor the content, then this becomes the more reliable method.

> You have no guarantee that chunks will arrive in the right order to actually stream data (ie the end of the file might download first).

This is not fixed by the protocol, but by clients optimizing for total download speed. I think some clients allow you to download more sequentially.

> You cannot send multiple different video qualities...

Video torrents often come in 720p, 1K and sometimes even 4K versions. So you could pick in advance. But that's not dynamic mid-stream like you mention.

Since you offered to write more: Is there such a thing as a progressive codec for audio/video? I vaguely remember video chat codecs doing something like this, where you get progressive degradation if you lose packets.

> It's harder to debug network problems...

oh yes...

I think this is the main reason Skype and Spotify moved away from the P2P tech that they used initially. I hope this becomes less of an issue when projects like LibP2P and IPFS mature and become as reliable and common as TCP/IP and TLS.

> Time to start playing a new stream is longer...

You can solve many of the issues mentioned above by having high quality sponsored seed servers. This is basically a hybrid approach. The seeds would guarantee good services and the P2P part would reduce load on the seed servers where possible.

> It couldn't support live services where the video data is being generated near real time

Not torrents, but there's a community of P2P television streams, quite popular for big sports matches, and from what I hear great quality. This is a completely different P2P model, more like a multicast stream. It cannot do the sort of on-demand streaming that Netflix would require.


I already acknowledged that torrents are good for some non-video applications. :)

I just can't ever see it being practical for video deployment. At least not in its current guise (thus whatever P2P model might succeed in the future wouldn't be authentically bittorrent).

If anything, with AWS and other "clouds" offering services for video delivery (MediaLive, CloudFront supporting RTMP etc), I can see bittorrent becoming ever more irrelevant for video distribution. At least that's the trend I've been observing recently. But tech can move in unpredictable directions, so never say never.


It's all about economics.

Yes, torrents have some downsides - however, they can be worked through at some cost of quality. I might occasionally see some delays, for example while I wait for an in-sequence chunk to arrive.

However for some content/use cases, perhaps that will be acceptable.

Right now the economics favor a dedicated and reliable channel from the source - e.g. Netflix.

Perhaps one day another use case will appear where the massive cost benefits of piggybacking off other people's resources comes into its own.


I've done some work with broadcasters distributing video assets over IP, and honestly I can't see bittorrent ever being used. Once multicast over RTMP becomes reliable (which broadcasters are working towards), the one advantage bittorrent has (reducing bandwidth from source) becomes moot for live services. Video on demand services would still have bandwidth issues, but it's less "spikey" and you'd factor that into the running cost of the service (and thus what you charge customers / advertisers).

But I guess "never say never?"


Well, do you think that BT might have the potential to reduce bandwidth costs? And what difference might that make to the overall cost of doing business?


I think ISPs might quickly stop P2P video delivery if it became the norm because there's no way they would want to pay for the bandwidth. Or those costs will just get passed onto the consumers. Either way, it's not free bandwidth.

In terms of video streaming services, there's other ways to reduce bandwidth costs without resorting to P2P. Multicast being the prime candidate as that's where you send one IP packet which targets multiple different endpoints (rather than unicast which is the typical point to point method of IP comms)
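
The one-to-many part is visible even in a toy UDP multicast pair (Python; the group address and port are arbitrary):

  import socket, struct

  GROUP, PORT = "224.1.1.1", 5007

  # Sender: a single sendto() reaches every subscribed receiver.
  tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
  tx.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 2)
  tx.sendto(b"one video chunk, many receivers", (GROUP, PORT))

  # Receiver: join the group, then read as usual.
  rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
  rx.bind(("", PORT))
  mreq = struct.pack("4sl", socket.inet_aton(GROUP), socket.INADDR_ANY)
  rx.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
  data, _ = rx.recvfrom(65536)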


Hence why bitchute.com (which uses the WebTorrent protocol) is so slow.


> It’s mostly a privacy issue.

There is a simple way to fix that. You wouldn't want to force everyone to upload anyway (some are on metered connections etc.), so what you would do is offer a nominal discount for uploading, but if you choose it then you don't just host what you watch, you host whatever needs more hosting capacity. Then random third parties can't tell what you watch because what you host is random.

This reduces the efficiency somewhat, but it's easy to minimize that: Don't give someone a hundred movies to host, give them one movie and have them host it a hundred times (over the course of a year).
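
The assignment step could be as simple as weighting a random pick by how under-seeded each title is (Python sketch; the catalog numbers are invented):

  import random

  # title -> (current seeders, target seeders)
  catalog = {"movie_a": (120, 100), "movie_b": (10, 100), "movie_c": (40, 100)}

  def assign_titles(k=1):
      # Weighted by seeding shortfall, independent of what the user watches.
      shortfall = {t: max(0, tgt - cur) for t, (cur, tgt) in catalog.items()}
      titles = [t for t, s in shortfall.items() if s > 0]
      return random.choices(titles, weights=[shortfall[t] for t in titles], k=k)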


residential customers have really poor upload speeds so how many customers would you need to have to replace 1 caching server collocated at an ISP? short of having FTTH, most customers would have < 10Mb/s up. a dedicated server can have multiple 10Gb/s links. you'd be looking at hundreds or thousands of customers per caching box. the "nominal discount" you're offering would have to be tiny for it to make sense financially. so much so that most customers wouldn't even bother.


If the nominal discount was $1/month, most customers with an unmetered connection would take it because it's effectively free money. From the company's perspective it could even be revenue positive by raising the price $1 and offering it back as the discount.

You're also not considering the power law distribution. If nineteen customers have 1Mbps uploads and one has symmetric gigabit, the average is >50Mbps. And the discount could be per-byte -- set it at half whatever the cost of central servers would be.
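
Spelled out:

  uploads_mbps = [1] * 19 + [1000]              # nineteen slow users, one gigabit
  print(sum(uploads_mbps) / len(uploads_mbps))  # 50.95 Mbps average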


Watch how quick those unmetered connections become metered once this takes off :P Companies doing the distribution value control, this would be quite difficult to administer. People put up with flakiness only because they're getting something they shouldn't be, or because it's free anyways. Once people start paying, it becomes a very different set of expectations.

Put on your product hat: as Netflix, you want to control every aspect of the experience to ensure it's absolutely seamless. When you outsource distribution to a huge set of third parties, that becomes much harder to achieve.


My solution to this for a video streaming service is to have a master network that you control that feeds the data needed to cover the next 30 seconds of video playback, with future segments coming in via torrents. If any segment is still downloading once it hits the 30 second threshold the connection is dropped and that segment is downloaded from your more than capable network.

Connections that are known to be slow can seed just the ends of files with faster customers seeding the entire file.
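
As a sketch, assuming the service can measure upload speed (the threshold is invented):

  def seed_region(upload_mbps, file_size):
      # Slow peers seed only the tail (needed last, so their slowness
      # rarely hurts); fast peers seed the whole file.
      if upload_mbps < 5:
          return (int(file_size * 0.8), file_size)  # last 20% only
      return (0, file_size)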

I imagine the maths on this has been done and the cost of bandwidth is much lower than the cost of engineering the solution though.


Update the protocol to securely track who uploaded what, to whom, and you've got a market just like private solar panel owners and the energy grid. Pay the 1mbps and gigabit civilians proportional to how much they share.

It's entertaining to expand the idea beyond bandwidth...what about CPU/gpu time on my powerful consumer desktop sitting idle/off all day? Or storage space? The computer would need to be on to seed... Would people pay to also use the idle compute capacity?

Security is a problem, but it could be used to segment customers and providers. (Big corporations bidding for AWS capacity, tiny personal sites hosted on a CDN of cheap consumer hardware).

Tagline it "Uber for the cloud"!


Filecoin[1] is doing that for storage+bandwidth and Golem[2] is pursuing the CPU/GPU time idea.

[1]: https://filecoin.io [2]: https://golem.network


>most customers with an unmetered connection would take it because it's effectively free money

not really, because you have to pay for the electricity costs of leaving your computer on 24/7. assuming a modest 50W power consumption for a PC, that works out to 36 kWh per month, or $3.60 at 10 cents/kWh. that alone wipes away the $1/month you're saving. also, that's a third of the monthly plan cost, how much of a discount can netflix really offer here?
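
the arithmetic:

  watts, hours = 50, 24 * 30
  kwh = watts * hours / 1000   # 36 kWh per month
  print(kwh * 0.10)            # $3.60, vs the $1/month discount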

as for hard disk space, I'll grant it's "free" space (as in the consumer wasn't going to use it anyways), but running consumer drives 24/7 with frequent seeks can't be good for them.


There is no need to have the consumer PC running 24/7. Also there is no need to have the same uplink speed as downlink speed. Consumers on average just need to reach a seed ratio of 1, while downloading and consuming the content and then possibly doing other stuff on their devices until the content is deleted. Actually, if they only reach 0.3 on average, then still 30% of the server infrastructure is saved, which is already a lot.


How many people turn off their computer regularly? I suppose people with laptops probably do. I leave my computer on 24/7/365.


Most people that are not in IT.


"Approximately 50% of computers are in On mode for 4 to 5 hours or less."

  https://ees.lbl.gov/sites/all/files/computers_lbnl_report_v4.pdf
I wouldn't have guessed.


I have unlimited 300 Mb/s upload and it's nothing unusual in my country (Kazakhstan). I guess it depends on country.


I have 200Mb/s down and 10Mb/s up. I suspect this is more normal.

Yes, it depends on country. But I also suspect the total upload bandwidth for typical users in typical locations simply isn't there to make the idea work at scale.


What does the ratio of customers to caching boxes have to do with the size of the proposed discount?


because if you're offering customers a $1/month discount to use their computers as cache boxes, but 1 dedicated caching box is equivalent to 500 customers in terms of capacity, then if the amortized cost of 1 caching box is less than $500/month, it's cheaper to go with the dedicated cache box than to pay customers to run cache boxes.

(all numbers are hypothetical)


I’m watching netflix on my chromecast or smart tv. There’s no storage to speak of. I’m guessing that only PCs would be suited for what you’re suggesting, and that very few people watch on PCs.


For a company like Netflix, the thing to do there is to make your own such device and then you can put whatever hardware you want in it.

I'm actually slightly surprised that nobody at Netflix has already done that. A Netflix Streambox would be a thing people would pay money for, and then you don't have Google/Apple/Amazon/Comcast/etc. diverting your customers to their video services because your customers are using their devices.

Also notice that you don't need 100% of users to upload. Having 10% of users with a 5:1 ratio still cuts your bandwidth costs in half.


There is already a fix for it. P2P over the I2P network.

https://geti2p.net/


I don't think Netflix subscribers are as worried about the privacy implications of P2P as Netflix is (not interested in sharing data). Netflix already has all the history data of all subscribers and can decide to sell/use it as it sees fit through policy changes. As subscribers, it's no different to have our data visible to one profit-driven company vs to all subscribers. Moreover, having this data available publicly would immensely help in creating transparency among people and bringing them closer.

The world would be a lot better place if we started becoming comfortable with sharing our real thoughts/preferences openly, and it would simplify the transition to truly public-spirited internet platforms based on Ethereum, etc.


It's a shame that so much modern technology ends up betraying the average consumer in ways they can't understand.

I can understand the hesitation. It's warranted, but there is progress, such as this very thread's responses.


I was of the opinion that Facebook used P2P / torrent protocol.

>> According to an interview in 2012 with Chuck Rossi, a build engineer at Facebook, Facebook compiles into a 1.5 GB binary blob which is then distributed to the servers using a custom BitTorrent-based release system.

Above part is taken directly from Wikipedia article on Facebook - "Technical Aspects" section.


Twitter does too - or did - under the name Murder.

https://blog.twitter.com/engineering/en_us/a/2010/murder-fas...


Facebook uses it, internally, for code/image deploys/updates.


I think it should be possible to build something akin to Tor (or even use the Tor protocol directly) that proxies requests and responses through intermediary nodes, so that the sender doesn't know who's the receiver and vice versa.

But you're probably right in that this is a relevant problem, as IP addresses are considered PII (in Europe at least), so leaking them could be problematic. That said, as Netflix already has a contract with each of its subscribers it shouldn't be too difficult to modify it to accommodate that kind of data sharing.

I imagine the real problem is ensuring streaming quality in such an infrastructure, as it's very hard to predict when a node will go down (e.g. user turns off laptop) or massively degrade in bandwidth (e.g. user starts downloading other content), and there's not much that customers hate more than interruptions when streaming movies (as you can probably attest firsthand). So given all these constraints it might be cheaper just using a CDN. Also, the bandwidth costs per customer probably make up a negligible share of overall costs (licensing is probably the biggest part) so it might not be worth optimizing this at all. I'd estimate that when I used Netflix I streamed maybe 20-30 movies / episodes per month (sometimes more), which would total at about 50-100 GB max (?) given the right encoding. For a company like Netflix that distributes probably petabytes of data a single GB should cost significantly less than 1 cent, so my bandwidth cost to Netflix would be between 0.5-1 $ per month (I even think it's much less probably).
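
Back-of-the-envelope, with my guesses as variables:

  movies_per_month = 25     # 20-30 titles
  gb_per_title = 3          # ~50-100 GB total per month
  cost_per_gb = 0.01        # "less than 1 cent" as the upper bound
  print(movies_per_month * gb_per_title * cost_per_gb)  # ~$0.75 / month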


The Tor network has little capacity and it's painfully slow to use for normal web browsing. The experience for video streaming would be abysmal.


It's usually fast enough to watch YouTube, capacity has gone up a lot in the past few years IIRC.


I'm not saying use the Tor network but rather implement the protocol on a custom, provider-specific infrastructure.


I believe your interlocutor is suggesting that the Tor protocol makes tradeoffs in its design that tend to make it slow. This is demonstrated by slowness in the largest Tor deployment to date.

Also, having a large and single-purpose Tor deployment controlled by a single provider would mean there would be no privacy from that provider and anyone able to send them subpoenas. I think that this weakens the privacy provided to the point where the decision to use Tor at all becomes questionable.


Sure but the goal of such a system is just to distribute content more efficiently so protecting the users from the provider of the system is not the goal, you just want to protect the users from each other, if you like.


In that case, I'm afraid I don't see any advantages over a traditional CDN. Certainly not any privacy advantages, almost certainly not any performance advantages, and probably not any cost advantages either.

Is there something I've missed? Can you help me?


Doesn't (or didn't) Spotify do this?


Yes - there's a paper documenting the approach and metrics here: https://ieeexplore.ieee.org/document/5569963

It was removed in 2014 in favor of central distribution.


Why did they switch back to central distribution?


I don't know the real reason but for myself, I actually uninstalled Spotify after I found out about the P2P feature. There was a noticeable lag when I would play online console games and had Spotify open on a PC or laptop. It was never really an advertised feature and it kind of pissed me off that they would do that without a clear notification. Took me a bit to troubleshoot as well. I'm sure it was in the user agreement but come on, who reads that.


Interesting. Could there be a more privacy-friendly protocol involving cryptography? The Zcash cryptocurrency managed to hide transactions with zero knowledge proofs, for example. It also reminds me of some cryptocurrency that asks other nodes for data matching a bloom filter, allowing them to receive what they want plus extra stuff to hide the real request.


It's a privacy issue either way. How long until shareholders demand that Netflix sell user data?


Right, and the data is actually made more valuable by not being public with access controlled by a company like Netflix.


The BBC trialled peer-to-peer distribution for their early iPlayer.

People were very upset about use of bandwidth. This would have been around 2007 when most Internet accounts in the UK had download limits.

https://en.wikipedia.org/wiki/BBC_iPlayer#Computer_platforms


I remember thinking it was very brave of them at the time, with BitTorrent being branded as a villain then just as much as it is now.

P2P is an interesting approach to large scale media distribution, but aside from the bandwidth issues, I imagine they were under immense pressure from copyright holders to steer clear of it.


BBC R&D even tried using actual bittorrent to deliver content (as part of an EU research project) a while back: https://torrentfreak.com/bbc-trials-bittorrent-powered-hd-vi...


World of Warcraft used torrents for update downloads right up until they replaced that with the new combined updater app a few years back.


Even when they used torrents, they had http mirrors implemented inside of the torrent client. In fact I believe Blizzard pioneered much of the http seeding work that I think eventually got standardized. Haven't kept up with it so someone can correct me...


My network link is asymmetrical - I have very little upload bandwidth compared to download. This is not uncommon.

BitTorrent is abysmal for me. It quickly saturates my uplink bandwidth, destroying downlink performance. Or I cap it, or it discovers a cap, and my upload rate is slow enough that I'm labelled a leech in the BitTorrent network and get a slow download rate.

World of Warcraft's updater was absolute trash for me. I always got far better transfer rates turning off P2P and sucking up the congestion on the direct download option.

BitTorrent simply doesn't work for most of the world's commercially available home links.


BitTorrent doesn't necessarily mean everyone seeds, nor that access is free.

If there was some kind of micropayments infrastructure for seeders/leechers/content owners, I would happily seed arbitrary-but-legal torrents with my 1000/1000mbps connections in exchange for credits towards media or cash to cover the connection ... while I imagine you'd pay to download using BitTorrent, with some tiny fraction going to me (proportional to resource consumption), without ever seeding to others yourself.


What you're talking about already exists and it's a joy — joystream to be precise [0]

[0]: http://joystream.co


The history of Joost is instructive. The cost of making swarming really work is now higher than the cost of CDNs. Octoshape seems to still exist in some form but I don't know if it still uses P2P.


See also Red Swoosh. Or the S3 BitTorrent support. Hosted peer-to-peer distribution is/was a reasonable idea. But the adoption/efficiency/variability costs don't appear to be compelling compared to the "dirt cheap" HTTP CDNs of the last decade.


It's sometimes used in game downloaders/updaters. But some people could be potentially mad that their upload data is being used when they don't want it used. Or depending on implementation, that uploading is going on in the background when they expect nothing to be happening. There might also be problems with firewalls or NAT.


Very early on (10 - 15 years ago), we used swarms across 14 data centers to rapidly distribute the ‘hottest’ video files to all the video servers’ local disk caches.

Many companies back then tried to startup based on business model of doing this via end users as you suggest. It was mostly sunk by users’ pretty reasonable opinion they didn’t want to donate their upload bandwidth on their metered plan.


"Swarm technology" is heavily used in professional content distribution in the entertainment industry. I'll just leave it at that, I don't think my former employer would be happy if I elaborated more. :-)


Can you really not elaborate any more?


Maybe the parent means that movie you just watched on the big screen ...

Was actually a torrent.

Wouldn’t that be amusing. How would we know any difference?


I was under the impression movies were distributed on encrypted physical media for which the key was made available on the release date. Maybe things have changed.


Nope, not at my local theater.

I recently went to a small theater to watch a movie. It didn't start. After 10 minutes I went to talk to a manager. They had forgotten to download the licensed movie for the day and that download would take a couple hours. I had to get free passes to come back another time.


Sounds more like they forgot to transfer the movie from their central server to the screen server. Normal workflow has been to get the physical media via courier to the theatre, staff then transfer the content to a storage server and then a courier picks up the film and delivers it to the next theatre.

edit: A normal DCP (digital cinema package) is around 100GB, bigger if it's 3D/4K/Atmos. Some of the biggest packages can be 500-600GB with all versions (2D, 3D, dubbed, hard of hearing, atmos, auro, DBOX and so on). So not really feasible to download that via the Internet in a couple of hours.


The films are encrypted, but the distribution media isn't necessarily physical. It depends on where in the world you are, but it has been pretty normal to distribute films via satellite, but recently Internet speeds have become fast and cheap enough to allow films to be distributed via the Internet.


I have no idea, I'm just speculating wildly.


Insiders regularly upload new films to private torrent sites.


> It's astounding to me that BitTorrent isn't used more often for media distribution.

A perfect use for it would be to distribute VNRs (Video News Releases) and other audio and video from public relations firms and large companies.

The big obstacle is introducing any new protocol into a large bureaucracy.

It took a company I work with eight weeks to get their artist an SFTP program for his computer so he could send material to the printing companies; and another month to convince IT to open the ports in the local network.

By then he'd just sent a bunch of DVDs by messenger.


>A perfect use for it would be to distribute VNRs (Video News Releases) and other audio and video from public relations firms and large companies.

Can you please elaborate a little bit? My business understanding of VNRs is that they are not in demand per se, and are pushed/pitched out-of-band to (hopefully) receptive editors and news directors. Is this true and what does P2P add to this?


Yes, and no. It depends a lot on the industry and the publication.

For example, if you're a consumer products magazine, you will probably want whatever VNR Procter & Gamble has for its latest dessert topping/floor wax. It would be massively simpler for P&G to provide a torrent link on its website that interested publications could just click.

Or, let's say you're a construction company that just built an awesome new bridge. A VNR with drone footage of the construction process on your website that could be downloaded via torrent with a click would be helpful for the TV stations in your market.

I sometimes work with real estate developers. At this time, they're all sending video, huge photos, and massive PDF books to print, TV, and web publishers via Dropbox. Some even use Flickr for press photos. Torrenting those would seem to be a better method.


I see - and thanks for the positively ancient SNL call-back! (dessert topping/floor wax). Being ancient myself, I spotted it right away.

Further question: I too work with real estate developers. You're suggesting they publish a link to assets where the browser depends upon a torrent application to complete download, no? Or, alternatively, did things progress while I wasn't looking and now torrents can be completed in-browser?


This was tried relatively unsuccessfully by a company named Red Swoosh (Travis Kalanick's baby before Uber). [1] The concept pre-dates BitTorrent, and the technology worked pretty well; I deployed it at scale. There were two big problems. First, users had to install the client, which they didn't like doing. Second, users had to share their bandwidth and cache data they may not care about. In the real world, those turned out to be deal breakers, and we had to drop Red Swoosh.

I think the larger picture here is, if you're Netflix, you're happy to centralize, put distro points on ISP networks, etc. to provide a great user experience. How many people would be annoyed about Netflix dragging down their internet connection to make the service a couple of bucks cheaper?

[1] https://en.wikipedia.org/wiki/Red_Swoosh


> In the real world, that turned out to be deal breakers

In the US I presume, where ISPs are monopolies and probably had to be in on this to make it work.

> How many people would be annoyed about Netflix dragging down their internet connection

Why would it drag down anyone's internet connection? It's not hard to make it behave well and self-throttle, when internet connection is used for something else.


The missing piece is safe browser or app integration. If I link to a torrent movie I download some weird torrent file or get a magnet link; to open them I have to download some malware off SourceForge or whatever comes up when I google "torrent downloader", and then when I finally get the files I get a bunch of crap with them, and maybe some malware.


Check out WebTorrent

Works today in Chrome, Firefox, edge, etc.

In Brave--Brendan Eich's new browser--you can even drop a magnet link directly into the URL bar! Uses WebTorrent under the hood.

https://webtorrent.io


It's not a problem if you have the expertise to know where to find a safe torrent client. But most people don't have that.


When you visit a site that uses webtorrent, then you don't need to find your own torrent client. The site's javascript does the torrenting in the user's browser automatically.


That’s just a symptom. The actual problem is that it’s hard(er) to make money off bit torrent, and possibly even p2p in general.

If there was a financial incentive every browser would have been capable of downloading files off BT as seamlessly as off HTTP.


I remember blizzard using this ~7 years ago for at least some of their downloads. Not sure if it still does.


They used it for WoW patches.


You might be interested in Dat and the Beaker browser, which apply the same sorts of technology to web content.


Many ISPs throttle you when they see BitTorrent activity. It isn't a commercially viable transport.


Not sure about many, but most ISPs in the world definitely don't throttle torrents (or you for using torrents).


Agreed. I think that used to be the case when torrents were a major source of traffic; now it's more likely to be a family with a couple of gamers, Netflix users, etc., who are using up all the bandwidth.


It's really unfortunate. I would gladly pay $30/month or so for a legal service that did nothing other than replicate the current experience with torrent trackers. I would pay $100/month if every show/movie were guaranteed to be available in high-bitrate 720p, 1080p, and (where applicable) 4K at the official airtime/release date; maybe throw in a few seedboxes for good measure.

I don't even care whether the video starts immediately. I'm perfectly happy to wait ~10 mins to get a decent buffer of chunks at the beginning of the file.


It is used, for example by Blizzard to distribute their games.


Was. They use CDNs now.


I think it has a bad name and torrent traffic often gets blocked or throttled. I think with some rebranding it could be successful.


And encryption, so your ISP doesn't get to see that it is torrent traffic.


I could be wrong but I don’t think encryption helps in the case of BitTorrent?

Encrypted or not, BitTorrent traffic still looks like BitTorrent traffic.

The BitTorrent protocol has a peculiar connection pattern.


Please explain how its traffic pattern looks different than, say, an xbox360 game where one console hosts and the other consoles connect.

Or from a webrtc video call / ring [0] call.

Or a person downloading a few files from 10 to 20 different websites.

I don't see how my computer connecting to 10 IPs is characteristic of BitTorrent and not of performing encrypted communication of another type with said IPs.

[0]: https://ring.cx/


I believe I may have mentioned I could be wrong.

I'm no expert, but I can imagine how it might be possible to distinguish BitTorrent traffic from Xbox 360 traffic, voice calls, or simultaneous downloads from 20 different websites, using flow analysis and some other data points, to a fairly high degree of certainty, in some cases.

In support of my amateur assessment I present the following Wikipedia entry on the subject:

"Some ISPs are now using more sophisticated measures (e.g. pattern/timing analysis or categorizing ports based on side-channel data) to detect BitTorrent traffic. This means that even encrypted BitTorrent traffic can be throttled. However, with ISPs that continue to use simpler, less costly methods to identify and throttle BitTorrent, the current solution remains effective.[citation needed]

Analysis of the BitTorrent protocol encryption (a.k.a. MSE) has shown that statistical measurements of packet sizes and packet directions of the first 100 packets in a TCP session can be used to identify the obfuscated protocol with over 96% accuracy.[22]

The Sandvine application uses a different approach to disrupt BitTorrent traffic ..."

I guess like everything, it's an arms race; and a sufficiently determined network monitor probably has the average BitTorrent user blocked. Might not be worth the effort though.
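
To make the quoted "statistical measurements" idea concrete, here is a toy sketch of that style of classification. Every threshold is invented for illustration; real systems use statistical models trained on labeled traffic, not hand-written rules like these:

  // Toy flow classifier over the first 100 packets of a TCP session.
  type Packet = { size: number; outbound: boolean }

  function looksLikeObfuscatedBitTorrent(packets: Packet[]): boolean {
    const sample = packets.slice(0, 100)
    if (sample.length === 0) return false
    // MSE handshakes produce unusually small, randomly padded early packets.
    const smallish = sample.filter((p) => p.size < 700).length / sample.length
    // BitTorrent's opening exchange is far more bidirectional than a bulk
    // HTTP(S) download, which is almost entirely inbound.
    const out = sample.filter((p) => p.outbound).length / sample.length
    return smallish > 0.6 && out > 0.3 && out < 0.7
  }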

1. https://en.wikipedia.org/wiki/BitTorrent_protocol_encryption


>Please explain how its traffic pattern looks different than, say, an xbox360 game where one console hosts and the other consoles connect.

Because you're maybe connecting to 100 players max, with relatively low bandwidth use.

>Or from a webrtc video call / ring [0] call.

Again, relatively low bandwidth.

>Or a person downloading a few files from 10 to 20 different websites.

Those websites run on port 80 or 443, whereas torrents use random ports > 1024, so that's a dead giveaway there. Plus, most people (even power users) don't have 10 to 20 parallel downloads from multiple sites. Even if it's really someone downloading from 10 to 20, that probably puts them in the 99.99th percentile of bandwidth use, and they probably should be throttled anyway.


>if it's really someone downloading from 10 to 20, that probably puts them in the 99.99th percentile of bandwidth use, and they probably should be throttled anyway.

That's a really strange sentiment to me. I'm paying for 100 Mb/s, not for "100 Mb/s, in certain specific circumstances over specific protocols". Ones and zeros; the rest of it is _my_ concern, not my ISP's.


Actually, what you're paying for is probably "bursts up to 100 Mb/s" rather than "sustained traffic of 100 Mb/s," whether or not it's marketed that way. ISPs commonly "oversubscribe" trunk lines dramatically. Back when I did trunk management at a big ISP in the dark ages of the internet (the late '90s), we oversubscribed at about a 5:1 ratio: basically, we sold 5 times as much bandwidth as the trunk actually had. As long as there was free bandwidth available you could get your full amount, but the amount you were guaranteed was one-fifth of that.

Of course, those were business lines, and all those numbers were actually in the fine print of the contract. As far as I've been able to determine, if your residential ISP is only oversubscribing at a 10:1 ratio, you're pretty lucky (I've seen some reports from industry consulting firms that suggest 50:1 is more common), and the chances are they're not guaranteeing a minimum speed they can be held to.
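
The back-of-the-envelope math, with purely illustrative numbers:

  // Oversubscription arithmetic (all numbers invented for illustration).
  const trunkMbps = 1000                // actual trunk capacity
  const ratio = 10                      // 10:1 oversubscription
  const soldMbps = trunkMbps * ratio    // 10,000 Mbps sold in total

  // A customer sold "100 Mbps" is guaranteed only their share under
  // full contention: 100 / ratio = 10 Mbps worst case.
  const guaranteedMbps = 100 / ratio
  console.log({ soldMbps, guaranteedMbps })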


There is a minimum guaranteed speed for most home subscribers, but it is a fraction of dial-up speed. Home routers do latency profiling for better connection stability, meaning that even if you do not fill your bandwidth, your requests will be delayed to smooth traffic out until the buffer completely fills.

With most operators, limited-bandwidth users are oversubscribed while unlimited-bandwidth users are linked to dedicated channels. The oversubscription ratio is around 20:1 for DSL and 100:1 for mobile here in Turkey.


If you were really paying for 100 Mb/s of dedicated, guaranteed bandwidth, you would be paying thousands of dollars a month.

Edit: I’m getting downvoted, so I went and looked it up. It’s not “thousands”, but it’s close to $1000/mo from the first provider I checked: https://imgur.com/gallery/9ZdlqXt


The handshake for establishing a BitTorrent connection with a peer literally starts with the string "BitTorrent protocol". I don't think it's that hard for ISPs to detect that. BitTorrent encryption prevents this, but it's not used everywhere.
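
Specifically (per BEP 3), the plaintext handshake opens with a length byte of 19 followed by that ASCII string, so a naive check is a few lines; a sketch:

  // Detect an unencrypted BitTorrent handshake at the start of a TCP
  // stream: <0x13>"BitTorrent protocol" (BEP 3).
  function isPlaintextBtHandshake(firstBytes: Uint8Array): boolean {
    if (firstBytes.length < 20 || firstBytes[0] !== 19) return false
    const pstr = new TextDecoder().decode(firstBytes.subarray(1, 20))
    return pstr === 'BitTorrent protocol'
  }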


It's simple: an HTTPS download is between you and the provider. It would be unprofessional for them to encourage a system they do not control. Torrents can be unreliable, require the downloader to also upload by default, breach your privacy by announcing your IP to others, and may be throttled by your ISP.


Yet both Microsoft and Blizzard have decided P2P file distribution is just fine and dandy to use for their customers. It seems the only downsides are a slight complexity increase to handle torrent downloads for files that most of your customers need, with the upside that those same clients get their files faster thanks to the additional bandwidth gained using P2P tech like BitTorrent.

On the reliability side, you can easily include an HTTP web seed, though as long as the company offers the file on their own fileserver anyway, there isn't much added benefit to doing that.
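
For what it's worth, adding a web seed when building the torrent is nearly a one-liner. A sketch assuming the create-torrent package from the WebTorrent project (the file names and URL here are hypothetical):

  // Create a .torrent whose url-list (BEP 19 web seed) points back at
  // the publisher's own fileserver, so the HTTP origin acts as an
  // always-on peer of last resort.
  import createTorrent from 'create-torrent'
  import { writeFileSync } from 'fs'

  createTorrent(
    'patch-1.2.3.bin', // hypothetical file to publish
    { urlList: ['https://downloads.example.com/patch-1.2.3.bin'] },
    (err: Error | null, torrentBuf: Buffer) => {
      if (err) throw err
      writeFileSync('patch-1.2.3.torrent', torrentBuf)
    }
  )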

Windows 10 P2P Distribution for Updates: https://lifehacker.com/windows-10-uses-your-bandwidth-to-dis...

Blizzard/HTTP Web Seed: https://en.wikipedia.org/wiki/BitTorrent



That they did, but P2P distribution of files is still alive and well used in numerous industries: https://en.wikipedia.org/wiki/BitTorrent#Adoption


Blizzard torrents WoW updates. They are reliable because Blizzard can just serve downloads like normal as if they just happened to be over the BitTorrent protocol, and clients can join up and help out when they're able. The company can encrypt the payload to maintain a semblance of licensing and run the biggest seeders themselves.


I love BT and think it's great, but I always wonder:

Is it a net gain (for the bandwidth of the whole internet)? I believe the major advantage is to transfer the cost of distribution away from the original host and distribute it among those downloading it. Does it reduce the bandwidth used appreciably, or just shuffle it around?


It also leads to more balance between upload and download. This is perhaps less good on typical broadband, which usually has much higher download bandwidth than upload, but a typical datacentre has much greater upload demand than download (there's a reason downloading to an AWS box is much cheaper than uploading from it).

On the other hand, big networks can be more efficient by caching things near the edge so that the total "bit miles" (i.e., the sum over all packets sent of distance traveled × bits of data) is decreased.


I'd say BitTorrent is likely just as efficient, or much more so, at keeping distances short. At least, it optimizes by taking the fastest available peers. Your typical big torrent could have thousands of peers. That's more endpoints than almost any big CDN, and more likely to include a few peers really close to you. Heck, it could even download from someone on your local network if they happened to be serving up the file you are looking for.


For widespread distribution it’s a massive net reduction in traffic across internet backbones relative to centralized servers. If files go from network A > B > C to two users on network C, they can send packets to each other without going through networks A or B.

The protocol has quite a bit of overhead, so things can be worse among a small number of clients. It’s also worse than having caching servers on each network.


But people haven't used centralized servers to distribute large static data for years; they use CDNs which have massive capacity close to users.


If the data in the CDN is being downloaded by multiple users in each location then there would be that many local peers with it in a distributed network. If the peers give preference to e.g. lowest ping then they're equally efficient. If the data is unpopular then the CDN is less efficient because it transfers a copy to each node even if nobody ever wants it.


Networks are fractal in nature. Just because you have one or more CDN servers inside Comcast’s network does not mean you’re as close to every machine as two neighbors exchanging files.

From a business standpoint it does not matter much, but from a pure network perspective it’s likely worse.


CDNs are also centralized servers, you just have more of them.


Many large downloads (at least for some games) that aren't driven through your browser will also have a peer-to-peer option enabled by default. Battle.net comes to mind for me.


BitTorrent probably sends more absolute bits per byte downloaded than HTTP, just because of the management traffic.

Ignoring that, BitTorrent is likely to do the bulk transfer over different links than a download from a central server would. This seems likely to result in an overall higher number of bytes transferred, but it's hard to model. It can also lead to congestion in parts of the network that were not normally congested.


Doesn't it also help get around already-congested routes?


Centralization will always be more efficient, generally speaking. Access to a centralized channel for distribution can be prohibitive for many reasons - cost and censorship being the foremost in my mind.


I'm doubtful of this premise, especially in the context of peering disputes. Here in the States, Netflix had/has serious issues with Comcast/Verizon/CenturyLink not peering with their upstream providers sufficiently to handle the number of video streams that needed to be carried.

This was not an issue with the last-mile connection itself, but a middle-mile peering problem. From the testing I have done, on certain carriers that play games like this (Deutsche Telekom in Germany; CenturyLink, Verizon & others in the US; and most Asian/South American ISPs), torrenting updates to users is significantly faster than an HTTPS download from Amazon, Google Cloud & the like. Once you get a handful of seeders internal to the network, the other downloaders in that network start to get the torrent significantly faster than they would over traditional HTTPS.

If the content isn't extremely time sensitive (eg: live sports games, video calls) and is of a noteworthy size, using P2P tech to speed up downloads is not a bad idea.


Theoretically BitTorrent should be more efficient, since peers nearby can send data directly (e.g. within an ISP or regional node), whereas a central server will always have to transfer a longer distance and involve more infrastructure.

E.g. with my ISP, the bottleneck isn't in the last mile (that's all GPON fiber) but rather in the national network and their transit to the internet. If more transfers were local-to-local, it would reduce the amount of traffic that needs to go cross-country or out onto the internet.


I don't think that's true.

Unless BitTorrent understands the topology of a network, it will never be more efficient than a structure that takes advantage of that topology.

E.g., Netflix makes servers that can be distributed to ISPs and can serve content to those consumers directly from within the ISP itself, not over the ISP's outgoing link.

E.g. #2: Multicast clients register to receive multicast streams, and the routers know how to propagate multicast.


I think it can be done more simply than that. When the client has a choice of multiple peers to connect to, it can pick the ones with the lower latency. That way you end up picking peers that are closer, which helps lessen the load on the overall network. Not perfect, but a pretty good 90% solution.
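
That heuristic is a couple of lines; a minimal sketch (the peer shape and RTT probing here are hypothetical, not a real client API):

  // Prefer the lowest-RTT peers: latency correlates loosely with network
  // proximity, so this tends to keep traffic local without topology data.
  type Peer = { ip: string; rttMs: number }

  function pickNearestPeers(peers: Peer[], n: number): Peer[] {
    return [...peers].sort((a, b) => a.rttMs - b.rttMs).slice(0, n)
  }

  // e.g. keep the 30 lowest-latency peers of everyone the tracker returned:
  // const active = pickNearestPeers(knownPeers, 30)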


Latency is different from bandwidth.

I can make a very low-latency connection that delivers at most 1 packet every 10 seconds. It responds in 5 milliseconds, say, but it won't let you have another packet for 10 seconds.

Or I can have a high-latency connection that delivers a huge amount of bandwidth. It might take 10 seconds to start the firehose, but once it starts, it's a firehose. This is, ironically, the problem the internet has with "bufferbloat", and that high-bandwidth radio connections in space have.

A topological solution would be counting the number of hops. There are internet protocols that do this; it's just that I don't think BT takes advantage of them.


When bandwidth was a big concern, BitTorrent trackers prioritized peers geographically, and ISPs used retrackers to force peers within the same ISP onto each other to save bandwidth. It was very efficient; each torrent was downloaded pretty much just once per ISP, from the fastest sources. Good times.

But none of this is necessary: fast clients simply send more data and dominate download bandwidth. It doesn't matter why they are fast, whether they are simply very close or just on a path with more capacity. Everything is naturally efficient.


> Everything is naturally efficient.

BitTorrent is not an effective means of serving data from a 4G device. In fact, if anything, the bandwidth on 4G arguably costs more than bandwidth from your fiber ISP.

We use services like YouTube and Instagram to push our data to others. And the efficiency from the phone's perspective is the same as multicast, because the phone sends data exactly once over the upstream link.

You could say, "Well, duh, nobody should use BitTorrent over 4G because of bandwidth issues." And that's exactly the point. If BitTorrent were so efficient, we'd be using it on 4G instead of sending everything to YouTube, etc.

You can't have it both ways. In other words, you can't say, "BitTorrent is the most efficient protocol man ever created, but only if you don't use it on 4G."

On the other hand, multicast with fountain codes (e.g. RaptorQ) sure seems like a super-efficient approach that might work everywhere.


> I can make a very low-latency connection that delivers at most 1 packet every 10 seconds. It responds in 5 milliseconds, say, but it won't let you have another packet for 10 seconds.

That's fine. So you connect to that peer and get one packet every ten seconds. You also connect to the other 1000 peers with the lowest latency, some of which are likely to be faster and in any event will be fast in aggregate, and none of which require traversing intercontinental fiber.


> Netflix makes servers that can be distributed to ISPs

Distributing servers to every ISP doesn't seem very efficient. It may work in the US, where you can count the number of ISPs on one hand, but where I live there are 100+.

> Multicast

Has multicast ever worked?


> Distributing servers to every ISP doesn't seem very efficient.

Actually, that's the way most CDNs work. Instead of having 40 hops to your customer, you may only have 10, because the first thirty hops are from you to your server.

Fewer congestion issues, higher uptime, lower latency, and faster bandwidth, usually.

> Has multicast ever worked?

In my experience with military networks, yes. Especially when one multicast packet can be received by 20 listeners without repeating the packet 20 times on the physical layer (especially radio links).

Sending data via BitTorrent would saturate links you wouldn't want saturated.


IPTV services are usually based on multicast and seem to work quite efficiently. But it's obviously useless for services like Netflix/YouTube/etc., because you have to be sending the same content to everyone at the same time.


No, multicast for IPTV is not cost efficient for ISPs. I don't think there was ever a time when multicast was anything more than hardware vendors trying to profit off of ISPs with broken promises of efficiency.


The client can understand this and choose based on it. Back when we had different caps for international/national/in-ISP traffic, there was a fork of eMule which intentionally prioritized downloading from IPs in ranges belonging to the same ISP, and then the same country, before pulling from outside.


It doesn't have to understand the topology; it can just understand which peers can serve bytes fastest.

(I don't know if it does this, but it's been around long enough that I'd be surprised if it doesn't).


That doesn't make it more efficient. It might make it faster at the expense of using your ISP's outgoing bandwidth multiple times over.

The most efficient network distribution is one where the data only travels each network link once for all destinations. This is the way multicast works, e.g.


Has it really been that long since the fall (at least from major popularity) of Napster, Direct Connect, Kazaa, LimeWire, Audiogalaxy, etc.? I believe (please correct me) that Blizzard, Microsoft, and many other major software firms distribute software using BitTorrent, in addition to all the Linux distros.



> "The Fanimatrix" is a fan-made, zero-budget short film set within the Matrix universe, specifically shortly before the discovery of "The One" (i.e. the first "Matrix" feature film). It tells the story of two rebels - Dante and Medusa - and of their fateful mission onto the virtual reality prison world that is The Matrix.


It's actually pretty good!


And the magnet link... magnet:?xt=urn:btih:72C83366E95DD44CC85F26198ECC55F0F4576AD4 ;-)


A few years back, there was a story about how a different Matrix fan work, The ASCII Matrix, was the oldest torrent: https://motherboard.vice.com/en_us/article/kb7wex/how-ascii-...

(And this all makes me feel old.)


The funny thing is, it was in the HN comments on the source of that story (https://news.ycombinator.com/item?id=10962253) that it was revealed that the Fanimatrix torrent was still being seeded.


That's fascinating! I also was still under the impression that the ASCII Matrix was the oldest surviving torrent and even started seeding it, back when I first heard about that story. Great to see such connections being made; that's what the internet is, or should be, all about.


There seems to be a link between Matrix fans and torrent technology fans.


I hope no one here finds this a shocking revelation.


I uploaded one torrent in 2004 and it is still alive with seeders.


> I uploaded one torrent in 2004 and it is still alive with seeders.

What do you mean by “uploading”? You seed torrents, you don’t upload them (where?).


I meant 'created' one torrent. Where? TPB.


Torrent link?


It makes me wonder what the actual very first widespread torrent was (even if it has no more seeds) - my guess is some Linux distro.


See also:

https://torrentfreak.com/oldest-torrent-is-still-being-share...

Another Matrix-related torrent from yesteryear.


> With a limited budget of just $800, of which nearly half went into a leather jacket

This is my favorite bit :)


Legitimate question: why not upload it to YouTube?

You can still have the torrent link in the description if people want to download it; you'd reach a wider audience, and you might actually make some money off it to fund future fan projects.


Or, it might be arbitrarily demonetized or deleted, or have a spurious takedown put on it.


unlikely, it's independent work that may contain some copyrighted designs but not anything copied directly from the original film, so the content-id won't detect it. star trek fan films also stay up.

you just can't/shouldn't monetize it, because that would motivate the owners of the original film to stop you.

greetings, eMBee.


>unlikely, it's independent work that may contain some copyrighted designs but not anything copied directly from the original film, so the content-id won't detect it.

That is incredibly naive.

Content ID has flagged things as innocuous as the applause at the end of an entirely novel recording as copyright infringement.


do you have a reference for that? (not that i don't believe you, but i'd like to read that story and share it as a good example of how bad this system is)

given the number of star trek fan-films on youtube (there are thousands, literally), spurious takedowns would be noticeable. so i don't think i am naive here. as long as all your sound and images are original or from a creative commons source, a takedown should be unlikely. (that doesn't mean it can't happen though)

greetings, eMBee.


> do you have a reference for that? (not that i don't believe you, but i'd like to read that story and share it as a good example of how bad this system is)

Sure: https://twitter.com/mormolyke/status/1011637522324127745

There's also examples like this, about a Star Wars fan film, where Warner claimed that silence infringed on their music copyright! https://www.wired.com/story/the-star-wars-video-that-baffled...


The video was released in 2003. YouTube went live in 2005.


It had 49 seeders when I downloaded it just now.


I feel so old.


The title should technically be "World's Oldest Currently-Alive Torrent Still Alive After 15 Years". The first part is pretty much a tautology. The headline doesn't mean that the first torrent ever created happens to still be seeded.


I think it's reasonable to take that bit as implied. Making it explicit leaves the title sounding a bit weird.

Edit: since people are continuing to comment about this, I've added the word "surviving" above.


Torrents are only 15 years old?? Wow, I could have sworn they came before that. I feel like I've been doing torrents forever.

Windows XP is older than torrents, but in my mind it feels much newer for some reason. I wonder why.


I think it's just that this is the oldest surviving torrent. Torrents/torrent software came out around 2001, which is more than 15 years ago.


I have a screenshot somewhere of my XP machine running DU meter & Napster to show off my brand new, 2nd in my county, cable modem pulling 400KB/sec. I felt like a God.


We only had dial-up, but I "borrowed" an EGPRS phone from work. 177.6 kbps (I think) felt like more than you could ever need! Making it work in Slackware Linux was a challenge.


If you gave that to someone today, they'd sue you for emotional harm.


Napster/Limewire predate it a bit.


Limewire, that takes me back.

I alternated between using Limewire and Bearshare back in the day. Vague memories but somehow their UI stuck with me.

Makes me even more happy we have Spotify nowadays :)


Title is misleading. Really it means to say that the oldest still active torrent is 15 years old.


No it's not. If you talk about the world's oldest woman, no one expects you to include Cleopatra in the running.


Well, I for one thought it meant the first ever torrent is still going. Why wouldn't people assume 'world's oldest torrent' means just that?! And 'the world's oldest running torrent is still running' is like 'world's oldest living human is still alive' - of course they are.


Yeah I thought the same.


P2P is already used for hot cache fill in the ISP-hosted black-box cache servers of services like Microsoft, Netflix, and Google.


It's alive because the original author keeps seeding the content. Nothing headline-worthy to read here.


That's not correct, see this Hacker News comment and the reply by the creator:

https://news.ycombinator.com/item?id=10962253



