It seems like the blog post answers your question pretty thoroughly. The Hyperdrive index and protocol are tuned for this use case, which is what lets it scale to hosting a Wikipedia clone; BitTorrent FS + SQLite are not tuned for it.
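For a concrete sense of what "tuned for this use case" means, here's a minimal sketch using the Hyperdrive v10-era Node-style callback API (the storage path and file names are made up for illustration; check the current docs before relying on exact signatures):

    // Hyperdrive mirrors the Node fs API on top of an append-only log.
    const hyperdrive = require('hyperdrive')

    // On-disk storage; peers replicating this drive fetch only the
    // blocks they actually read (sparse replication), which is what
    // makes hosting something Wikipedia-sized practical.
    const drive = hyperdrive('./wiki-mirror')

    drive.writeFile('/articles/Hypercore.html', '<html>...</html>', (err: Error | null) => {
      if (err) throw err
      // A reader can open one file by path without downloading the
      // whole archive first.
      drive.readFile('/articles/Hypercore.html', 'utf-8', (err: Error | null, data: string) => {
        if (err) throw err
        console.log(data.length)
      })
    })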



Wikipedia’s text history absolutely fits on a tiny hard drive and is easy to get a replica of.


Compressed with 7-Zip, sure, but uncompressed the entire thing takes up 10 TB. The Hyperdrive post doesn't mention compression at all, so the comparison should be made without it.

> As of June 2015, the dump of all pages with complete edit history in XML format at enwiki dump progress on 20150602 is about 100 GB compressed using 7-Zip, and 10 TB uncompressed.

From: https://en.wikipedia.org/wiki/Wikipedia:Size_of_Wikipedia#Si...

It's a bit of a nitpick either way: you're right that Wikipedia may not be the best example, since 10 TB is still relatively small.
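
For scale, the ratio implied by those dump numbers (a quick back-of-the-envelope sketch, using only the figures quoted above):

    // Compression ratio implied by the quoted enwiki dump stats.
    const uncompressedTB = 10    // full edit history, raw XML
    const compressedGB = 100     // same dump after 7-Zip
    const ratio = (uncompressedTB * 1000) / compressedGB
    console.log(`~${ratio}:1`)   // ~100:1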



