It seems like the blog post answers your question pretty thoroughly. The Hyperdrive index and protocol are tuned for this use case, which is what lets it scale to hosting a Wikipedia clone. BitTorrent's file model plus SQLite is not tuned for it.
Compressed with 7-Zip, sure, but uncompressed the entire thing takes up 10 TB. The Hyperdrive post doesn't mention compression at all, so the comparison should be against the uncompressed size.
> As of June 2015, the dump of all pages with complete edit history in XML format at enwiki dump progress on 20150602 is about 100 GB compressed using 7-Zip, and 10 TB uncompressed.
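
For a rough sense of scale, the quoted figures imply about a 100x compression ratio, which is why comparing a compressed dump against Hyperdrive's uncompressed numbers is misleading. A quick back-of-the-envelope sketch (sizes are the approximate figures from the dump notes above):

```python
# Rough check of the gap between the compressed and uncompressed
# enwiki full-history dump sizes quoted above (approximate figures).
compressed_gb = 100        # ~100 GB with 7-Zip, per the June 2015 dump notes
uncompressed_gb = 10_000   # ~10 TB uncompressed

ratio = uncompressed_gb / compressed_gb
print(f"7-Zip shrinks the dump roughly {ratio:.0f}x")  # -> roughly 100x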