Mind explaining the origin of your 32 TB figure? I must be missing something enormous, but as far as I can tell the SciMag database dump is 9.3 GB, the LibGen non-fiction dump is 3.2 GB, and the LibGen fiction dump is 757 MB. That's a pretty huge divergence.

Source: http://gen.lib.rus.ec/dbdumps/

Oh, wait. I'm dumb. I see that your first link is a citation.

Continuing to be dense, why is there a difference between their "database dump" and the total of all the files they have?


The databases contain the metadata (authors, edition, ISBN, etc.) for the books.

Thus: 32 TB of actual books (over 2 million titles), but only a 3.2 GB database of metadata.
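Concretely, each row in the dump is just bibliographic data plus an MD5 hash identifying the actual file. A minimal sketch of a local metadata query, assuming you've imported the non-fiction dump into a local MySQL instance; the table and column names here ("updated", "Title", "Author", "Filesize", "MD5") are guesses from memory, so check the dump's actual schema:

    import pymysql

    # Local metadata-only search: no book files are touched here.
    conn = pymysql.connect(host="localhost", user="libgen",
                           password="", database="libgen")
    with conn.cursor() as cur:
        cur.execute(
            "SELECT Title, Author, Filesize, MD5 FROM updated "
            "WHERE Title LIKE %s LIMIT 10",
            ("%calculus%",),
        )
        for title, author, size, md5 in cur.fetchall():
            print(f"{title} / {author} ({size} bytes) md5={md5}")

The MD5 hash is the bridge: 3.2 GB of rows like these describe 32 TB of files stored and mirrored elsewhere.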


Ah, that makes sense.

To make sure I'm understanding this correctly:

The Libgen Desktop application (which requires only a copy of the database) would then use the DB metadata to make LibGen locally searchable, and would only retrieve the individual books/papers on request?
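If so, I'd picture the flow roughly like this (a sketch of my understanding, not the app's actual code; the per-book lookup URL below is the public web page keyed by MD5, and whether the app uses that exact endpoint is my assumption):

    import requests

    # Step 1 happens offline against the local 3.2 GB metadata DB and
    # yields an MD5. Step 2, below, is the only part that needs the
    # network -- and the 32 TB of remote files.
    def fetch_landing_page(md5):
        url = f"http://gen.lib.rus.ec/book/index.php?md5={md5}"
        resp = requests.get(url, timeout=30)
        resp.raise_for_status()
        return resp.text  # the book's page, which links to download mirrors

    html = fetch_landing_page("0123456789abcdef0123456789abcdef")  # made-up hash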
