
That was kind of my thought as well... I worked on a small-to-mid-sized classifieds site (about 10-12 unique visitors a month on average), and even then the core dataset was only about 8-10GB, with some log-like data growing at around 4-5GB/month. This is freakishly huge. I don't know enough about different platforms to gauge how well you could even utilize that much memory. Though it would be a first to genuinely have way more hardware than you'll likely ever need for something.

IIRC, the images for the site were closer to 7-8TB, but I don't know how typical that is for other types of sites, and caching every image on the site in memory is pretty impractical... just the same... damn.
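For a rough sense of scale, here's a minimal back-of-envelope sketch using the numbers above (the 512GB RAM figure is an assumed big-memory box, not a number from this thread):

    # rough sizing from the figures mentioned above; ram_gb is hypothetical
    core_gb = 10            # core dataset, upper bound
    logs_gb_per_month = 5   # log-like data growth, upper bound
    images_gb = 8 * 1024    # ~8TB of images
    ram_gb = 512            # assumed big-memory server

    months_of_logs = (ram_gb - core_gb) / logs_gb_per_month
    print(f"core data + ~{months_of_logs:.0f} months of logs fit in RAM")
    print(f"images alone would need ~{images_gb / ram_gb:.0f}x that RAM")

So the working set fits in memory many times over, but the image store is where in-memory caching falls apart.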




I think you're missing a unit. 10-12 thousand? million?


million... lol



