Hacker News

Note that it's 60 TB compressed.

Mostly it's about speed. With multiple machines you can bring more cores to bear on the processing, and have more RAM available to cache partial results. Postgres could certainly do the job, but I'd be surprised if it ran within an order of magnitude of these results.
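A toy sketch of the idea: each machine (here simulated by a worker process) scans only its own partition of the data and holds its partial result in its own memory, and only the small partial results are merged at the end. All names and the partitioning scheme are illustrative, not anything from the actual system being discussed.

```python
from multiprocessing import Pool

def count_lines(partition):
    # Each worker scans its own partition independently,
    # keeping its partial result in its own RAM.
    return sum(1 for line in partition if line)

def parallel_count(partitions, workers=4):
    # Only the small per-partition counts cross process
    # boundaries; the bulk data never does.
    with Pool(workers) as pool:
        return sum(pool.map(count_lines, partitions))

if __name__ == "__main__":
    partitions = [["a", "b"], ["c"], ["d", "e", "f"]]
    print(parallel_count(partitions))  # 6
```

The same shape scales out: replace the worker processes with worker machines and the list partitions with shards of the dataset, and the merge step stays cheap because it only sees aggregates.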

I/O is pretty huge as well. You can spend lots and lots of money giving one machine storage that can read 60 TB quickly. Or you can have 100 machines with the cheapest possible hard drives and smoke the single machine on total I/O.
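A back-of-the-envelope calculation makes the point. The throughput figures below (5 GB/s for one expensive machine, 150 MB/s per cheap drive) are assumed round numbers for illustration, not measurements of any real hardware:

```python
# Time to scan 60 TB: one fast machine vs. 100 cheap ones.
DATA_BYTES = 60 * 10**12

# One machine with very fast (and expensive) storage: assume 5 GB/s.
single_throughput = 5 * 10**9  # bytes/sec

# 100 machines, each with a cheap drive: assume 150 MB/s each.
machines = 100
per_drive = 150 * 10**6  # bytes/sec
cluster_throughput = machines * per_drive  # 15 GB/s aggregate

single_hours = DATA_BYTES / single_throughput / 3600
cluster_hours = DATA_BYTES / cluster_throughput / 3600

print(f"single machine:   {single_hours:.1f} h")  # 3.3 h
print(f"100-node cluster: {cluster_hours:.1f} h")  # 1.1 h
```

Even with drives 30x slower per unit, the cluster's aggregate bandwidth wins, and it keeps winning as you add nodes, while a single machine hits a hard ceiling on how fast one storage subsystem can go.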


