
You can buy a machine with 128 GB of memory and 2 TB of disk space from Dell for under $10k. A billion rows could be an in-memory dataset now.
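A quick back-of-envelope in Python (just a sketch; it assumes the entire billion-row dataset has to fit in that 128 GB, counting nothing for the OS, indexes, or per-object overhead):

    # How wide can a row be if a billion rows must fit in 128 GB?
    # (Row count and RAM size taken from the comment above.)
    ram_bytes = 128 * 1024**3   # 128 GiB
    rows = 1_000_000_000

    bytes_per_row = ram_bytes / rows
    print(f"{bytes_per_row:.0f} bytes available per row")  # ~137 bytes

So each row gets roughly 137 bytes before any overhead, which is tight but plausible for narrow, fixed-width records.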



But you need to plan for data growth.


If your goal is write-once archival and subsequent querying of large volumes of data, e.g., log/event data, you can't do much better than some of the column-oriented databases. Two examples are SenSage and Vertica.

See the customer profiles for SenSage at http://sensage.com/customers/customer-profiles.php. Various customers collect and store up to 200GB of raw text log files, broken into records (400 million records at a generous 500 bytes/line estimate) ... PER DAY ... with 2 to 5 or even more years of data kept online and available for analysis ... fast analysis ... one example is an 8-billion-record query answered in 2 minutes.
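A quick sanity check of those figures (all numbers come from the comment itself; the 500 bytes/line is its own "generous" estimate):

    # 200 GB/day of raw logs at ~500 bytes per line
    daily_bytes = 200 * 10**9
    bytes_per_line = 500
    records_per_day = daily_bytes // bytes_per_line
    print(f"{records_per_day:,} records/day")        # 400,000,000

    # The cited query: 8 billion records in 2 minutes
    records = 8 * 10**9
    seconds = 2 * 60
    print(f"{records / seconds:,.0f} records/sec")   # ~66,666,667

That's a scan rate of roughly 67 million records per second, which is the kind of throughput column stores are built for.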




