Hacker News

...yes, performance will drop when your set of active data exceeds available memory and your system needs to reference slower storage. I assume this would apply to anything.



but that threshold is surely not generally 20M rows? I mean wouldn't it be completely dependent on how much memory you have?


it obviously depends on the size of the data itself, the memory available, etc.

Pretty clearly people ran the exact same test: fill a table with a bunch of empty/meaningless data, see where performance degrades, then write a post with a catchy headline.

In the end it's more nuanced than that, but I think the overall theme is: know your tuning once you get to certain data sizes.
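The point above can be made concrete with a back-of-the-envelope check: whether a table's hot data fits in RAM depends on row count, row width, and available memory, not on any fixed row count like 20M. This is a minimal sketch; the byte sizes, the 50% index overhead, and the 16 GB RAM figure are all hypothetical assumptions for illustration.

```python
# Rough estimate of whether a table's working set fits in cache memory.
# All constants here are hypothetical assumptions, not measured values.

def working_set_fits(row_count, avg_row_bytes, index_overhead=0.5,
                     available_ram_bytes=16 * 1024**3):
    """Return True if rows plus assumed index overhead fit in the
    memory assumed to be available for caching."""
    table_bytes = row_count * avg_row_bytes
    total_bytes = table_bytes * (1 + index_overhead)
    return total_bytes <= available_ram_bytes

# 20M rows at ~200 bytes each is ~6 GB with index overhead,
# which fits comfortably in an assumed 16 GB of RAM:
print(working_set_fits(20_000_000, 200))    # True

# The same schema at 500M rows (~150 GB) does not fit,
# so queries would start hitting slower storage:
print(working_set_fits(500_000_000, 200))   # False
```

The same 20M rows that fit easily on one machine would blow past memory on a smaller one, which is why a single universal row-count threshold doesn't hold up.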





