Hacker News

What gets me is how hardware progress has made all my intuitions about programming wrong over time. And I'm no dinosaur: I started on an old, obsolete machine with 128 KB. I remember getting a hand-me-down 486 PC (virtual memory at last!) with 16 megabytes of RAM, and I couldn't imagine ever using that much memory, except for multimedia. It's still hard to imagine, sometimes. How long to sort a vector of a million simple elements? Intuition says about a minute, but it's milliseconds. How big a working set before cache thrashing? My intuition is attuned to single-digit kilobytes, but it's multi-digit megabytes now.
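The "about a minute vs. milliseconds" claim is easy to check. A minimal sketch in Python (the comment names no language, so Python here is an assumption; compiled languages would be faster still):

```python
import random
import time

# A million "simple elements": random floats.
data = [random.random() for _ in range(1_000_000)]

start = time.perf_counter()
data.sort()  # Timsort, O(n log n)
elapsed = time.perf_counter() - start

print(f"sorted 1,000,000 floats in {elapsed * 1000:.1f} ms")
```

On current hardware this typically finishes in well under a second, nowhere near the minute the old intuition suggests.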



Why would intuition say that sorting 4MB of data is slow?

Assume a worst case of 4 billion elements:

log_2(4 billion) ≈ 32

An n·log(n) sort is then about 128 billion operations, or roughly 500 GB of memory reads and writes for 4-byte elements if each element is touched once per pass. Anything below that is peanuts.

You need unimaginably large numbers to make sorting slow. How am I going to find that much data?
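The back-of-envelope above can be written out directly (the 4-byte element size is the comment's implicit assumption):

```python
import math

# Worst case from the comment: n = 4 billion elements.
n = 4_000_000_000
passes = math.log2(n)             # ~32 merge-style passes over the data
operations = n * passes           # ~128 billion operations
bytes_moved = operations * 4      # 4-byte elements -> ~500 GB of traffic

print(f"log2(n)        ~= {passes:.1f}")
print(f"operations     ~= {operations / 1e9:.0f} billion")
print(f"memory traffic ~= {bytes_moved / 1e9:.0f} GB")
```

Even this deliberately huge input only costs hundreds of gigabytes of memory traffic, which is why sorting anything that fits in RAM feels instant today.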



