Hacker News

The number of apps needed for it to matter can be measured in single digits: if the right handful of apps need it, there will be demand to pay for it...

It's quite possible - even likely - that low-end systems will not be 128 bits anytime soon. 8-bit microprocessors are still selling in large volumes for embedded use. But we're about to see 64 bits entering phones this year, because it is becoming necessary, or at least more convenient than not.

All the evidence suggests we're not heading for a slowdown in storage-requirement growth anytime soon. If anything, the existence of supercomputers spread over huge clusters, rather than built as single, tightly integrated systems, indicates that even today there is demand - at the right price - for systems several orders of magnitude larger than the largest off-the-shelf systems.

The top end of the market has grown by at least 10 address bits over the last 11-12 years alone. At current storage growth rates, we'll hit the 64-bit limit on single-server systems sometime in the next 10-20 years once memory-mapped IO is factored in; sooner for single-system-image clusters.
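A back-of-the-envelope sketch of that extrapolation in Python; the ~10 bits per 11-12 years rate comes from the estimate above, while the 2^48-byte starting point is an illustrative assumption:

```python
# If the top end of the market gains ~10 address bits every ~11.5 years,
# how long until a system needing 2^48 bytes today (an assumed starting
# point) runs into the 2^64 limit?

def years_until_limit(current_bits, limit_bits=64, bits_per_year=10 / 11.5):
    """Years until storage needs cross the addressing limit,
    assuming a constant exponential growth rate."""
    return (limit_bits - current_bits) / bits_per_year

print(round(years_until_limit(48), 1))  # prints 18.4 - inside the 10-20 year window
```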




I don't know. That address space is very much larger than all the storage ever produced in human history.


> I don't know. That address space is very much larger than all the storage ever produced in human history.

Not a chance. 64 bits is ca 16 exabytes.

Currently, the shipping volume of hard drives is about 500 million units/year. If we're generous and say that their average size is only 100GB, despite the large number of models in the 1TB-5TB segment, that's 50 million TB/year, or about 50 exabytes of hard drive capacity per year. In reality it's likely much higher, and rising rapidly.
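That arithmetic checks out; a quick Python sanity check (the 500-million unit count and 100GB average are the deliberately low assumptions above):

```python
# One year of hard drive shipments versus the full 64-bit address space.

DRIVES_PER_YEAR = 500_000_000
AVG_BYTES = 100 * 10**9             # 100 GB per drive (lowball assumption)

annual_eb = DRIVES_PER_YEAR * AVG_BYTES / 10**18   # 1 EB = 10^18 bytes
limit_eb = 2**64 / 10**18                          # all of 64-bit space, in EB

print(annual_eb)           # prints 50.0 - one year of drives alone
print(round(limit_eb, 1))  # prints 18.4 - the entire 64-bit address space
```

So a single year's worth of shipped drives already exceeds everything 64 bits can address.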

Yes, the number sounds big, but so did 1TB just a few years ago. And 1GB just a few years before that. It's not that long ago that we were marvelling at even being able to buy 20MB hard drives for home use. The number may sound outrageous, but my experience, based on actual product availability, is that we should expect storage capacity to rise by a factor of 1000+ every 10-15 years, and I see no evidence to justify a slowdown.

And increases in capability cause changes in how we engineer things. When petabyte-sized databases become possible for more people at reasonable price points, you'll see a lot of people who previously "made do" with terabyte-sized databases find all kinds of uses for extra analysis etc., or simply store more intermediate stages and be more wasteful, because we can.


Ah - counting disk space. Yes, if you mapped all of that into one computer's address space, the math works out.


I've made the point several times that people do in fact expect to be able to mmap() files far bigger than physical memory on larger systems.

Purely for RAM we can survive with 64 bit for maybe a decade extra.
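A minimal sketch of what that looks like in practice - here in Python, assuming a 64-bit OS and a filesystem that supports sparse files; the path and size are arbitrary illustration values:

```python
import mmap
import os

SIZE = 8 * 2**30  # 8 GiB of virtual address space; pages fault in on demand

path = "/tmp/bigfile"            # hypothetical scratch file
with open(path, "wb") as f:
    f.truncate(SIZE)             # sparse file: no disk space consumed up front

with open(path, "r+b") as f:
    m = mmap.mmap(f.fileno(), SIZE)   # can be far bigger than physical RAM
    m[SIZE - 1:SIZE] = b"\x01"        # touching the last byte pages in one page
    assert m[SIZE - 1] == 1
    m.close()

os.remove(path)
```

Because the mapping only consumes physical memory for pages actually touched, virtual address space - not RAM - becomes the binding constraint, which is why 64 bits starts to feel tight on big single-system-image machines first.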


That's not true at all... A 64-bit system can address up to 16 exabytes of data... Google's estimated datacenter storage in 2013 was 15 exabytes.


And you could say the same about 32 bits when the NORD-5 was introduced.





