Hacker News

What if compute capability became so powerful that the easiest way to store something was to create it, store it for a millisecond, record the exact place and time since the Big Bang at which it was stored (a comparatively short set of coordinates), and then delete it? Accessing the file would then be as simple as simulating the universe up to that millisecond and reading the file back.



Or we just store the hash and then brute-force it every time we need the original content.
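A toy sketch of that idea (the function name and candidate alphabet are my own assumptions): keep only the SHA-256 digest, then enumerate short inputs until one reproduces it.

```python
import hashlib
import itertools
import string

def brute_force_preimage(target_hex, max_len=4):
    """Try every short lowercase ASCII string until one hashes to target_hex."""
    for length in range(1, max_len + 1):
        for chars in itertools.product(string.ascii_lowercase, repeat=length):
            candidate = "".join(chars).encode()
            if hashlib.sha256(candidate).hexdigest() == target_hex:
                return candidate.decode()
    return None

# "Store" only the hash of the original content...
stored = hashlib.sha256(b"hi").hexdigest()
# ...and recover the content by brute force on demand.
recovered = brute_force_preimage(stored)  # "hi"
```

Of course, the search space grows exponentially with content length, which is rather the joke.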


Probably easier to just have every computer connected on a much faster network: query the internet for a hash and have the content sent back after a 100 ms lookup at 1 TB/s.

Seems impossible, considering that in my lifetime I've basically seen ISP speeds go from 5/1 on my first cable connection to 20/768 on DSL twenty years later. I always find it a bit weird knowing how important the internet is to commerce and the economy, and yet my only real way to access it is over the same phone line that has been around forever.


That'd be lossy with hash collisions, though.


You're right, although we might be able to make it lossless in practical terms with probabilistic approaches: combining multiple hash algorithms, choosing a longer hash length, and using heuristics to pick the most likely source.
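A quick sketch of the "combine multiple algorithms" point (the truncation length and candidate set here are contrived so collisions actually occur): a candidate must match *every* truncated hash, so each extra algorithm prunes accidental collisions while the true source always survives.

```python
import hashlib

def short_hash(data, algo, n=2):
    """Deliberately truncated hash, so collisions become common."""
    return hashlib.new(algo, data).hexdigest()[:n]

def fingerprint(data):
    """Combine two truncated hashes; a match must satisfy both."""
    return (short_hash(data, "md5"), short_hash(data, "sha1"))

candidates = [str(i).encode() for i in range(5000)]
original = b"1234"
target = fingerprint(original)

md5_only = [c for c in candidates if short_hash(c, "md5") == target[0]]
both = [c for c in candidates if fingerprint(c) == target]

# The original is always among the matches, and requiring the second
# hash can only shrink the set of false positives.
assert original in both
assert len(both) <= len(md5_only)
```

With full-length modern hashes the collision probability is negligible for any realistic corpus, which is what "lossless in practical terms" is gesturing at.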


I think this assumes a deterministic universe. I'm not sure if that's the case.


DRM will ruin it.


Isn't Heisenberg's uncertainty principle actually some kind of DRM on the data that we can measure?



