
A trillion times the age of the universe? How quaint; you're using durations that humans can, in principle, conceive of.

---

Edit: I'm being ridiculous; the part below assumes a Turing machine with an infinite tape (restricted to programs that must eventually halt), simulated by a PC. That's not realistic.

Absent an infinite Turing tape, the number of states a computer can have is very finite: 2^(amount of memory in bits). Assuming 16GiB of RAM and no hard drive, that's 2^37 bits, i.e. 2^(2^37) states. Assuming a state transition happens every clock cycle (which it basically would for a naïve counter – though obviously if you were designing a program to run as slowly as possible, you wouldn't do it this way), and assuming a massively-overclocked 10GHz chip (2^33 Hz or thereabouts), it'd run for 2^(2^37 - 33) seconds.
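
A quick sanity check of that arithmetic in Python (a sketch only; the RAM size and clock speed are just the figures assumed above, and everything is kept as log2 exponents because the actual numbers don't fit anywhere):

    # Work entirely in log2 -- the actual values wouldn't fit in any memory.
    RAM_BITS = 16 * 2**30 * 8        # 16 GiB expressed in bits = 2**37
    CLOCK_LOG2 = 33                  # ~10 GHz, i.e. roughly 2**33 cycles per second

    log2_states = RAM_BITS                        # 2**(2**37) distinct memory states
    log2_runtime_s = log2_states - CLOCK_LOG2     # one new state per clock cycle

    print(f"states  = 2**{log2_states}")              # 2**137438953472
    print(f"runtime = 2**{log2_runtime_s} seconds")   # i.e. 2**(2**37 - 33)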

The CMB will probably be undetectable in about 2^62 seconds, so it'd run for 2^(2^37 - 95) times that length of time. Vastly, vastly shorter than my estimate below!!
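
Same sketch, comparing against that 2^62-second figure:

    # How many "CMB lifetimes" the counter's runtime amounts to (still in log2).
    log2_runtime_s = 2**37 - 33   # from the estimate above
    log2_cmb_s = 62               # CMB detectable for roughly 2**62 seconds

    print(f"runtime / CMB lifetime = 2**{log2_runtime_s - log2_cmb_s}")  # 2**(2**37 - 95)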

---

https://waitbutwhy.com/2014/11/1000000-grahams-number.html

Graham's number is g_64. Don't just take g_1000; rather, take g_g_g_…_g_64 with Graham's number of g's, and then repeat that whole process Graham's number of times – and that's probably still not an upper bound; the complete description of that number fits in your computer's memory.
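
To show the shape of that construction (a purely symbolic Python sketch of one reading of it; the function names are mine, and of course nothing here would ever actually finish evaluating):

    # Knuth up-arrow notation: arrow(a, n, b) = a followed by n up-arrows, then b.
    def arrow(a, n, b):
        if n == 1:
            return a ** b
        if b == 1:
            return a
        return arrow(a, n - 1, arrow(a, n, b - 1))

    # Graham's sequence: g(1) = 3^^^^3, g(k) = 3 ^...^ 3 with g(k-1) arrows.
    # Graham's number is g(64).
    def g(k):
        if k == 1:
            return arrow(3, 4, 3)
        return arrow(3, g(k - 1), 3)

    # The number gestured at above: apply g to 64 Graham's-number times,
    # then repeat that whole process Graham's-number times.
    def absurdly_large():
        n = 64
        for _ in range(g(64)):        # "then repeat that Graham's number times"
            for _ in range(g(64)):    # g_g_..._g_64, a stack of Graham's-number many g's
                n = g(n)
        return n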

Whatever the limit of a computer program's execution time, it's definitely large enough that it doesn't really need a unit; it's practically irrelevant whether you measure it in yocto-Planck-times or yotta-heat-deaths, or even just count the digits.




> How quaint; you're using durations that humans can, in principle, conceive of.

You'll have to forgive me, I'm afraid - I was writing for an audience of mostly humans :)



