Exactly. Taken to the limit, if you extrapolate how many future human lives could be extinguished by a runaway AI, you get extremely unsettling answers. The expected value of a 0.01% chance of extinction from AI might be trillions of quality human lives. (This could in fact be on the very conservative side; e.g., Nick Bostrom has speculated that there could be 10^35 human lives to be lived in the far future, which is itself a conservative estimate.) With these numbers, even setting AI risk absurdly low, say 1 in 10^20, we would still expect to lose 10^15 lives -- about a quadrillion. (I'd argue even the most optimistic person in the world couldn't assign a probability that low.) So the stakes are extraordinarily high.
Actual people are vastly outnumbered by entropy fluctuations in the long run, according to any number of valid cosmological models. Their total population is greater than 10^35 by far, and does not depend on whether superintelligence is possible or likely to destroy the actual human race. That question makes no difference to almost every human who will ever have a thought or feeling.
You could say, well, that's just a theory -- and it is a dubious one, indeed. But it's far more established by statistics and physical law than the numbers Bostrom throws out.
The last gas I passed had 10^23 entropy fluctuations in it. So you're saying those are more important than the 100 billion human lives that came before them?
I'm saying that it's unscientific nonsense. Bostrom has predicted that the number of people potentially in the future is between approximately 0 and 100,000,000,000,000,000,000,000,000,000,000,000. This isn't an estimate that can or should factor into any decision at all. 100B people are going to live in the 21st century. Worry about them.
https://globalprioritiesinstitute.org/wp-content/uploads/Tob...