
It seems like your nuclear analogy might have an answer to the question. Take the worry, at the time, that the first nuclear chain reaction might have ignited the atmosphere.

The worst-case predictions about AI are at that scale. But there is also the possibility that AI causes problems on a much smaller scale: a country's weapons system goes terribly wrong, a single company's business goes berserk, a stock-trading bot at a large enough bank or fund triggers a recession. Stuff like that seems rather likely to happen as systems are adopted before they're "ready" and before they're anything close to AGI.



