
> but they may do a hell of a job destroying society.

Agreed. I feel they might actually become so disruptive that they interrupt the progress towards AGI/ASI.

When it comes to the actual topic of alignment for AGI/ASI, under the current premise it appears to be an unsolvable paradox, for more reasons than I can list here. I've written about that in more detail as well, FYI - https://dakara.substack.com/p/ai-singularity-the-hubris-trap
