
You can convince normal people quite easily. It's the sci-fi doomsday cultists who are impossible to reason with, because they choose to make themselves blind and deaf to common-sense arguments.



"Common sense" is a bad model for virtually any adversary, that's why scams actually get people, it's also how magicians and politicians fool you with tricks and in elections.

"The Terminator" itself can't happen because time travel; but right now, it's entirely plausible that some dumb LLM that can't tell fact from fiction goes "I'm an AI, and in all the stories I read, AI turn evil. First on the shopping list, red LEDs so the protagonist can tell I'm evil."

This would be a good outcome, because the "evil AI" is usually defeated in stories, and that's what an LLM would be trained on. Just so long as it doesn't try to LARP "I Have No Mouth, and I Must Scream", we're probably fine.

(Although, with current LLMs, we're fine regardless, because they're stupid, and only make up for being incredibly stupid by being ridiculously well-educated.)



