
Nobody told ChatGPT to lie all the time, but it keeps doing it.



I would argue ChatGPT confabulates, it doesn't lie. A lie IMO has to include an intent to deceive.

EDIT: I'd also argue that GPT-4 itself can be told to actually lie, because you can tell it to substitute a false statement for a true one, then later get it to reveal the true statement.



