> And even for humans, we have mechanisms to control their output when they get confused.

Which mechanisms do you mean? I don’t think it’s feasible to use hunger or the fear of dismissal to control an instance of an LLM.