Hacker News new | past | comments | ask | show | jobs | submit login

The issue is that you can retrieve the prompt with even a low success rate.

You can make prompts where both the prompt itself and the answer is encrypted and GPT-3 struggles with this so the detector may decrypt the prompt or response to something else than what is answering the prompt.




Join us for AI Startup School this June 16-17 in San Francisco!

Guidelines | FAQ | Lists | API | Security | Legal | Apply to YC | Contact

Search: